Skill profile

Transformers & NLP

HuggingFace Transformers, BERT, GPT, fine-tuning, tokenization


Roles: 3 (where this skill appears)

Levels: 5 (structured growth path)

Required entries: 11 (the other 4 are optional)

Domain: Machine Learning & AI

Group: Natural Language Processing

Last updated: 17/3/2026

How to use

Select your current level and compare the expectations.

What is expected at each level

The tables below show how depth grows from Junior to Principal.

Level 1 (Junior)

| Role | Required | Description |
| --- | --- | --- |
| Data Scientist | No | Understands core Transformer architecture (attention mechanisms, positional encoding) and basic NLP pipelines. Uses pre-trained models from Hugging Face for text classification and NER tasks. Follows team guidelines for data preprocessing and tokenization workflows. |
| LLM Engineer | No | Understands core Transformer architecture and attention mechanisms. Works with pre-trained LLMs using the Hugging Face Transformers library for inference and basic prompt engineering. Follows established patterns for model loading, tokenization, and API integration. |
| NLP Engineer | Yes | Knows transformer architecture basics for NLP: self-attention, positional encoding, BERT/GPT. Uses Hugging Face transformers for inference: text classification, NER, summarization. |
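The scaled dot-product attention that these entry-level descriptions reference can be sketched in a few lines of plain Python. This is an illustrative toy implementation, not the Hugging Face API, and the input matrices are made-up values:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, on plain lists."""
    d_k = len(K[0])
    # One row of attention weights per query vector.
    weights = [
        softmax([sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K])
        for q in Q
    ]
    # Each output row is a convex combination of the value vectors.
    return [
        [sum(w * v[j] for w, v in zip(row, V)) for j in range(len(V[0]))]
        for row in weights
    ]

# Toy example: 2 query tokens attending over 3 key/value tokens (d_k = 2).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = scaled_dot_product_attention(Q, K, V)
print(out)  # one output vector per query token
```

Because the softmax weights sum to 1, each output row stays inside the range spanned by the value vectors, which is a quick sanity check when experimenting.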
Level 2

| Role | Required | Description |
| --- | --- | --- |
| Data Scientist | No | Fine-tunes Transformer models for domain-specific NLP tasks including sentiment analysis, summarization, and question answering. Evaluates trade-offs between model size, inference speed, and accuracy. Implements custom training loops with proper evaluation metrics and cross-validation. |
| LLM Engineer | No | Applies fine-tuning techniques (LoRA, QLoRA, PEFT) to adapt LLMs for specific tasks. Implements RAG pipelines combining retrieval with generation. Understands trade-offs between model quantization, context window size, and output quality for production use cases. |
| NLP Engineer | Yes | Independently fine-tunes transformer models for NLP: BERT, RoBERTa, T5 for domain-specific tasks. Configures tokenizers, training arguments, and evaluation metrics via the Hugging Face Trainer. |
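LoRA, mentioned for the LLM Engineer role at this level, freezes a pre-trained weight matrix W and trains only a low-rank update B·A, so the effective weight is W + B·A. A back-of-the-envelope sketch (the layer dimensions are hypothetical, loosely modeled on a BERT-base projection) shows why this cuts trainable parameters so sharply:

```python
def lora_param_counts(d_out, d_in, r):
    """Trainable parameter counts: full fine-tuning vs. a rank-r LoRA update.

    Full fine-tuning updates all of W (d_out x d_in); LoRA trains only
    B (d_out x r) and A (r x d_in).
    """
    full = d_out * d_in
    lora = d_out * r + r * d_in
    return full, lora

# Hypothetical 768x768 projection layer with LoRA rank r = 8.
full, lora = lora_param_counts(768, 768, 8)
print(full, lora)            # 589824 vs 12288 trainable parameters
print(f"{lora / full:.1%}")  # LoRA trains roughly 2% of the parameters
```

The rank r is the main knob: smaller r means fewer trainable parameters but less capacity to adapt the layer.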
Level 3

| Role | Required | Description |
| --- | --- | --- |
| Data Scientist | Yes | Designs end-to-end NLP systems with Transformer models for production: custom architectures, distributed training, model distillation and quantization. Optimizes inference latency and throughput for real-time applications. Mentors the team on advanced techniques like multi-task learning and domain adaptation. |
| LLM Engineer | Yes | Architects production LLM systems with optimized serving (vLLM, TGI), model parallelism, and efficient batching strategies. Designs evaluation frameworks for model quality, safety, and bias detection. Implements advanced techniques: RLHF, constitutional AI, and chain-of-thought optimization. |
| NLP Engineer | Yes | Designs advanced NLP solutions with transformers: adapter-based fine-tuning, model merging, efficient inference. Optimizes through quantization, pruning, and Flash Attention for production. |
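Quantization, listed among the production optimization techniques at this level, can be illustrated with a minimal symmetric int8 scheme. This is a simplified sketch, not the algorithm used by any particular library, and the weight values are made up:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats to [-127, 127] with one scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    # Reconstruct approximate float weights from the integer codes.
    return [qi * scale for qi in q]

# Toy weight vector (hypothetical values).
w = [0.5, -1.2, 0.03, 2.0, -0.7]
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
print(q)        # integer codes in [-127, 127]
print(max_err)  # rounding error is bounded by scale / 2
```

Storing int8 codes plus one float scale uses roughly a quarter of the memory of float32 weights; the trade-off is the bounded reconstruction error shown above, which production schemes mitigate with per-channel scales and calibration.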
Level 4

| Role | Required | Description |
| --- | --- | --- |
| Data Scientist | Yes | Defines NLP and Transformer strategy at team level: selects model architectures, establishes training infrastructure, and sets quality benchmarks. Conducts architectural reviews of ML pipelines. Drives adoption of best practices for experiment tracking, model versioning, and reproducibility. |
| LLM Engineer | Yes | Defines LLM platform strategy: model selection criteria, fine-tuning pipelines, and serving infrastructure standards. Establishes evaluation benchmarks, safety guardrails, and cost optimization practices. Reviews architectural decisions for RAG systems, agent frameworks, and multi-model orchestration. |
| NLP Engineer | Yes | Defines transformer strategy for the NLP team. Establishes model selection guidelines, fine-tuning approaches, and optimization techniques. Evaluates new architectures and their applicability. |
Level 5 (Principal)

| Role | Required | Description |
| --- | --- | --- |
| Data Scientist | Yes | Defines organizational NLP and Transformer strategy: foundation model investments, build-vs-buy decisions for language AI, and cross-team knowledge sharing. Establishes enterprise standards for model governance, responsible AI, and data pipeline architecture. Mentors leads on scaling ML organizations. |
| LLM Engineer | Yes | Shapes organizational LLM strategy: evaluates foundation model providers, defines enterprise AI governance policies, and architects cross-team LLM infrastructure. Establishes standards for model risk management, cost forecasting, and responsible AI deployment. Mentors leads on building scalable AI platforms. |
| NLP Engineer | Yes | Shapes enterprise transformer strategy for the NLP platform. Defines model hub standards, shared fine-tuned models, and the research-to-production workflow at organizational level. |
