Transformers for Generative AI Language Models
Issued by
IBM
The badge earner has demonstrated foundational knowledge of transformer-based models for natural language processing (NLP). They can apply positional encoding techniques in PyTorch and have explored the principles, techniques, and applications of large language models (LLMs). They are proficient in decoder-based models, such as the Generative Pretrained Transformer (GPT), and encoder-based models, such as Bidirectional Encoder Representations from Transformers (BERT), for language translation.
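The positional encoding skill mentioned above typically refers to the sinusoidal scheme from the original transformer paper, in which each position is mapped to alternating sine and cosine values at geometrically spaced frequencies. A minimal sketch in plain Python (the course itself uses PyTorch; the function name and shapes here are illustrative, not from the course):

```python
import math

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding as a seq_len x d_model list of lists.

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)       # even dimensions: sine
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)  # odd dimensions: cosine
    return pe

# Each row is added to the token embedding at that position so the
# model can distinguish token order, which self-attention alone ignores.
pe = positional_encoding(seq_len=4, d_model=8)
```

In PyTorch the same table would usually be precomputed as a tensor and registered as a buffer on the embedding module, then added to the input embeddings in `forward`.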
- Type: Validation
- Level: Intermediate
- Time: Days
- Cost: Paid