This course provides a practical introduction to using transformer-based models for natural language processing (NLP) applications. You will learn to build and train models for text classification using encoder-based architectures like Bidirectional Encoder Representations from Transformers (BERT), and explore core concepts such as positional encoding, word embeddings, and attention mechanisms.



Generative AI Language Modeling with Transformers
This course is part of multiple programs.



Instructor: Joseph Santarcangelo
10,528 already enrolled
What you'll learn
Explain the role of attention mechanisms in transformer models for capturing contextual relationships in text
Describe the differences in language modeling approaches between decoder-based models like GPT and encoder-based models like BERT
Implement key components of transformer models, including positional encoding, attention mechanisms, and masking, using PyTorch
Apply transformer-based models for real-world NLP tasks, such as text classification and language translation, using PyTorch and Hugging Face tools
Skills you'll gain
Details to know

6 assignments

There are 2 modules in this course
In this module, you will learn techniques for positional encoding and how to implement positional encoding in PyTorch. You will learn how the attention mechanism works, how to apply it to word embeddings and sequences, and how self-attention supports simple language modeling by predicting the next token. You will also explore scaled dot-product attention with multiple heads, see how the transformer architecture makes attention efficient, and implement a stack of encoder layers in PyTorch. Finally, you will use transformer-based models for text classification, including building the text pipeline, defining the model, and training it.
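As a preview of the positional-encoding material, here is a minimal sketch of the sinusoidal positional encoding from the original transformer paper in PyTorch. The function name and dimensions are illustrative, not taken from the course labs.

```python
import math
import torch

def positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Sinusoidal positional encoding: sine on even dims, cosine on odd dims."""
    position = torch.arange(seq_len).unsqueeze(1).float()            # (seq_len, 1)
    # Frequencies decay geometrically across the embedding dimensions.
    div_term = torch.exp(torch.arange(0, d_model, 2).float()
                         * (-math.log(10000.0) / d_model))           # (d_model/2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even indices
    pe[:, 1::2] = torch.cos(position * div_term)   # odd indices
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
```

The resulting `(seq_len, d_model)` matrix is simply added to the token embeddings, giving the model information about token order that attention alone does not provide.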
What's included
6 videos · 4 readings · 2 assignments · 2 app items · 1 plugin
In this module, you will learn about decoder-based, GPT-like models for language translation, and train and implement them in PyTorch. You will also study encoder models based on Bidirectional Encoder Representations from Transformers (BERT), pretrain them using masked language modeling (MLM) and next sentence prediction (NSP), and prepare data for BERT in PyTorch. Finally, you will learn how transformers are applied to translation by studying the transformer architecture and implementing it in PyTorch. The hands-on labs in this module give you practice applying decoder models, encoder models, and full transformers to real-world tasks.
What's included
10 videos · 6 readings · 4 assignments · 4 app items · 2 plugins
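To give a flavor of the BERT data preparation covered above, here is a simplified sketch of masked language modeling (MLM) input creation. Real BERT pretraining uses an 80/10/10 rule (mask / random token / keep); this version masks every selected position, and the token IDs are illustrative (`103` happens to be `[MASK]` in `bert-base-uncased`, but treat it as an assumption here).

```python
import torch

MASK_ID, PAD_ID = 103, 0  # hypothetical special-token ids for illustration

def mask_tokens(input_ids: torch.Tensor, mask_prob: float = 0.15):
    """Replace ~mask_prob of non-pad tokens with [MASK]; labels hold the
    original ids at masked positions and -100 (ignored by cross-entropy) elsewhere."""
    labels = input_ids.clone()
    probs = torch.full(input_ids.shape, mask_prob)
    probs[input_ids == PAD_ID] = 0.0              # never mask padding
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100                        # loss only on masked positions
    input_ids = input_ids.clone()
    input_ids[masked] = MASK_ID
    return input_ids, labels

ids = torch.full((2, 20), 7)                      # toy batch of token id 7
out_ids, labels = mask_tokens(ids)
```

The model is then trained to recover the original token at each `[MASK]` position, which is exactly the self-supervised objective the module's labs build on.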
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.
Learner reviews
81 reviews
- 5 stars: 76.82%
- 4 stars: 12.19%
- 3 stars: 3.65%
- 2 stars: 1.21%
- 1 star: 6.09%
Reviewed on Oct 11, 2024
Once again, great content and not that great documentation (printable cheatsheets, no slides, etc). Documentation is essential to review a course content in the future. Alas!
Reviewed on Nov 17, 2024
Need assistance from humans, which seems lacking; a coach can give guidance, but not to the extent of a human touch.
Reviewed on Jan 18, 2025
Exceptional course and all the labs are industry related

Frequently asked questions
You can complete this course in about two weeks if you spend 3–5 hours of study time per week.
A basic knowledge of Python and familiarity with machine learning and neural network concepts are recommended. Familiarity with text preprocessing steps and with N-gram, Word2Vec, and sequence-to-sequence models is helpful, as is knowledge of evaluation metrics such as bilingual evaluation understudy (BLEU).
This course is part of the Generative AI Engineering Essentials with LLMs PC specialization. Completing the specialization will give you the skills and confidence to pursue roles such as AI Engineer, NLP Engineer, Machine Learning Engineer, Deep Learning Engineer, and Data Scientist.
More questions
Financial aid available.