Playing Time |
01h:08m:48s |
Description |
data file (RDA) |
Note |
9/30/2022 12:00:00 AM |
Summary |
Explore a user-friendly approach to working with transformers and large language models for natural language processing. |
Cast |
Presenter: Jonathan Fernandes |
Summary |
Transformers have quickly become the go-to architecture for natural language processing (NLP). As a result, knowing how to use them is now a business-critical skill in your AI toolbox. In this course, instructor Jonathan Fernandes walks you through many of the key large language models developed since GPT-3. He presents a high-level overview of GLaM, Megatron-Turing NLG, Gopher, Chinchilla, PaLM, OPT, and BLOOM, relaying some of the most important insights from each model. Get a high-level overview of large language models, where and how they are used in production, and why they are so important to NLP. Additionally, discover the basics of transfer learning and transformer training to optimize your AI models as you go. By the end of this course, you'll be up to speed with what's happened since OpenAI first released GPT-3 as well as the key contributions of each of these large language models. |
System Details |
Latest version of the following browsers: Chrome, Safari, Firefox, or Internet Explorer. Adobe Flash Player Plugin. JavaScript and cookies must be enabled. A broadband Internet connection. |
Genre/Form |
Instructional films. |
Educational films. |
Added Author |
linkedin.com (Firm) |