Deep Learning NLP: Training GPT-2 from scratch

3.9
63 ratings
Offered by
Coursera Project Network
3,802 already enrolled
In this Guided Project, you will:

Understand the history of GPT-2 and Transformer Architecture Basics

Learn the requirements for a custom training set

Learn how to use functions available in public repositories to fine-tune or train GPT-2 on custom data and generate text (a hedged sketch follows the course description below)

2 hours
Beginner
No download needed
Split-screen video
English
Desktop only

In this 1-hour long project-based course, we will explore Transformer-based Natural Language Processing. Specifically, we will be taking a look at re-training or fine-tuning GPT-2, an NLP machine learning model based on the Transformer architecture. We will cover the history of GPT-2 and its development, cover the basics of the Transformer architecture, learn what type of training data to use and how to collect it, and finally perform the fine-tuning process. In the final task, we will discuss use cases and what the future holds for Transformer-based NLP. I would encourage learners to do further research and experimentation with the GPT-2 model, as well as other NLP models! Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
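This page doesn't name the public repository the course uses, so purely as an illustration, here is a minimal sketch of the fine-tune-and-generate workflow described above using one widely used public package, minimaxir's gpt-2-simple (TensorFlow-based). The model size, dataset path, run name, and step count are assumptions for the sketch, not values from the course:

```python
# Minimal fine-tuning sketch with the public gpt-2-simple package
# (pip install gpt-2-simple). Assumes a plain-text training file
# named corpus.txt; model size and step count are illustrative only.
import gpt_2_simple as gpt2

MODEL_NAME = "124M"  # smallest public GPT-2 checkpoint

# Download the pretrained checkpoint to ./models/124M (one-time).
gpt2.download_gpt2(model_name=MODEL_NAME)

# Start a TensorFlow session and fine-tune on the custom corpus.
sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="corpus.txt",   # custom training data
              model_name=MODEL_NAME,
              steps=200,              # short run for demonstration
              run_name="demo_run")

# Generate text from the fine-tuned checkpoint.
gpt2.generate(sess,
              run_name="demo_run",
              prefix="Deep learning is",
              length=100,
              temperature=0.7,
              nsamples=1)
```

With gpt-2-simple, checkpoints from a run like this land in ./checkpoint/demo_run and can be reloaded in a later session with gpt2.load_gpt2(sess, run_name="demo_run") instead of re-training.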

Skills you will develop

Language Model, Natural Language Processing, Applied Machine Learning, Artificial Intelligence (AI), TensorFlow

Learn step-by-step

In a video that plays in a split screen alongside your workspace, your instructor will guide you through each step:

  1. Introducing GPT-2

  2. Intro to Transformers

  3. Gathering a Dataset

  4. Training and Fine Tuning our Model

  5. Use Cases and the Future
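Steps 3 and 4 above revolve around assembling a custom plain-text corpus before training. As a hedged sketch (the directory layout and file names here are assumptions, not from the course), a simple dataset gatherer can concatenate collected .txt files into one training file, separating documents with GPT-2's <|endoftext|> token:

```python
from pathlib import Path

# Hypothetical layout: raw text files collected into ./raw_texts/.
SOURCE_DIR = Path("raw_texts")
OUTPUT_FILE = Path("corpus.txt")

count = 0
with OUTPUT_FILE.open("w", encoding="utf-8") as out:
    for path in sorted(SOURCE_DIR.glob("*.txt")):
        text = path.read_text(encoding="utf-8").strip()
        if not text:
            continue
        out.write(text)
        # GPT-2's tokenizer reserves <|endoftext|> as a document
        # separator, so independent texts don't bleed into each other.
        out.write("\n<|endoftext|>\n")
        count += 1

print(f"Wrote {count} documents to {OUTPUT_FILE}")
```

The resulting corpus.txt is the kind of single plain-text file a fine-tuning helper such as the one sketched earlier can consume directly.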

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required

In a split-screen video, your instructor provides step-by-step guidance

