
Attention mechanism

Nearly any task in NLP can be formulated as a sequence-to-sequence task: machine translation, summarization, question answering, and many more. In this module we will learn a general encoder-decoder-attention architecture that can be used to solve them. We will cover machine translation in more detail, and you will see how the attention technique resembles the word alignment task in a traditional pipeline.
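To make the idea concrete before the lectures, here is a minimal dot-product attention sketch in NumPy; the function and variable names are illustrative assumptions, not code from the course. The current decoder state is scored against every encoder state, the scores are normalized with a softmax, and the resulting weights act as a soft alignment over source positions, much like word alignment in a traditional translation pipeline:

    import numpy as np

    def dot_product_attention(decoder_state, encoder_states):
        # Score the decoder state against every encoder state
        # (one score per source position).
        scores = encoder_states @ decoder_state
        # Softmax over source positions (shifted by the max for
        # numerical stability): a soft alignment distribution.
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        # Context vector: attention-weighted sum of encoder states,
        # fed to the decoder at this step.
        context = weights @ encoder_states
        return weights, context

    # Toy usage with illustrative sizes: 5 source positions, hidden size 8.
    rng = np.random.default_rng(0)
    encoder_states = rng.normal(size=(5, 8))
    decoder_state = rng.normal(size=8)
    weights, context = dot_product_attention(decoder_state, encoder_states)
    print(weights.sum())  # 1.0 -- a probability distribution over source words

This is only one scoring choice; the module also covers learned (additive) scoring functions, but the normalize-and-average pattern is the same.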

National Research University Higher School of Economics
4.6 (366 ratings) | 35K students enrolled
Course 6 of 7 in the Advanced Machine Learning Specialization
