Seq2seq


Skills You'll Learn

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (30 ratings)
  • 5 stars
    80%
  • 4 stars
    6.66%
  • 2 stars
    3.33%
  • 1 star
    10%
From the lesson
Neural Machine Translation
Discover some of the shortcomings of a traditional seq2seq model and how to solve for them by adding an attention mechanism, then build a Neural Machine Translation model with Attention that translates English sentences into German.
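The attention mechanism described above lets the decoder look back at every encoder state instead of relying on a single fixed-length context vector. A minimal sketch in NumPy of scaled dot-product attention, the core computation (shapes and function names here are illustrative, not the course's exact implementation):

```python
import numpy as np

def attention(queries, keys, values):
    """Scaled dot-product attention: each decoder query attends over
    all encoder keys, returning a weighted sum of encoder values."""
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    # Softmax over encoder positions so each row of weights sums to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values, weights

# Toy example: 2 decoder steps attending over 3 encoder states of size 4.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))   # decoder queries
k = rng.normal(size=(3, 4))   # encoder keys
v = rng.normal(size=(3, 4))   # encoder values
context, weights = attention(q, k, v)
print(context.shape)          # (2, 4): one context vector per decoder step
```

Because the context vector is recomputed at every decoding step, long input sentences no longer have to be squeezed through one bottleneck vector, which is the main shortcoming of the plain seq2seq model.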

Taught By

  • Younes Bensouda Mourri

    Course Instructor
  • Łukasz Kaiser

    Course Instructor
  • Eddy Shyu

    Senior Curriculum Developer
