Traditional Language Models

Skills You'll Learn

Word Embedding, Sentiment with Neural Nets, Siamese Networks, Natural Language Generation, Named-Entity Recognition

Reviews

4.5 (731 ratings)
  • 5 stars
    72.50%
  • 4 stars
    16.41%
  • 3 stars
    5.74%
  • 2 stars
    2.59%
  • 1 star
    2.73%
KT
Sep 24, 2020

The lectures are well planned--very short and to the point. The labs offer immense opportunity for practice, and assignment notebooks are well-written! Overall, the course is fantastic!

CR
Mar 20, 2021

I wish the neural networks would be described in greater detail.

Everything else is really nice, Younes explains very well. Assignments are very nicely prepared.

From the Lesson
Recurrent Neural Networks for Language Modeling
Learn about the limitations of traditional language models and see how RNNs and GRUs use sequential data for text prediction. Then build your own next-word generator using a simple RNN on Shakespeare text data!
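To make the contrast concrete, here is a minimal sketch of the count-based (n-gram) approach that "traditional" language models use, which the lesson compares against RNNs. The toy corpus and the `next_word` helper are illustrative assumptions, not the course's actual Shakespeare dataset or assignment code; they simply show that a bigram model can only predict from contexts it has already seen.

```python
from collections import defaultdict, Counter

# Toy corpus standing in for the Shakespeare text used in the course
text = "to be or not to be that is the question"
words = text.split()

# Count bigram transitions: counts[w] maps each word to the words that follow it
counts = defaultdict(Counter)
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    """Return the most frequent follower of `prev`, or None if the context is unseen."""
    followers = counts.get(prev)
    return followers.most_common(1)[0][0] if followers else None

print(next_word("to"))        # "be": both occurrences of "to" are followed by "be"
print(next_word("question"))  # None: an unseen context, a key n-gram limitation
```

An RNN sidesteps this sparsity problem by compressing the entire preceding sequence into a hidden state rather than memorizing fixed-length contexts, which is what the next-word generator built later in this lesson exploits.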

Taught By

  • Younes Bensouda Mourri
    Instructor
  • Łukasz Kaiser
    Instructor
  • Eddy Shyu
    Senior Curriculum Developer
