The N-gram Language Model

View Syllabus

Skills You Will Learn

Word2vec, Parts-of-Speech Tagging, N-gram Language Models, Autocorrect

Reviews

4.7 (1,262 ratings)

  • 5 stars
    80.98%
  • 4 stars
    13.78%
  • 3 stars
    3.72%
  • 2 stars
    0.55%
  • 1 star
    0.95%

SR

Aug 4, 2021

5 stars

Another great course that introduces probabilistic modelling concepts and gradually moves in the direction of neural networks. One must learn in detail how embeddings work.

SK

Jul 13, 2020

5 stars

I had a wonderful experience. Try not to look at the hints and solve the exercises yourself; it is an excellent course for getting in-depth knowledge of how the black boxes work. Happy learning.

From the Lesson

Autocomplete and Language Models

Learn about how N-gram language models work by calculating sequence probabilities, then build your own autocomplete language model using a text corpus from Twitter!
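
As a rough sketch of the sequence-probability idea described above, the example below counts word pairs in a tiny hand-written corpus, turns the counts into bigram probabilities, and uses them to suggest a next word. The corpus, tokenization, and function names are illustrative assumptions, not the course's assignment code, which works on a Twitter corpus and handles more cases.

```python
from collections import defaultdict, Counter

# Tiny stand-in corpus (assumption: the course assignment uses Twitter data instead).
corpus = [
    "i am happy because i am learning",
    "i am learning nlp",
    "i am happy today",
]

# Count bigrams: bigram_counts[prev][word] = number of times "prev word" occurs.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, word in zip(tokens, tokens[1:]):
        bigram_counts[prev][word] += 1

def bigram_probability(prev, word):
    """Maximum-likelihood estimate: P(word | prev) = C(prev word) / C(prev)."""
    total = sum(bigram_counts[prev].values())
    return bigram_counts[prev][word] / total if total else 0.0

def autocomplete(prev):
    """Suggest the most frequent word observed after `prev`."""
    if not bigram_counts[prev]:
        return None
    return bigram_counts[prev].most_common(1)[0][0]

print(bigram_probability("i", "am"))  # 1.0: every "i" in this corpus is followed by "am"
print(autocomplete("am"))             # "happy" (tied with "learning"; insertion order breaks the tie)
```

A fuller autocomplete model would typically extend this to longer N-grams and add out-of-vocabulary handling and smoothing for unseen sequences.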

Taught By

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Senior Curriculum Developer
