
Why words? From character to sentence embeddings

This module is devoted to a higher level of abstraction for text: we will learn vectors that represent meanings. First, we will discuss traditional models of distributional semantics, which are based on a very intuitive idea: "you shall know a word by the company it keeps". Second, we will cover modern tools for word and sentence embeddings, such as word2vec, FastText, and StarSpace. Finally, we will discuss how to embed whole documents with topic models and how these models can be used for search and data exploration.
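To make the distributional idea concrete, here is a minimal sketch of training word vectors on a toy corpus with gensim's Word2Vec. The library choice, corpus, and hyperparameters are illustrative assumptions, not taken from the course itself.

```python
# Minimal word-embedding sketch (assumption: gensim is available; the course
# does not prescribe this library). Corpus and hyperparameters are toy values.
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "sits", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train a small skip-gram model; words that appear in similar contexts
# end up with similar vectors.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Nearest neighbours of "cat" in the learned vector space.
print(model.wv.most_similar("cat", topn=3))
```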

National Research University Higher School of Economics
4.6 (342 Ratings) | 33K Students Enrolled
Course 6 of 7 in the Advanced Machine Learning Specialization
