
Learner reviews and feedback for Natural Language Processing with Attention Models

711 ratings
175 reviews


In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
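The attention models named in the course description all build on one core operation: scaled dot-product attention. As background (this is not course material — the course itself uses the Trax library), here is a minimal NumPy sketch of that operation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer attention op."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize softmax numerically
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over keys
    return weights @ V                              # weighted average of the values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Each output row is a softmax-weighted mixture of the value vectors; the LSH attention mentioned in several reviews below is the Reformer's approximation of this same computation for long sequences.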


Oct 4, 2020

Could the instructors make a video explaining the ungraded lab? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project, and to be able to download entire projects easily.


Reviews 51–75 of 175 for Natural Language Processing with Attention Models

By Adrien B

Apr 12, 2021

Quite a complete course on the (current) state of the art of NLP. Interesting assignments using Trax. A good introduction to big NLP models like GPT/T5/BERT.

By Aleksander M

Oct 13, 2020

Great course! I really enjoyed extensive non-graded notebooks on LSH attention. Some content was pretty challenging, but always very rewarding!

Thank you!

By Фирсанова В И

Apr 26, 2021

The course was very informative: I gained several new skills, learned a lot of new things, and now feel much more confident in NLP. Thank you!

By 西川 尚之

Oct 18, 2020

Thanks Lukasz Kaiser, Younes Mourri, and the Trax and Coursera teams!

This NLP specialization is the most important and useful one I've ever taken!

By satish b

Jan 1, 2021

One of the best courses I have ever taken. The course provides in-depth learning of Transformers from the creators of Transformers.

By Nicolas F V D

Sep 23, 2021

It's a great way to get started with state-of-the-art NLP techniques, following the recommended papers is extremely useful.

By Dmitri

Dec 20, 2020

Great course! I understood a lot of things and got valuable experience working with state-of-the-art architectures 👍

By Umberto S

Apr 17, 2021

Really practical course. It covers the SOTA in NLP, touching on Transformers, BERT, T5, and Reformers, so I think it's worth it.

By Ovidio M

Apr 17, 2021

This is a highly recommendable course for understanding state-of-the-art NLP techniques and models using neural networks.

By Shahin Z

Oct 27, 2020

Everything was great.

Slides & notebooks/exercises were amazing.

The content is superb and very up-to-date.

By Ruiliang L

May 31, 2021

The course is good. It would be great if we could download the PowerPoint slides and Jupyter notebook files.

By Ajay G

Jul 21, 2021

Nice course for getting the details of attention with the latest state-of-the-art deep learning models.

By Martin P

Mar 22, 2021

Great course with great lecturers. The lecturers clearly showed how far NLP research has come.

By Syed M F R

Oct 11, 2020

Loved the last week of the course; it stood out amongst the other 15 weeks of the specialization.

By Bhupi D

Oct 19, 2020

Critical for keeping abreast of state-of-the-art models in NLP and new frameworks like Trax.

By Kam K

Aug 18, 2021

I liked the BERT sections and the references to the theory behind positional encodings.

By Yun-Chen L

Nov 18, 2020

It's a good course; you can learn a lot of models and basic concepts like attention.

By Björn R

Oct 17, 2020

This course made the latest technology of NLP easy to understand and implement.

By Madhur G

Oct 13, 2020

It is great. It helps us to learn and implement the latest NLP architectures.

By Op S

Dec 31, 2020

One of the most comprehensive NLP courses around, with challenging quizzes.

By Bastiaen v d R

Oct 20, 2020

Incredibly interesting course showing state-of-the-art language modelling

By aydinakgokalp

Jan 31, 2021

Helped me catch up with the latest technologies. Gave me new ideas as well.

By Carlos O

Oct 29, 2020

It's a great and challenging way to learn about these SOTA frameworks.

By Utku T

Feb 24, 2021

Awesome course! I learned many theoretical and practical techniques.

By Arunjith

Sep 15, 2021

In-depth understanding of Transformers. Highly recommended.