
Learner reviews and feedback for Natural Language Processing with Attention Models

707 ratings
174 reviews


In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.


Oct 4, 2020

Could the instructors make a video explaining the ungraded lab? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the entire projects can be downloaded easily.


Reviews 151 - 174 of 174 for Natural Language Processing with Attention Models

By Maury S

Mar 13, 2021

Another less than impressive effort in a specialization from which I expected more.

By martin k

Apr 26, 2021

Low-quality programming assignments, but considering the price it's good overall

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow

By Anurag S

Jan 3, 2021

Course content more detailed explanation to follow.

By Yue W G

May 24, 2021

The content is good because it covers many aspects of NLP, and a lot of illustrations are provided to help students understand the materials. However, the assignments are too easy because of the detailed comments provided: students can simply copy and paste the answers from the comments.

One suggestion is to improve the explanation of the materials, because there are lots of details skipped by the instructors. Personally, I had to read other blogs in order to understand some of the details. Furthermore, separating the solutions from the code is definitely something that must be done, for instance by presenting the solutions in a separate notebook.

By Randall K

Jun 14, 2021

In the previous 3 courses, the homework was a natural extension of the lectures and provided solid reinforcement of the course material. In this course, however, I found the lectures did not prepare me for the homework. I also found the lectures too terse, often incoherent, and the homework tried to introduce new concepts that were not discussed in the lectures. The code in the labs was poorly organized and lacked a consistent, coherent style between assignments and even previous courses, which made it difficult to follow the logic. I often spent a lot of time sorting out tensor indexing issues, which is very difficult in Jupyter without a debugger.

By Chenjie Y

Nov 18, 2020

I think the last course is a bit rushed... Many concepts are not natural and cannot be explained in one or two sentences. Compared to the previous courses in the specialisation, which really explain concepts and intuitions in detail, this last course is a bit too rough. I would rather spend another month studying the materials across two courses, instead of staying up late reading papers and blogs to understand what was not explained clearly in the course. Also, I see that Trax is a good library, but I think it is not yet mature, and I really wish all the assignments had TensorFlow versions so students could choose.

By Vitalii S

Jan 25, 2021

1) Information: 3 out of 5.

No in-depth explanations.

2) The quizzes are too easy; I missed the good quizzes from the DL Specialization, with use cases that made me think about what to pick.

3) Home tasks: 1 out of 5.

3.1 First of all, every home task is done in a different manner.

3.2 Some of them require additional checking even when all tests pass.

3.3 The Google Colab part is also a little bit strange... I want home tasks to be one click away, not to set up a third-party environment.

What is good: as a high-level overview, this course is OK. Maybe have two versions of the course: one with in-depth explanations, and one more like this one.

By Greg D

Dec 31, 2020

Even though this is better than the other 3 courses in the specialization, it's not really any different from reading a few posts on popular machine learning blogs about the technologies presented here. I would understand if the instructors brought some insights, but it's largely just repeating what's in the slides, which in turn is just the bare minimum on how to make these concepts work (which, again, can be found through papers and free resources).

Overall, I would recommend against taking this course, since equal or better materials are available.

By Arun

Feb 18, 2021

Compared to Andrew Ng's deep learning specialization, this course requires a lot of improvement. Very often disparate facts are put together with not much connection between the ideas. This is probably because of the enormous amount of content covered. It might make sense to split the course into two. Thank you!

By George G

Dec 6, 2020

Week 1 jumps into material that is better explained in Week 2. Attention deserves a more gradual and deeper explanation. Weeks 3 and 4 cover a lot of ground without going into depth.

By Steven N

Apr 29, 2021

The course lectures were very confusing, and the course assignments were too easy, so they didn't reinforce the lecture concepts in the same way that assignments from other courses had.

By Gary L

Oct 20, 2020

Disappointed. Course 4 is much more difficult to follow than the other courses in this NLP Specialization, as well as other courses.

By Omar H

Jan 16, 2021

The course topics are great, but it could be much better by explaining the topics in much more detail and providing more examples.

By Mohsen A F

Oct 24, 2020

Like: State-of-the-art NLP problems to be used in the industry.

Dislike: Topics were not well explained; difficult to grasp.

By George L

Apr 11, 2021

Younes is a bad teacher. He may have good technical chops, but teaching is a different skill altogether. Overall, the NLP Specialization's design is much, much worse than the DL Specialization's. On one hand, you are taught a lot of deep material only cursorily; on the other hand, the exercises are either too difficult for you to get any clue, or, most of the time, actually too simple: you only need to enter simple parameters, so you cannot really learn anything! I really don't know why so many people give 5-star ratings!

By Sharad C R

Apr 25, 2021

Probably one of the worst courses I have ever taken. By week 3 I am completely lost and can't make head or tail of the content. Fine-tuning the model? My foot! The videos seemed mugged up, and the Colab material doesn't work; I can't see any fine-tuning at all.

All in all, a horrible course that fails on all fronts of learning.

This has been a glorious level of time sink. I will issue a chargeback from my credit card.

By Dimitry I

Apr 17, 2021

Material coverage is very superficial. Do not expect to fully understand or be able to work with Attention models after doing this course.

Sadly, these types of courses and their fake near-5-star reviews are destroying Coursera.

By David M

Feb 22, 2021

Unfortunately, the classes are given at a very primitive level, without explaining what exactly attention models do. The programming exercises were not explained well, either.

By Rajeev R

Apr 26, 2021

Course content was not educational and the assignments only evaluated python skills rather than DL knowledge.

By Vadim P

Dec 26, 2021

As good as Andrew's AI course is, this one is equally bad. Poorly presented, useless material.

By Weizhi D

May 31, 2021

This course sucks. The instructor cannot express concepts clearly. Don't take this course.

By Ignacio d l S

Jan 8, 2022

Too easy. I can say I almost didn't learn anything.