
Learner reviews and feedback for Natural Language Processing with Attention Models by deeplearning.ai

4.3
683 ratings
169 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews

JH
Oct 4, 2020

Could the instructors make a video explaining the ungraded labs? That would be useful. Other students also found both LSH attention layer ungraded labs difficult to understand. Thanks

LL
Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the entire projects can be downloaded easily.

Filter by:

126 - 150 of 169 Reviews for Natural Language Processing with Attention Models

By RAHUL J

Sep 29, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand; examples might have helped!

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn from and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience exploring state-of-the-art NLP models

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

Great explanation of the topic, I guess!

By Sreang R

Dec 22, 2020

Awesome course

By Amit J

Jan 2, 2021

Though the content is extremely good and cutting edge, the course presentation hasn't been able to do justice to it. [1] Teaching concepts through assignments (and not covering them in detail in lectures) is an absolutely bad idea. [2] Lecture instructions are ambiguous and immature at times. That the instructor is an excellent engineer but a bad teacher is very evident from the presentation. [3] Mentioning input/output dimensions at every boundary in the network illustrations would have made a lot of difference to the speed of understanding, without having to hunt through offline material and papers. [4] Using Trax is, I think, not a good idea for this course. The documentation is close to non-existent, a lot of details of the functions are hidden, and the only way to understand them is to read the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.

Overall a disappointment given the quality of other courses available from Coursera.

By Laurence G

Apr 11, 2021

Pros: Good choice of content coverage. Provides a historic overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.

Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics, then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful - find other resources if this is a problem.

Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.

By Christine D

Jan 22, 2021

Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.

The practical stuff, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to make your own stuff; you just fill in things like "temperature=temperature" or "counter += 1".

I preferred and recommend the first two courses in this NLP specialization.

By Rishabh S

Sep 18, 2021

The course is very research oriented and not very useful for data science practitioners. No time was spent on explaining how transformers can be used for NLP tasks on a small domain- or company-specific corpus through transfer learning. I'm not planning to develop the next blockbuster NN architecture for NLP, so the intricate details of how the Transformer and Reformer work seemed like overkill. Lastly, using Trax instead of more production-ready frameworks like TensorFlow also made it feel very research focused.

By Azriel G

Nov 20, 2020

The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt

By Thomas H

May 21, 2021

While the course succeeds in getting the most important points across, the quality of both the video lectures and the assignments is rather disappointing. The more detailed intricacies of attention and transformer models are explained poorly, without providing any intuition on why these models are structured the way they are. The lectures on current state-of-the-art models like BERT, GPT, and T5 especially were all over the place and didn't explain these models well at all.

By Kota M

Aug 23, 2021

This course perhaps gives a good overview of BERT and several other extensions such as T5 and the Reformer. I was able to learn the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition. The lectures are mostly about intuition; in contrast, the assignments are very detailed and go through each logical step one by one.

By Zhuo Q L

Jul 4, 2021

It is exciting to learn about the state-of-the-art approaches in NLP, but in this last course of the specialization one can feel that the quality and level of detail of the explanations dropped significantly. I like how the course introduces useful things like SentencePiece, BPE, and interesting applications, but some of them felt abrupt and weren't elaborated.

By Dan H

Apr 5, 2021

Pros: Good selection of state-of-the-art models (as of 2020). Also great lab exercises.

Cons: The video lectures and readings are not very helpful. Explanations of the trickier parts of the models and training processes are vague and ambiguous (and sometimes kind of wrong?). You can find more detailed and easier-to-understand lectures on YouTube.

By dmin d

Jan 7, 2021

Have to say, the instructor didn't explain the concepts well. A lot of the explanations don't make sense, or just give the final logic and skip all the details. I needed to search YouTube or Google to understand the details and concepts.

But it covers state-of-the-art models for NLP. It's a good starting point and helped save time.

By Oleksandr P

Apr 4, 2021

Although this course gives you an understanding of cutting-edge NLP models, it lacks detail. It is hard to understand the structure of a complex NLP model from a few-minute video. This course should provide step-by-step explanations across a larger number of lectures, or increase their duration.

By Nunzio V

Apr 7, 2021

Nice course, full of very interesting information. What a pity it didn't use TensorFlow. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry, and it is hardly likely it ever will be. In my opinion.

By Семин А С

Aug 9, 2021

The explanation of attention models, including the attention mechanism itself and the other building blocks of Transformers, was very confusing. It was sometimes really hard to understand what the lecturer really meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex. I would prefer fewer details, since most learners will not understand them anyway, and more conceptual information on why these models are built the way they are.

By Zeev k

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.

By Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology, and the illustrations are too rough.

By Maury S

Mar 13, 2021

Another less than impressive effort in a specialization from which I expected more.

By martin k

Apr 26, 2021

Low-quality programming assignments, but considering the price it's good overall.

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow