
Learner reviews and feedback for Natural Language Processing with Attention Models by deeplearning.ai

4.3
685 ratings
170 reviews

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into German using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering, and d) Build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top reviews

JH
Oct 4, 2020

Could the instructors make a video explaining the ungraded lab? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

LL
Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the entire projects can be downloaded easily.

Filter by:

26 - 50 of 170 Reviews for Natural Language Processing with Attention Models

By RKX

Sep 24, 2021

It would be better to use TensorFlow to implement these cutting-edge algorithms, given its popularity in both academia and industry.

By Evan P

Mar 22, 2021

Dr. Ng's Deep Learning specialization is so good: 5 stars. For me, this course was not nearly as good as the courses in that specialization. I felt like I could have just read the papers on BERT, GPT-2, T5, and the Reformer, and would have learned the same amount. The one exception was the lecture video on the history of Transformers (the evolution from ELMO->BERT->T5, etc.). Also the ungraded Reformer labs; those were good. But I personally didn't get very much value out of all the other lectures and labs.

By Jean-Luc B

Nov 8, 2020

Maybe it's my fault, but at some point in these courses I got lost in the logic and the whys of the network constructions. I managed the assignments because, for some of them, to pass you only need to know how to copy and paste.

But I recognize the great value of the material; I think I'll need to revisit it and spend more time on the optional readings.

And still an overall great specialization; thanks to everyone involved in these courses!

By Israel T

Oct 7, 2020

Very educational! I learned a lot about the different NLP models. However, it seems like weeks 3 and 4 were rushed. Also, some of the items (e.g., what each layer does and why we need that layer) were not properly explained. Other than that, this is a good course for a general overview of some of the state-of-the-art NLP models.

By Mark L

Oct 2, 2021

(1) Please consider switching from Trax to Tensorflow. (2) The concepts behind Transformers, particularly some explanation of why Q, K, and V are called such, would be helpful to cover in more detail. (3) Not a problem of the course itself, but it would be helpful if the Trax documentation were more complete.

By Felix M

Apr 11, 2021

The classes originally taught by Andrew were, for me, much better. Many of the explanations in this course were unclear and superficial, as I see it.

By Tianpei X

Nov 1, 2020

The homework is way too simplified, especially in weeks 3 and 4. My impression is that the ungraded labs were actually the real homework but were set aside to allow more people to pass. That is not a good compromise.

By Valerio G

Mar 24, 2021

I'm very disappointed with the whole deeplearning.ai NLP specialization in general, but this course was the icing on the cake.

The course treats advanced and state-of-the-art techniques in NLP with neural networks, but the theoretical lectures are confusing and imprecise. The programming assignments are totally useless, since the user is asked to implement the network architectures discussed in the lectures using a "fill in the dots" approach with a very restrictive starter structure. In my personal experience, this yielded a close-to-zero learning outcome, but a lot of frustration in trying to get around some bugs in the auto-grading system by desperately browsing the posts from the learner community.

I came here after the very nice Deep Learning Specialization taught by Andrew Ng and wasn't expecting this.

By Siddharth S

Sep 19, 2021

TRAX absolutely made it super hard to learn and follow.

If it had been explained using TensorFlow or PyTorch, it would have been very beneficial.

By Rabin A

Apr 19, 2021

The course was pretty good. It introduced me to the state-of-the-art algorithms and techniques needed to have a sound understanding of NLP. One thing I didn't like about the teaching method in the whole specialization is that Younes was the one teaching the course content to us, but Łukasz talked as if he were giving some of the lectures, although we could clearly tell it was Younes from his voice. Thanks especially to Younes for doing all the hard work on the specialization. You deserve five stars.

By Dustin Z

Dec 17, 2020

A very good and detailed course. Definitely the most challenging course I have taken from DL.ai. It gives a good overview of Transformers, the current cutting edge of NLP models. It also provides great insight into Trax, Google Brain's ML framework, which was helpful in understanding how deep learning frameworks are built. One of the teachers is one of the authors of Trax!

By Ganesh s m

Oct 10, 2020

Every week's assignment brings a new challenge, and it was fun to complete them. The course instructors explain concepts very well. This course takes you from beginner level to professional level and covers every topic related to NLP. I enjoyed learning NLP with deeplearning.ai. I would like to thank deeplearning.ai for making this course.

By Huu M T H

Sep 30, 2020

Good course overall. The last two weeks' assignments are a little too light. The instructor could introduce more about loading pretrained models and fine-tuning them, as this is a popular practice nowadays for small companies with limited resources (data/computation). An introduction to easy-to-use frameworks such as Hugging Face is highly recommended.

By Rajendra A

Dec 30, 2020

This specialization covers everything from NLP basics to the advanced models currently in use. All the programming assignments, content, and sessions were thoughtful. The exposure to the Trax library and the learning experience were really excellent. Thanks to the entire team behind this specialization and the Coursera team.

By Long L

Nov 18, 2020

Thank you Coursera and the DeepLearning.AI team. The moment I set foot on this journey, I did not think I would love NLP so much. The course is very informative: it teaches NLP from the very first naive algorithms to the state-of-the-art models of today.

By Bharathi k N

Oct 12, 2020

The course is so good and well presented. I really enjoyed the whole specialization. Thank you for this amazing course and the whole specialization, which taught me a lot. Thank you Andrew Ng and the deeplearning.ai team for this amazing specialization.

By Alan K F G

Oct 21, 2020

I learnt a lot about Transformers and Reformers, which are among the most advanced models for NLP tasks. The instructors were fully prepared, though I'd prefer to see more animations in future courses. Thank you so much for spreading knowledge!

By Muhammad T W

Jun 12, 2021

This course has helped me a lot in developing my NLP skills, and now I am confident that I can solve NLP problems easily, because both instructors, Younes and Łukasz, have taught this course in a way that can be applied to any NLP problem.

By Patrick A

Nov 26, 2020

An excellent course that covers research that was published only about two months earlier.

It doesn't get more cutting edge than that, and the technology (reversible residual layers) is immediately applicable and a very powerful enabler.

Thanks a lot!

By vadim m

Oct 17, 2020

An amazing level of breadth and depth of the material presented. State of the art techniques are exemplified via carefully crafted lab assignments with sufficient hints for students to be able to comprehend hard technical concepts.

By Jim F

Feb 28, 2021

Thanks for setting out to do the impossible and creating this set of courses. You have opened a doorway to understanding where the state of the art is. The rest is up to me. That's the purpose of education.

By Simin F

Nov 27, 2020

Helpful and interesting! This course gradually led me to understand how the Transformer works and how it has been optimized, along with several other models, without much confusion. Great thanks to the deeplearning.ai team!!

By lonnie

Jun 23, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the entire projects can be downloaded easily.

By SNEHOTOSH K B

Nov 21, 2020

The course is very comprehensive and covers all the state-of-the-art techniques used in NLP. It's quite an advanced course, and good Python coding skills are a must.

By Adrien B

Apr 12, 2021

Quite a complete course on the (current) state of the art in NLP. Interesting assignments using Trax. Some good introductions to big NLP models like GPT/T5/BERT.