
Learner reviews and feedback for Natural Language Processing with Sequence Models

839 ratings
169 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GLoVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
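To illustrate the duplicate-question task mentioned in (d) above, here is a minimal sketch of the Siamese idea: encode each question with one shared encoder, then compare the two vectors with cosine similarity. This is not course material; a real Siamese model shares a trained LSTM encoder between both inputs, while this toy version uses a hashed bag-of-words as a stand-in, so it only captures token overlap (it would miss same-meaning questions with different wording, which is exactly what the trained model learns to catch). All names here are illustrative.

```python
# Toy sketch of the "Siamese" comparison: a shared encoder maps each
# question to a vector; cosine similarity scores how alike they are.
# The hashed bag-of-words below is a stand-in for a trained LSTM encoder.
import hashlib
import math

def encode(question: str, dim: int = 32) -> list:
    """Map a question to a fixed-size vector (stand-in for a shared encoder)."""
    vec = [0.0] * dim
    for token in question.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    return vec

def cosine(a: list, b: list) -> float:
    """Cosine similarity between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

q1 = "how old are you"
q2 = "what is your age"        # same meaning, different words
q3 = "how old are you exactly" # mostly shared words

# Token overlap scores high; a trained Siamese LSTM would also score
# the (q1, q2) pair high despite sharing no tokens.
print(cosine(encode(q1), encode(q3)))  # high: mostly shared tokens
print(cosine(encode(q1), encode(q2)))  # low: no shared tokens
```

In the course, the bag-of-words stand-in is replaced by a shared LSTM, and the model is trained with a triplet-style loss so that duplicate pairs score high even without word overlap.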


Sep 27, 2020

Overall it was a great course. A little weak in theory, but for practical purposes it was sufficient. The question-duplication detection was a very cool model. I enjoyed it a lot.

Nov 11, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, such as LSTMs, GRUs, Siamese networks, etc.


Reviews 101–125 of 175 for Natural Language Processing with Sequence Models

By yeha

Sep 16, 2020

Can't wait for course 4

By Balaji V

Feb 20, 2021

Excellent Content!

By Deleted A

Sep 5, 2020

Very useful !!!!

By Mohammad B A

Dec 27, 2020

I am so happy

By Jyotin P

Nov 20, 2020

Amazing course

By Sohail Z

Sep 7, 2020


By Jose L L d J S

Sep 17, 2021


By Chen

Oct 27, 2021

Thank you!

By Onuigwe V

Aug 29, 2020


By Zoizou A

Oct 25, 2020


By Yongxin W

Oct 2, 2020

so cool

By Rifat R

Sep 23, 2020


By Jeff D

Nov 15, 2020





By Ricardo F

Jan 15, 2021


By M n n

Nov 22, 2020


By Saoudi H

Sep 27, 2020


By Dave J

Feb 15, 2021

There are lots of good points. The instructors are knowledgeable; Lukasz Kaiser is one of the authors of TensorFlow and Trax. The material is generally presented in a clear way. The labs and assignments work smoothly. You learn how to implement significant NLP tasks in a modern framework (Trax).

There are areas where I felt the course could have been better.

The amount of taught material is only about half an hour of lecture per week. It covered the bare minimum to get you through the assignments, but I would have liked a lot more content: going into more depth on the concepts, how the performance of the models discussed compares to state-of-the-art models, and how it could be improved.

Having already done the Deep Learning Specialization, I was disappointed that this course did not build on that as a foundation. There is a lot of overlap between this course and course 5 of the DL Specialization, Sequence Models. To me, it would have made more sense to make that course a prerequisite, thus avoiding all the duplicated material and instead going beyond it.

The other area with some room for improvement, though it's not at all bad, is the teaching style, which is mostly reading from a script. I would like to see more effort to engage with the learner and think about what they might need to progress on their learning journey: for example, discussing the strengths and weaknesses of an approach and where it fits into the history and state of the art of the subject, or anticipating questions and likely misunderstandings and trying to cover them or point to supplementary material.

Overall, a good course that could have been a great course.

By D. R

Mar 22, 2021

I'm a master's/graduate student who took an NLP course at university.

I think that overall this is a very good introduction to the topic. Some concepts are really well explained, in a simple manner and with a lot of Jupyter Lab code to experiment with.

In general, the first three courses in this specialization are good. There are some quirks (e.g. why is Łukasz needed at all? He doesn't really teach; he just passes things on to Younes), but nevertheless I learned from them, and I think they have good value.

The 4th one, however, is completely disappointing. The first two "weeks" are confusing and not really well explained, but somewhat "bearable". The last two weeks are a complete sham: they claim to teach "BERT" and "T5" but don't really give any value. You're better off going elsewhere to learn these concepts.

If it weren't for this, I would give the overall experience 5 stars, but because of it, I think the overall is more like 3 or 4.

By Kostyantyn B

Nov 15, 2020

The course is quite informative and it focuses on some cutting-edge developments in NLP, which is great. I also really appreciated how well the instructors managed to explain the important concepts of GRUs and LSTMs. However, I wish the assignments were a bit more challenging. Most of the time, they felt like step-by-step instructions that are almost impossible to get wrong, with not much room for imagination. Good for self-esteem, not so good for skill building... Still, this was by no means a waste of time. A good foundational course that leaves you hungry for more. So perhaps it was the instructors' intention all along to make it this way :)

By Laurence G

Mar 22, 2021

Better than the first two courses. Excellent week 1 introduction to Trax; I especially enjoyed Łukasz's video about its origins, along with the links to source code and extra readings. Much of the technical content in weeks 2–4 is better covered in the Deep Learning Specialization, but it's fairly brief and demonstrated using Trax, so I still learned something new. The applications in the assignments are interesting: the comparison between RNNs and n-gram models for text generation, part-of-speech tagging, and question duplicate detection with Siamese models. I also got to try a few new things with NumPy, which was nice.

By Saurabh D

Aug 11, 2020

To begin with, the course is very well structured, and the assignments make you apply the theory you have learned in the videos in an effective way. The community on Slack is very helpful if you need any help. The only thing I didn't like was that the assignments used *trax* for building the models instead of powerful frameworks like TensorFlow or PyTorch. That's the only reason I am rating it 4 instead of 5 stars. Overall it is a pretty nice course, and you will find it very easy if you have completed the Sequence Models course from Andrew Ng's Deep Learning Specialization.

By Feng J

Feb 7, 2021

This is a great course on natural language processing! The videos are short but very precise about the concepts. I think this is an intermediate-level course, so you should already have basic knowledge of deep neural networks and Python skills; then you will enjoy the journey. I would hope for more freedom in the coding assignments, rather than the fill-in-the-None-parts style, so that we could gain solid knowledge through deeper practice. All in all, this is a great NLP course!

By Galangkangin g

Aug 7, 2021

The material was good, but the assignments involved too much hand-holding: we were told what to do at every step of the algorithm. I think it would be better to give an almost-empty function signature and describe what the function should do (its inputs and outputs), so we can gain more understanding.