
Learner reviews and feedback for Natural Language Processing with Sequence Models

892 ratings
178 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....



Sep 27, 2020

Overall it was a great course, if a little weak in theory; for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.


Nov 11, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.


26 - 50 of 187 Reviews for Natural Language Processing with Sequence Models

By Julian D

Sep 1, 2021

In comparison to the earlier courses, I found rewriting the generator in each assignment a little repetitive. In contrast, I had to understand little of the triplet loss to complete the task, since the instructions were very specific. Either don't make the loss function this intricate, provide it outright, or let the learners work at it a little; as it stands it is a bit scripted instead of figuring things out. Overall, I love the series and would encourage extending it to include some background knowledge. Like 10% more math :-).
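For readers puzzled by the same point: the triplet loss used for Siamese question-duplicate models can be sketched in a few lines of NumPy. This is an illustrative batch-hard variant (with a mean-negative and a closest-negative term, as commonly taught); the function name, margin value, and details are assumptions, not the assignment's exact code:

```python
import numpy as np

def triplet_loss(v1, v2, margin=0.25):
    """Illustrative batch-hard triplet loss for Siamese embeddings.

    v1, v2: (batch, d) L2-normalized embedding matrices where row i of v1
    and row i of v2 encode a duplicate question pair; every other row of v2
    serves as a negative for row i of v1.
    """
    scores = v1 @ v2.T                       # cosine similarities, (batch, batch)
    batch = scores.shape[0]
    positive = np.diag(scores)               # similarity of the true pairs
    neg_mask = 1.0 - np.eye(batch)
    # mean similarity to all negatives in the batch
    mean_neg = (scores * neg_mask).sum(axis=1) / (batch - 1)
    # hardest (most similar) negative, with the diagonal pushed out of range
    closest_neg = (scores - 2.0 * np.eye(batch)).max(axis=1)
    loss1 = np.maximum(0.0, margin - positive + mean_neg)
    loss2 = np.maximum(0.0, margin - positive + closest_neg)
    return (loss1 + loss2).mean()
```

With perfectly matched orthogonal pairs the loss is zero; swapping the pairings drives it up, which is exactly the gradient signal the Siamese network trains on.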

By Li Z

Sep 16, 2020

The LSTM explanation is not very clear; I had to revisit some external links. The coding exercises are frustrating: even when they run properly step by step, I hit many glitches when submitting them. I spent more time fixing submission issues than taking the lessons.

By Paul J L I

Oct 23, 2020

There were a lot of strange errors: issues with the model, weird language about element-wise vector addition (all vector addition is element-wise), and quizzes with incorrect wording.

By Corine M

Sep 6, 2020

The exercises were very easy and did not really provide an understanding of NLP.

By Rutvij W

Sep 16, 2020

The course could be more in-depth.

By Vincent R

Jan 12, 2022

Very superficial information about different types of neural networks and their uses. Use of Trax makes it nearly impossible to Google anything helpful - a lot of the assignments just tell you to read the documentation. To finish the assignments, you can basically copy-paste the code you're given to set up Trax neural networks and generators without having any idea what you're doing (because, again, the content doesn't go over it).

For example, Week 1 in this course introduces you to Trax (which can't be run on Windows), covers aspects of object-oriented programming, and talks at a very high level about how to do things in Trax before moving on to a cursory discussion of generators. Then the assignment has you increment counters, set limits in for loops, copy negative-sentiment code from positive-sentiment code that's already completed, and fill in some code that's basically given right above where you write it.
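The generator pattern these assignments repeat is, at heart, an infinite Python generator that shuffles, batches, and pads token sequences. A hypothetical stand-in, not the course's actual code (the function name, padding scheme, and the choice to drop the final partial batch are all illustrative):

```python
import random

def data_generator(data, batch_size, pad_id=0, shuffle=True):
    """Yield padded batches of token-id sequences forever.

    Training loops in Trax-style code just call next() on this repeatedly,
    so the generator never raises StopIteration; it reshuffles each pass.
    """
    indices = list(range(len(data)))
    while True:
        if shuffle:
            random.shuffle(indices)
        # step through the data batch_size at a time, dropping the remainder
        for start in range(0, len(indices) - batch_size + 1, batch_size):
            batch = [data[i] for i in indices[start:start + batch_size]]
            max_len = max(len(seq) for seq in batch)
            # right-pad every sequence in the batch to the batch maximum
            yield [seq + [pad_id] * (max_len - len(seq)) for seq in batch]
```

For example, `next(data_generator([[1, 2], [3]], 2, shuffle=False))` yields `[[1, 2], [3, 0]]`: the shorter sequence is padded to match the longer one.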

Overall, any code you write is usually very simple but often easy to get wrong because of the lack of direction, e.g. model(x,y) vs model((x,y)). The discussion boards are invaluable, because the mistake might have been three functions earlier and the built-in tests didn't catch it. A good effort was clearly put in all around to establish this course, but it feels like a first draft that was never updated.
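The model(x,y) vs model((x,y)) pitfall the reviewer describes boils down to one positional tuple argument versus two separate arguments. A toy illustration (this `model` function is a plain-Python stand-in, not a Trax layer):

```python
def model(inputs):
    """Toy stand-in for a layer that expects ONE argument: an (x, y) tuple."""
    x, y = inputs          # unpack the pair inside the function
    return x + y

pair = (2, 3)
result = model(pair)       # correct: a single tuple argument
# model(2, 3)              # TypeError: model() takes 1 positional argument
#                          # but 2 were given
```

The failure mode is worse when the callee silently accepts both shapes and only misbehaves later, which is why a mistake "three functions earlier" can survive the built-in tests.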

By Julio W

Jun 28, 2021

I learned to hate Trax in this course. In the assignments, Trax is used only for toy problems, and then we use a precomputed model. Even fastnp is used in a very slow mode. Why learn NLP in an obscure and really badly documented framework if we end up using precomputed models anyway?

Moreover, when I tried to replicate the results on my own machine (or even in Colab), it did not work, because Trax changes a lot between versions. Again, why use, in a course, a framework that is not stable?

In my opinion, using a new and obscure framework to teach new concepts, only because you love it, is (at least) antipedagogical.

By DANG M K

Aug 29, 2020

This course material is not as good as the Deep Learning Specialization's. I hope the instructor will write things out to explain the details, not just read from the slides.

By bdug

Apr 23, 2021

I was disappointed by this course.

I did not like the use of Trax at all. At our level (students), we need a well-established and well-documented library like Keras or PyTorch to illustrate the concepts. Trax is badly documented, and since installing the Trax version used in the assignments fails in Google Colab (!!), I had a hard time reproducing the assignments there.

Week 3 is just a scam, since it says "go and read this blog" or "watch this video in another specialization". At that moment I simply felt robbed.

By Dimitry I

Apr 14, 2021

Very superficial course, just like the rest in the specialization. Quizzes and assignments are a joke. Didn't want to give negative feedback at first, but now that I am doing course #4 in the specialization, which covers material I don't know much about (Attention), I've realized how bad these courses are. Very sad.

By Yuri C

Jan 2, 2021

Among the first three courses of the NLP Specialization, this is by far the most exciting. I enjoyed all four weeks very much, and the syllabus as a whole! Although many complained about the use of Trax as a DL framework, I must say I found it fantastic to be able to learn it from people involved in its development! That per se is already an A+. I congratulate the team for taking this decision and pushing it forward. Trax is intuitive and *very* elegant. Chapeau to the devs! If it is as performant as they say for large data sets, this is the future, and I am very pleased that the instructors decided to prepare us for it.

Apart from all these positives, I again saw in this third course some content at the end of the assignments that was not introduced during the corresponding week, for example the Gumbel sampling at the end of Week 2. It was not a graded exercise, so it is not a major problem. Nevertheless, it comes out of the blue, and it is hard for the student to connect the dots and understand why we are performing this operation at all for text generation. So there are a couple of loose threads here and there along the course, but it is a minimal problem.

On the other hand, the presentation and discussion of the sequential models in all four weeks are very good, again an optimal balance between mathematical formalism, intuition, and ease of coding. Moreover, the choice of applications across the four weeks is just right: classification, generation, NER, and one-shot learning. All in all an awesome package, congratulations!
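The Gumbel sampling the reviewer mentions is the Gumbel-max trick: adding independent Gumbel noise to log-probabilities and taking the argmax draws a token with probability proportional to the softmax of those logits, which is why it appears in text-generation code. A minimal NumPy sketch (the function name, signature, and the way temperature scales the noise are illustrative assumptions, not the course's exact code):

```python
import numpy as np

def gumbel_sample(log_probs, temperature=1.0, rng=None):
    """Draw a token index via the Gumbel-max trick.

    argmax(log p + G), with G standard Gumbel noise, samples index i with
    probability p_i. temperature=0 degenerates to greedy argmax decoding.
    """
    rng = rng or np.random.default_rng()
    # uniform samples bounded away from 0 so the double log is finite
    u = rng.uniform(low=1e-9, high=1.0, size=log_probs.shape)
    g = -np.log(-np.log(u))                  # standard Gumbel(0, 1) noise
    return int(np.argmax(log_probs + temperature * g))
```

With temperature 0 the noise vanishes and the call reduces to picking the most probable token, which connects the "out of the blue" sampling step back to ordinary greedy decoding.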

By John Y

Jan 6, 2022

This was another great course. I had previously put learning or reviewing classes on my to-do list, and I was happy to see them covered here. I enjoyed learning about data manipulation, sampling, the iteration/generation process, and Trax. At first I was a little hesitant about learning a new library like Trax, but I found Łukasz's talk helpful and convincing. I feel Trax does simplify the coding process quite nicely. The homework seemed repetitive, but I found that approach very useful; I think the intent was to help us get familiar with the coding process and Trax more quickly. I previously completed the DL Specialization and appreciated this course very much. IMO, someone new to DL and RNNs might find this course confusing, because the concepts are not explained in as much depth as in the DL course.

By Nishant M K

Apr 5, 2021

Great course! I needed to check some of the threads in the discussion forums for this one, so the forums are especially useful (for the Week 3 and 4 assignments). As in the first two courses in this specialization, this one also adds most of its value in its 'lab' and assignment Jupyter notebooks. The videos serve as a gentle introduction to the topics, and the concepts from the lectures are reinforced in the assignments/labs. Great introductory course overall!

By James M

Dec 15, 2021

Very good course. The only issue I have is that when you have questions about the code, or run into an issue no one else has had, you seem to be on your own. Sometimes I had conceptual coding questions and there was no way to ask why the code does what it does. I did learn a lot, and for the price it is still worth it.

By Dustin Z

Nov 14, 2020

A really good and detailed course on sequence models. This was definitely the most challenging course in the specialization so far in part because of the use of the Trax framework. I really enjoyed reading the source code of Trax and understanding how the ML framework was constructed. This was a very unique part of this course.


Jan 29, 2021

Excellent course. I would like to learn a little more about how to adjust the classification threshold in the Siamese network, how to tune parameters in the LSTM network, and how to debug common problems in model performance. This course is a good base for an introduction to sequence models.
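On the classification-threshold question: one common approach, assuming the Siamese model outputs cosine similarities and you have a labeled validation set, is simply to sweep candidate cutoffs and keep the most accurate one. The helper name and candidate grid below are illustrative, not from the course:

```python
import numpy as np

def best_threshold(similarities, labels, candidates=None):
    """Pick the cosine-similarity cutoff that maximizes validation accuracy.

    similarities: 1-D array of model outputs for validation pairs.
    labels: 1-D boolean array, True where the pair is a duplicate.
    """
    if candidates is None:
        # cosine similarity lives in [-1, 1]; scan it in steps of 0.01
        candidates = np.linspace(-1.0, 1.0, 201)
    best_t, best_acc = 0.0, -1.0
    for t in candidates:
        acc = np.mean((similarities > t) == labels)   # accuracy at cutoff t
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

The same sweep works with F1 or precision/recall in place of accuracy when the duplicate/non-duplicate classes are imbalanced.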

By Ram N P

Feb 20, 2022

It would have been more useful if all the code snippets, labs, and assignments were in TensorFlow or PyTorch. I understand that Trax is easier to use and deploy, but until companies really start using this library, it is of little benefit to learners.

By Sarwar A

Sep 28, 2020

Overall it was a great course, if a little weak in theory; for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.

By Ahammad U

Nov 12, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.

By Nikesh B

Aug 7, 2020

Awesome course!! Younes explains all the concepts very nicely :) I enjoyed this course a lot and learned many new things, which I am planning to use in my current project. Thanks a lot, Younes

By Hieu D T

Apr 24, 2021

This course is much more difficult than the two previous ones in the series, not because of the way the instructor presents it but because of the material itself. Totally worth taking.

By Sebastián G A

Nov 4, 2020

Excellent course on sequence models and how to solve problems in industry and academia with them. Beautifully structured assignments and well-explained lectures, quite enjoyable!

By Christopher R

Mar 21, 2021

I wish the neural networks were described in greater detail.

Everything else is really nice; Younes explains very well, and the assignments are very nicely prepared.

By Sabita B

Apr 23, 2021

Amazing course. The material is very well presented and explained! I really loved the data-generator part of the code; it really drilled in its importance!

By Shaida M

Feb 19, 2021

Interesting course. I like this specialization very much. I don't understand why one instructor introduces the topic and another instructor explains it.