
Learner reviews and feedback for Natural Language Processing with Sequence Models

847 ratings
169 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....


Sep 27, 2020

Overall it was a great course. A little bit weak in theory, but for practical purposes what was covered was sufficient. The question-duplication detection was a very cool model. I enjoyed it a lot.

Nov 11, 2021

This is the third course of the NLP Specialization. This was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.


Reviews 126–150 of 175 for Natural Language Processing with Sequence Models

By Oleksandr P

Apr 4, 2021

This course is good, but it is too short in my opinion. It is sometimes hard to wrap your head around concepts that are described in a 5-minute video. I think this course should have more video lectures with more detailed (step-by-step) explanations.

By Pradeep B

Aug 28, 2021

The topics are definitely advanced; however, the content is very basic and is meant for beginners, if I am not wrong. If one is 'starting' to learn to apply deep learning to sequence models via Trax, then this is the best course for that goal.

By Ahnaf A K

Aug 6, 2020

It was a bit repetitive of the 'Sequence Models' course from the Deep Learning Specialization, with the exception of implementing everything in Trax.

By Nishank L

Nov 14, 2021

Assignments are good. Can we have these in PyTorch? Or better: could a person choose their own framework and build the entire code in that?!

By Osama A O

Oct 19, 2020

Great course, although would have been better if assignments were implemented in Keras or PyTorch. Otherwise, definitely worth it!

By Matthew P

Jan 7, 2021

Great information, but some of the assignments had errors, and there wasn't much interaction from the TAs on Slack or the forum.

By Mohsen A F

Oct 24, 2020

The clarity of exposition was superb! One star less for using Trax. I would rather have used Keras or TensorFlow.

By Saurabh K

May 24, 2021

It might have included a little more detail on the dimensions of the inputs and outputs of the sequence models.

By Mridul G

Jul 14, 2021

The course is very good, but it's not complete in itself. The way the course was taught, and everything else, is good.

By Hair P

Nov 20, 2020

Overall the content was great. Please make sure that errors in the notebooks are corrected.


Sep 18, 2020

The course is designed quite well to boost understanding of Sequence Models in great depth

By Steve H

Apr 3, 2021

Excellent course, but probably worth doing the deep learning specialisation first!

By Ke Z

Feb 24, 2021

I don't like using Trax. If it had used TensorFlow, I would give 5 stars.

By Alireza S

Dec 11, 2021

I would prefer that the lecturer use TensorFlow instead of Trax for the exercises.

By Vitalii S

Jan 21, 2021

Good information, but some assignments were an embarrassment.

By Nikita M

Dec 7, 2020

Not as good as original courses by Andrew

By Gonzalo A M

Jan 14, 2021

It was good, but it could be better.

By Ruiwen W

Aug 1, 2020

Some errors in the assignments.

By V B

Sep 24, 2020


By JJ Y

Sep 26, 2020

Sequence models are heavy subjects, and it would be unrealistic to expect a 4-week course to go into all the depths of RNNs, GRUs, LSTMs, etc. This course does a great job covering important types of neural networks and showing their applications. However, the labs and assignments could have done more in (a) helping us look a little deeper into the implementations of different NN building components, and (b) aligning better with the lecture videos.

Really good examples: the Week 1 labs and assignment illustrate the implementations of some of the basic layer classes and outline the overall flow of NN training with Trax. The Week 4 labs and assignment illustrate the implementation of the loss layer based on the triplet loss function.

Not-so-good examples: Week 1 spends a whole video explaining gradient calculation in Trax, yet there is no illustration of how it is integrated into backpropagation in Trax. The Week 2 videos and the labs/assignment are more disjoint: there is a video explaining the scan() function, but it does not show up in the assignment at all.
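[Editor's note] For readers unfamiliar with the triplet loss this review praises, the following is a minimal NumPy sketch of the standard Siamese-network formulation with hard-negative mining (the margin value and the mean-negative/closest-negative split follow the common recipe; this is an illustration, not the assignment's actual code):

```python
import numpy as np

def triplet_loss(v1, v2, margin=0.25):
    """Triplet loss with hard-negative mining for a batch of
    L2-normalized embeddings, where v1[i] and v2[i] are a duplicate pair
    and every v2[j], j != i, serves as a negative for v1[i]."""
    sim = v1 @ v2.T                       # pairwise cosine similarities
    batch = sim.shape[0]
    positive = np.diag(sim)               # similarity of each true pair
    off_diag = sim - np.eye(batch) * sim  # zero out the diagonal
    mean_neg = np.sum(off_diag, axis=1) / (batch - 1)
    # closest (hardest) negative: largest off-diagonal entry per row;
    # pushing the diagonal below -1 excludes the positive pair
    closest_neg = np.max(sim - 2.0 * np.eye(batch), axis=1)
    l1 = np.maximum(0.0, mean_neg - positive + margin)
    l2 = np.maximum(0.0, closest_neg - positive + margin)
    return np.mean(l1 + l2)
```

With perfectly matched embeddings (e.g. `v1 = v2 = np.eye(3)`) the loss is zero; misaligned pairs push it up through both the mean-negative and closest-negative terms.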

By Amlan C

Oct 9, 2020

Despite the theoretical underpinnings, I do not feel this course lets you write an NER algorithm on your own. The majority of these courses have been using data supplied by Coursera, and the same goes for the models. In real life we have to either create this data ourselves or use some open-source data, like from Kaggle or wherever. I think it'd be better if the course were oriented around publicly available data, with models trained by students to be used for actual analysis.

By Maury S

Mar 8, 2021

Like some of the other courses in this specialization, this one has promise but comes off as a somewhat careless effort compared to the usual quality of content from Andrew Ng. The lecturers are OK but not great, and it is unclear what the role of Lukasz Kaiser is beyond reading introductions to many of the lectures. There is a strange focus on simplifying with the Google Trax library at the cost of not really teaching the underlying maths.

By Business D

Dec 14, 2020

I regret the lack of proper guidance in the coding exercises, compounded by the incomplete documentation of the Trax library. I also feel we could build models with greater performance: an accuracy of 0.54 for the identification of question duplicates doesn't seem to be state of the art...

You could do better!

By Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology; you need to understand the methodology already before following the course material. Also, the introduction to Trax is fine, but I would prefer to have a version of the assignments in TensorFlow.

By A V A

Nov 13, 2020

A good course teaching the applications of LSTMs/GRUs to language generation, NER, and matching question duplicates using Siamese networks. It would have been more helpful if there were more depth in the topics.