
Learner reviews and feedback for Natural Language Processing with Sequence Models

892 ratings
178 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GLoVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
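The Siamese idea mentioned in item d) is simple at its core: both questions pass through one shared encoder, and a similarity score on the two resulting vectors decides whether they are duplicates. The sketch below illustrates only that structure; the hashing bag-of-words encoder and the 0.7 threshold are stand-in assumptions, not the course's trained LSTM encoder in Trax.

```python
import hashlib
import math

DIM = 32  # size of the toy sentence vectors

def encode(sentence):
    """Shared encoder: hash each token into a fixed-size count vector.
    (In a real Siamese model this would be a trained LSTM; the hashing
    trick here is just a hypothetical stand-in.)"""
    vec = [0.0] * DIM
    for tok in sentence.lower().split():
        bucket = int(hashlib.md5(tok.encode()).hexdigest(), 16) % DIM
        vec[bucket] += 1.0
    return vec

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def is_duplicate(q1, q2, threshold=0.7):
    # Both questions go through the SAME encoder -- the defining
    # trait of a Siamese architecture. The threshold is illustrative.
    return cosine(encode(q1), encode(q2)) >= threshold

same = is_duplicate("What is your age?", "What is your age?")  # True
```

In the actual course, the encoder is an LSTM trained with a triplet loss so that paraphrases land close together even when they share no words, which the toy hashing encoder above cannot do.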



Sep 27, 2020

Overall it was a great course. A little bit weak in theory, but sufficient for practical purposes. The question-duplication detection was a very cool model. I enjoyed it a lot.


Nov 11, 2021

This is the third course of the NLP Specialization. This was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.


Reviews 101 - 125 of 186 for Natural Language Processing with Sequence Models

By Divya S

Sep 19, 2021

I loved this course :)

By Mario A C F

Jun 26, 2021

Amazing Experience!

By Esakki p E

May 21, 2021

Best place to learn

By yeha

Sep 16, 2020

Can't wait for course 4

By Balaji V

Feb 20, 2021

Excellent Content!

By Kushagra P

Mar 11, 2022

Excellent course

By Deleted A

Sep 5, 2020

Very useful !!!!

By Mohammad B A

Dec 27, 2020

I am so happy

By Jyotin P

Nov 20, 2020

Amazing course

By Sohail Z

Sep 7, 2020


By Jose L L d J S

Sep 17, 2021


By larawang

May 7, 2022

Thank you!

By Chen

Oct 27, 2021

Thank you!

By Onuigwe V

Aug 29, 2020


By Zoizou A

Oct 25, 2020


By Yongxin W

Oct 2, 2020

so cool

By Rifat R

Sep 23, 2020


By Jeff D

Nov 15, 2020



Apr 22, 2021


By Ricardo F

Jan 15, 2021


By M n n

Nov 22, 2020


By Saoudi H

Sep 27, 2020


By Dave J

Feb 15, 2021

There are lots of good points. The instructors are knowledgeable; Lukasz Kaiser is one of the authors of Tensorflow and Trax. The material is generally presented in a clear way. The labs and assignments work smoothly. You learn how to implement significant NLP tasks in a modern framework (Trax).

There are areas where I felt the course could have been better.

The amount of taught material is only about half an hour of lecture per week. I felt that it covered the bare minimum to get you through the assignments, but I would have liked a lot more content, going into more depth on the concepts, how the performance of the models discussed compares to state-of-the-art models, and how it could be improved.

Having already done the Deep Learning Specialization, I was disappointed that this course did not build on that as a foundation. There is a lot of overlap between this course and course 5 of the DL Specialization, Sequence Models. To me, it would have made more sense to make that course a prerequisite, thus avoiding all the duplicated material and instead going beyond it.

The other area where there's some room for improvement, though it's not at all bad, is the teaching style, which is mostly reading from a script. I would like to see more effort to engage with the learner and think about what they might need to progress on their learning journey. For example, discuss the strengths and weaknesses of an approach and where it fits into the history and state of the art of the subject; anticipate questions or likely misunderstandings and try to cover them or point to supplementary material.

Overall, a good course that could have been a great course.

By D. R

Mar 22, 2021

I'm a master's/graduate student who took an NLP course at university.

I think that overall this is a very good introduction to the topic. Some concepts are really well explained - in a simple manner and with a lot of Jupyter notebook code to experiment with.

In general, the first 3 courses of this specialization are good. There are some quirks (e.g. why is Lukasz needed at all? He doesn't really teach, just passes things on to Younes), but nevertheless I learned from them, and I think they have good value.

The 4th one, however, is completely disappointing. The first 2 "weeks" are confusing and not really well explained, but somewhat "bearable". The last 2 weeks are a complete sham. They claim to teach "BERT" and "T5" but don't really give any value. You're better off going elsewhere to learn these concepts.

If it weren't for this, I would give the overall experience 5 stars, but because of it, I think the overall rating is more like 3 or 4.