
Learner reviews and feedback for Natural Language Processing with Sequence Models

892 ratings
178 reviews


In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....
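The 'Siamese' comparison mentioned in item (d) can be sketched in a few lines: two inputs pass through the same encoder (shared weights), and their encodings are compared with cosine similarity. The toy vocabulary, random embeddings, and mean-pooling encoder below are illustrative assumptions, not the course's actual model (which uses LSTMs in Trax).

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"how": 0, "old": 1, "are": 2, "you": 3,
         "what": 4, "is": 5, "your": 6, "age": 7}
E = rng.normal(size=(len(vocab), 16))  # shared embedding matrix

def encode(tokens):
    """Shared encoder: mean-pool word embeddings (a stand-in for an LSTM)."""
    ids = [vocab[t] for t in tokens]
    return E[ids].mean(axis=0)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Both questions go through the SAME encoder; that weight sharing is
# what makes the architecture "Siamese".
q1 = encode("how old are you".split())
q2 = encode("what is your age".split())
score = cosine(q1, q2)  # in [-1, 1]; a threshold decides "duplicate"
print(round(score, 3))
```

In a trained model, the encoder weights would be learned (e.g. with a triplet or contrastive loss) so that paraphrases score high and unrelated questions score low.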



Sep 27, 2020

Overall it was a great course. A little weak in theory, but for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.


Nov 11, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.


Reviews 176–186 of 186 for Natural Language Processing with Sequence Models

By Miguel Á C T

Mar 19, 2021

The course is good as an example of code that executes tasks correctly; that is, you can see how neural networks are defined and used in Trax. However, from a pedagogical point of view, I find it quite weak. Concepts are poorly explained and notebooks consist of little more than copying and pasting previously displayed code.

By George L

Mar 21, 2021

Compared with the Deep Learning Specialization, this specialization was designed in a way that nobody can understand. Although the assignments can be easy at times, the point is missed when people cannot really understand and learn. Bad teaching. Andrew Ng, please!

By Youran W

Dec 4, 2020

All the assignments are extremely similar.

By Xinlong L

Aug 22, 2021

I did not enjoy the course at all. It seems like the instructor is just reading materials rather than really teaching. He just focused on reading and did not explain anything. I took Andrew's Deep Learning Specialization, and that course was really great, but I am so disappointed in this one. Please apply strict quality control to the courses; otherwise it harms your brand.

By Yanting H

Oct 13, 2020

Oversimplified illustration of all core definitions, and it makes no sense to use Trax instead of a popular framework like TensorFlow or PyTorch for the assignments. Also, the assignment design is weak; you can barely learn anything from filling in the blanks.

By Ngacim

Nov 27, 2020

1) The course videos just throw out various terms, and you need to google them to understand what they mean.

2) The assignments try their best to explain concepts in a way that often seems redundant.

By Emanuel D

Jan 26, 2021

For me, it is very disappointing: time is spent on irrelevant things, like Python syntax and generators in the first week, while video tutorials on how to use Trax are missing.

By Siddharth S

Sep 19, 2021

Hard-to-follow explanations, and Trax absolutely made it super hard to learn and follow.

By Alistair M

Feb 19, 2022

Superficial descriptions of the topics; quality is definitely lacking.

By Alice M

Nov 15, 2020

No mentors were available or contactable during this course.

By Nicolás E C R

Dec 23, 2020

Very superficial.