
Learner reviews and feedback for Sequence Models by deeplearning.ai

4.8
26,488 ratings
3,125 reviews

Course Overview

In the fifth course of the Deep Learning Specialization, you will become familiar with NLP models and their exciting applications, such as speech recognition, music synthesis, chatbots, machine translation, and natural language understanding, all made possible by the evolution of sequence algorithms through deep learning. By the end, you will be able to build and train Recurrent Neural Networks and commonly used variants such as GRUs and LSTMs; apply RNNs to character-level language modeling; gain experience with natural language processing and word embeddings; and use HuggingFace tokenizers and transformer models to solve NLP tasks such as NER and question answering.

DeepLearning.AI is proud to partner with the NVIDIA Deep Learning Institute (DLI) to provide a programming assignment on machine translation with deep learning, giving you the opportunity to build a deep learning project with leading-edge techniques using industry-relevant use cases.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
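The overview above names RNNs and their gated variants (GRUs, LSTMs). As a rough illustration of what such a model computes, not taken from the course materials, here is a minimal NumPy sketch of a vanilla RNN cell stepped over a short sequence (sizes and names are made up for the example):

```python
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    """One forward step of a vanilla RNN cell:
    h_t = tanh(Wx @ x_t + Wh @ h_prev + b)."""
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

# Tiny example: input size 3, hidden size 4 (shapes are illustrative).
rng = np.random.default_rng(0)
Wx = rng.normal(size=(4, 3))
Wh = rng.normal(size=(4, 4))
b = np.zeros(4)

h = np.zeros(4)                       # initial hidden state
for x_t in rng.normal(size=(5, 3)):   # a sequence of 5 input vectors
    h = rnn_step(x_t, h, Wx, Wh, b)

print(h.shape)  # (4,)
```

Gated variants like the GRU and LSTM replace this single tanh update with learned gates that control how much of the previous hidden state is kept or overwritten.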

Top Reviews

AM
Jun 30, 2019

The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.

JY
Oct 29, 2018

The lectures cover lots of SOTA deep learning algorithms and are well designed and easy to understand. The programming assignments are really good for deepening understanding of the lectures.

Filter by:

Reviews 2326-2350 of 3,095 for Sequence Models

By 王世超

Mar 31, 2018

good

By Jinjun S

Mar 25, 2018

good

By Lie C

Mar 4, 2018

nice

By Xiujia Y

Mar 2, 2018

good

By Tất T V

Feb 11, 2018

good

By Han C

Feb 6, 2018

Good

By 华卓隽

May 13, 2019

666

By 莫毅啸

Aug 3, 2018

ths

By 黄家鸿

Jun 12, 2018

Very good

By 雷后超

Apr 20, 2018

666

By Sylvain D

Feb 12, 2018

top

By Mohamed M

Sep 27, 2020

<3

By Parth S

Jan 3, 2020

kk

By 林韋銘

Aug 26, 2019

gj

By Pham X V

Nov 6, 2018

:)

By srikanta p

Apr 15, 2021

A

By Abdou L D

Jul 15, 2020

-

By jainil k

Aug 11, 2019

-

By Musa A

Jul 9, 2019

A

By 郑毅腾

May 14, 2018

i

By wangdawei

Mar 30, 2018

By Mathias S

Apr 22, 2018

The Sequence Models course was the one I sought out in the Deep Learning Specialization. Very interesting assignments, e.g. neural machine translation, music composition, etc., much more interesting than the convolutional network models, in my opinion. However, it is also much more difficult to follow; probably the most difficult of the five courses.

Prof. Ng did a wonderful job delivering the materials, as always. However, I expected a lot more detail about sequence models and recurrent networks, as much as was given in the previous courses. I was looking forward to learning more in depth about these models, but I didn't feel I got all that I wanted. For example, I wish there were a step-by-step walkthrough of the backpropagation through time (BPTT) algorithm, especially for the LSTM and GRU models.

The assignments were a little more difficult to follow, I think. The instructions were not as clear as in the previous courses, especially when using Keras objects/layers: "use this *object/layer*", but it wasn't clear whether or not to fiddle with the arguments. Usually, when an argument does require a specific value (e.g. axis=x), it is mentioned either in the text or in code comments. I guess it's a good challenge, but I found myself doing more trial and error with the coding to get it to work instead of having some guidance on how to use those Keras objects/layers. The discussion forums do help, however. Lastly, some of the assignments involved building a recurrent model using Keras layers, and I felt there was not enough explanation of why the particular architecture, layers, or hyperparameter values were chosen.

Overall, I liked the course, learned a lot from it, and enjoyed the models we got to play with in the assignments. I think I will still run into problems trying to devise my own sequence models and will fumble with Keras. I wish there were a more in-depth course on sequence models. Prof. Ng's delivery was excellent; I enjoyed listening to every one of his lectures (even at 2x speed) :)

Thank you to Prof. Ng, and all the people who worked hard to develop the course.
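The reviewer above wishes for a step-by-step BPTT walkthrough. As a hedged illustration, not course material, here is a minimal NumPy sketch of BPTT for a vanilla RNN with the toy loss L = 0.5 * ||h_T||^2 (all sizes and the loss are made up for the example), with a finite-difference check that the analytic gradient is right:

```python
import numpy as np

def forward(xs, h0, Wx, Wh, b):
    """Run a vanilla RNN forward, caching every hidden state for BPTT."""
    hs = [h0]
    for x in xs:
        hs.append(np.tanh(Wx @ x + Wh @ hs[-1] + b))
    return hs

def bptt_dWh(xs, hs, Wh):
    """Backprop through time for L = 0.5 * ||h_T||^2:
    walk t = T..1, pushing the gradient through tanh and accumulating dWh."""
    dWh = np.zeros_like(Wh)
    dh = hs[-1].copy()                # dL/dh_T for this toy loss
    for t in range(len(xs), 0, -1):
        dz = dh * (1.0 - hs[t] ** 2)  # tanh'(z) = 1 - tanh(z)^2
        dWh += np.outer(dz, hs[t - 1])
        dh = Wh.T @ dz                # gradient carried to the previous step
    return dWh

# Tiny demo with illustrative sizes, plus a finite-difference sanity check.
rng = np.random.default_rng(1)
Wx, Wh, b = rng.normal(size=(3, 2)), 0.5 * rng.normal(size=(3, 3)), np.zeros(3)
xs, h0 = rng.normal(size=(4, 2)), np.zeros(3)

hs = forward(xs, h0, Wx, Wh, b)
dWh = bptt_dWh(xs, hs, Wh)

def loss(W):
    h = forward(xs, h0, Wx, W, b)[-1]
    return 0.5 * float(h @ h)

eps = 1e-6
Wp, Wm = Wh.copy(), Wh.copy()
Wp[0, 1] += eps
Wm[0, 1] -= eps
numeric = (loss(Wp) - loss(Wm)) / (2 * eps)
print(abs(numeric - dWh[0, 1]) < 1e-5)  # analytic gradient matches
```

The LSTM and GRU versions follow the same pattern, but each gate adds its own chain-rule branch at every time step.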

By D. R

Oct 1, 2019

(09/2019)

Overall, the courses in the specialization are great and provide a great introduction to these topics, as well as practical experience. Many topics are explained clearly, with valuable insight from field practitioners, and you are given quizzes and code exercises that help deepen the understanding of how to implement the concepts in the videos. I would recommend taking them after the initial Andrew Ng ML course by Stanford, unless you have prior background in this topic.

There are a few shortcomings:

1 - the video editing is poor and sloppy. It's not too bad, but it can sometimes be a bit annoying.

2 - most of the exercises are too easy and are almost copy-paste. I need to go over them and create variations of them in order to strengthen my practical skills. Some exercises are quite challenging, though (especially in courses 4 and 5), and I need to go over them again just to really nail them down, as things scale up quickly. Course 3 has no exercises, as it is more theoretical. Some exercises have bugs, so make sure to look at the discussion board for tips (the final exercise has a huge bug that was super annoying).

3 - there are no summary readings; you have to (re)watch the videos to check something, which is annoying. This is partially offset because the exercises themselves usually contain a lot of textual summary, with equations.

4 - the 3rd course was a bit less interesting in my opinion, but I did learn some things from it. So in the end it's worth it.

5 - slide graphics and Andrew's handwriting could be improved.

6 - the online Coursera Jupyter notebook environment was a bit slow and sometimes got stuck.

Again, overall: highly recommended.

By Kayne A D

Mar 6, 2020

Note: this review also applies to the specialization as a whole. I thoroughly enjoyed the courses and learnt so much. The content delivery was excellent. I am not sure how feasible it would have been to re-produce the videos, but it would have been nice to see fewer corrections.

Regarding the programming assignments, I think they are great overall. The way the code was mostly pre-filled to ensure a logical workflow helped me stay on track rather than trying to skip straight to the final model (I assume beginner programmers do this a lot). It provided a great template for thought, and I will definitely be referring back regularly. However, I also think that descriptions and exercises could be integrated better, especially in the later assignments, where students are compiling more complex models and prior programming skills (from earlier courses) are expected. Specifically, certain exercise descriptions have multiple parts (1a, b, c; 2a, b, c; etc.) and hints before you even get to the start of the actual implementation. I think this is inefficient and was a bit frustrating at times. It felt a bit disjointed and overwhelming to read the instructions without any context (i.e., reading about step 2c before implementing step 1a).

On the whole, I know that the knowledge, understanding, and skills I obtained through this specialization will serve me extremely well throughout my PhD. Thank you very much.

By Gary G

Mar 5, 2018

I enjoyed Prof Ng's excellent lectures, but felt the material moved too quickly. This 3-week course could easily have been extended to (say) 5 weeks to allow for more depth in covering the various RNNs, applications and model details.

The homework/programming assignments were more difficult and time-consuming than in prior courses, particularly for implementing models with Keras. The structure of these programs was hard to understand (a bit of spaghetti code, in my opinion). Some experience with Keras and TensorFlow is essential. I spent a lot of time just trying to construct the programs with correct syntax, etc. While this is useful to know, I'd rather focus more on the fundamentals of the learning algorithms.

However, it's clear that a lot of effort went into constructing the programming exercises for this course, and they covered a lot of ground, with a bit more sophistication than the exercises from most of the earlier courses.