
Learner reviews and feedback for Sequence Models by deeplearning.ai

4.8
25,106 ratings
2,943 reviews

About the Course

This course will teach you how to build models for natural language, audio, and other sequence data. Thanks to deep learning, sequence algorithms are working far better than just two years ago, and this is enabling numerous exciting applications in speech recognition, music synthesis, chatbots, machine translation, natural language understanding, and many others. You will:

- Understand how to build and train Recurrent Neural Networks (RNNs), and commonly-used variants such as GRUs and LSTMs.
- Be able to apply sequence models to natural language problems, including text synthesis.
- Be able to apply sequence models to audio applications, including speech recognition and music synthesis.

This is the fifth and final course of the Deep Learning Specialization. deeplearning.ai is also partnering with the NVIDIA Deep Learning Institute (DLI) in Course 5, Sequence Models, to provide a programming assignment on Machine Translation with deep learning. You will have the opportunity to build a deep learning project with cutting-edge, industry-relevant content.
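For readers unfamiliar with the models described above, one forward step of an LSTM cell can be sketched in NumPy as follows. This is an illustrative sketch only, not code from the course; the weight names (Wf, Wi, Wc, Wo) and shapes are this example's own assumptions.

```python
import numpy as np

def lstm_step(x_t, a_prev, c_prev, params):
    """One forward step of an LSTM cell (illustrative sketch).

    x_t    : input at time t, shape (n_x,)
    a_prev : previous hidden state, shape (n_a,)
    c_prev : previous cell state, shape (n_a,)
    params : weights W* of shape (n_a, n_a + n_x) and biases b* of shape (n_a,)
    """
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    concat = np.concatenate([a_prev, x_t])                    # stack hidden state and input
    f = sigmoid(params["Wf"] @ concat + params["bf"])         # forget gate
    i = sigmoid(params["Wi"] @ concat + params["bi"])         # update (input) gate
    c_tilde = np.tanh(params["Wc"] @ concat + params["bc"])   # candidate cell state
    c = f * c_prev + i * c_tilde                              # new cell state
    o = sigmoid(params["Wo"] @ concat + params["bo"])         # output gate
    a = o * np.tanh(c)                                        # new hidden state
    return a, c

# Tiny demo with random weights (shapes are the only thing being checked here)
rng = np.random.default_rng(0)
n_x, n_a = 3, 4
params = {k: rng.standard_normal((n_a, n_a + n_x)) * 0.1 for k in ("Wf", "Wi", "Wc", "Wo")}
params.update({k: np.zeros(n_a) for k in ("bf", "bi", "bc", "bo")})
a, c = lstm_step(rng.standard_normal(n_x), np.zeros(n_a), np.zeros(n_a), params)
print(a.shape, c.shape)
```

A GRU follows the same gating idea with a merged cell/hidden state and one fewer gate, which is why the two variants are usually taught together.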

Top Reviews

AM

Jul 01, 2019

The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.

WK

Mar 14, 2018

I was really happy because I could learn deep learning from Andrew Ng.

The lectures were fantastic and amazing.

I was able to grasp the really important concepts of sequence models.

Thanks a lot!

Filter by:

2826 - 2850 of 2,915 Reviews for Sequence Models

By Juan I G P

Mar 13, 2019

Not as well structured, in either the explanations or the programming assignments, as the previous courses.

By Mohsin J

Jun 07, 2018

This course is merely adequate, not a good one. All the previous ones were definitely 5 stars!

By Thomas P

Apr 09, 2020

Overall a great class. I had some trouble understanding the programming assignments.

By Archana A

Oct 07, 2019

This felt like the least prepared and organized course of the series, unfortunately.

By Sebastien C

Aug 18, 2020

Good theoretical overview - the projects just require you to fill in lines of code.

By Samit H

Aug 18, 2020

I found this course boring, and there are too many assignments in a single week.

By Tushar B

Jun 12, 2018

Issues with the assignments. It took more than 4 hours to figure out the problem.

By Saeif A

Jan 03, 2020

This was the least clear course of the series. The others were great!

By Ragav S

Sep 18, 2019

Would like to learn a bit more about how backpropagation works when using attention.

By Gaetan J d B

Jun 17, 2019

Fairly more complex and deeper than the previous courses. Nice exercises, however.

By Yun W

Apr 06, 2019

I feel this course is not as carefully designed as the previous courses.

By mayukh m

Apr 16, 2020

The bug in Trigger word detection - v1.ipynb is annoying. The course is good.

By yuichi k

Jul 27, 2020

Almost entirely in English; solving the problems in the programming assignments was very hard. There were also many bugs, so working through them took real effort. The videos were excellent, as always.

By Prateek S

Apr 22, 2020

Good course, but the lectures and assignments could have been better.

By bernd e

Mar 10, 2018

Should be five weeks instead of three. Dive deeper into the details.

By Ar-Em J L

Oct 30, 2019

One of the weaker courses in the specialization. Felt rushed.

By Danilo G F R

Feb 06, 2018

The assignments are too complicated without the necessary guidance and help.

By André T D S

Oct 02, 2018

Bugs in the programming assignment grading kill the flow.

By Rajarshi K

May 30, 2020

This course was boring compared to the previous 4.

By Santosh B

Feb 19, 2019

I felt the last week had too many things packed together

By Shrishty C

Jul 06, 2018

Was a little hard to understand at times. But it was good.

By Konpat P

Feb 16, 2018

Not as well done as before, but still very informative.

By Edoardo B

Nov 15, 2019

Doesn't teach much about Keras, which is sorely needed.

By Rajesh R

Feb 25, 2018

GRUs are poorly explained. Unable to get past Week 1.

By Kenzi L

Jul 19, 2020

A bit outdated, since LSTMs are no longer quite state-of-the-art.