The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.
The lectures cover lots of SOTA deep learning algorithms and are well-designed and easy to understand. The programming assignments really enhance the understanding of the lectures.
By Vikram R•
This course is almost as good as the prior four, but some of the lectures lack detail, there are mistakes in some quizzes, and the programming assignments at times are crammed too full of information. You can end up passing through this class without really understanding what's going on, whereas the CNN class does a much better job of forcing you to understand things before you pass.
By Daniel Z•
Excellent lecture content.
Some of the programming assignments are quite poor. Sometimes there are minor mistakes in function descriptions, and other times the whole assignment architecture/plan is not well thought out. If the staff doesn't have resources to improve this, then allow the community to create branches and submit merge requests :)
Overall, I'm happy with this course.
By Paulo V•
The lectures were great, making an advanced subject accessible. The course materials were mostly good -- the exception being the optional (non-graded) assignment in Week 1, which was not well-structured, and failed to reinforce the concepts it was intended to. There were challenges with connectivity to the Jupyter notebook server, which caused much frustration and wasted time.
By Christopher M•
Another great course by Prof. Ng. The reason for 4 stars is that I found the assignments to gloss over a lot of new Keras ideas (for Keras beginners) at the expense of spending more time on how the ideas were being implemented. I think the course should be spread out over more weeks, say 5, and spend the extra time going into more depth around the Keras model architectures.
By Frank H•
In the lecture videos there were quite a few repetitions, and in the programming exercises the necessary Keras background was not provided. For this I have to subtract one star.
The course's contents are very inspiring, challenging and interesting at the same time. I'm really looking forward to applying the techniques learned so far to problems in my business life.
By Nicolás A•
The course could have covered topics like time-series modeling for prediction (sales, demand, a machine failure in a factory, etc.), which is much more applicable than some of the assignments proposed here (half of them seemed to be just for fun). Also, I am a little disappointed that the course didn't cover chatbots, which is one of the most widely used applications for RNNs.
By Dawar H•
The course was nice, but more mathematics could be taught in the lectures, especially backpropagation in recurrent networks. Also I feel there could be one more week in this course where recent models like Transformers and BERT could be taught. Overall a nice course to get familiar with Word Embeddings, LSTM, GRU, and some other topics like Translation and Speech Recognition.
By Edward C•
The discussion felt really complicated at points. Also I was disappointed not to be able to complete the optional assignment on LSTM backpropagation. Since it is ungraded, it would have been nice to at least see the correct implementation to learn from. Also there were several errors in the expected values or instructions in the assignments, which were really confusing.
By Shringar K•
The instructor Andrew Sir is excellent in conveying topics, but I just found the last part a bit dry compared to the previous 4.
And the course was a bit too long, even though it said 3 weeks.
But the hands-on programming practice in this course especially is second to none. Top notch.
You would need to revisit and do it all over again to make it stay in your head.
By Karl M•
This course really shows cutting-edge technology, such as using deep networks consisting of LSTMs, GRUs, etc. I especially liked the audio trigger-word recognition.
The translation with attention exercise is really much harder to understand than any other exercise from that specialization. I admit I have managed to implement it more using intuition than real understanding.
By P M K•
It has been quite a good course for explaining the tedious concepts of RNNs.
The only reason for a 4 star is there is definitely quite some room to improve upon the content and quality to bring it up to the mark of the previous 4 courses. There are quite a few bugs in the assignments which need to be rectified for the benefit of everyone, hope that it shall be done soon!
By Matt C•
Concur with other reviewers: this class was good, covering a lot of interesting material and with well-structured quizzes & assignments. But the lectures seemed to skip past the sorts of in-depth explanations I wanted, instead just getting to the end point of "this is what this looks like". So good, but not quite as good as previous courses in the specialization.
By Shikhar C•
This course is great for getting an intuitive understanding of Word Embeddings, RNNs, LSTMs, GRUs, and Attention Models.
You will have great explainer videos and some excellent programming exercises. The course does not make you an expert, but it does make you familiar with the above mentioned architectures, so you can independently code and try them on your own solutions.
By Duncan K M•
Really cool applications to work on, but the videos got a little too much into specific applications that may not be relevant most of the time. It was all interesting, but it made this course a lot longer each week. I could have done without a lot of the specifics of certain applications, just because it will be hard to apply/remember the concepts anyways.
By Eric F•
All courses in this specialization are awesome. However, this last course feels a little rushed in comparison with the other 4 courses. While the first 3 courses raise your knowledge of ANNs in preparation for the 4th one, it is a little more difficult to understand this 5th course. Likewise, completing the assignments is possible, but more frustrating.
By Jörg J•
Guys, just the truth: Content: Great. Mr. Ng: Great. Autograder: Complete and utter BS. If you rework the infrastructure you will be big. If you further refuse to do so (literally thousands of complaints about the autograder in the forums -> nothing happens) you will not. Check out the Scala courses' approach to grading -> works like a charm. Cheers, JJ
By Robert P•
The content is generally great and well worth it. I wish they would fix some of the errors, especially in ungraded exercises. You end up wasting a lot of time because of them. Perhaps the most frustrating aspect is navigating to the Jupyter notebooks. I wish the links to the notebooks were on the same pages as the Submission and Discussion links.
By Sung W K•
I learned a lot. I would give 5 stars, but the Jupyter notebooks were very, very buggy. I spent half of my time on the homework going through the forums to find workarounds. It took away from learning the material efficiently.
Note that I think this may be a temporary problem, as a new platform was released in Jan 2019. The content was terrific.
By Elena B•
The course is very interesting and it gives an insight into recurrent neural networks (RNNs). The practical exercises are interesting, but I found them in a somewhat raw state compared to the previous courses of the Deep Learning Specialization. Nevertheless, I would still highly recommend following this course. Thanks a lot to the organizers.
By Jungwon K•
Everything seems logical, except the programming assignments. Although I went through week 1 programming assignments only, I often had to face some problems with insufficient information. Lecture videos are easy to understand, but not all the details are explained. (This is the point where I need to find some information by hand.)
Videos are great; but as usual the practical exercises (TPs) are too guided (hence boring) and do not use today's frameworks (PyTorch, TensorFlow 2). TPs should either be completely coded by candidates (only an introduction + refresh on concepts + objectives), with evaluation on final accuracy/F1 score, or there should be no TPs at all and more MCQ tests.
By Charles B•
Content here is great - the first week covers the basic RNN models in a very clear way, and the assignments are interactive and interesting, building on the explanations in lectures. One downside is that the production quality is poor and would benefit from some re-recording to remove bloopers and make it smoother to watch.
By Chinmay P•
I wish it was a bit more interesting. It also kinda feels like Andrew has a bit of a problem himself in understanding the paradigms stated in this course, and that makes me feel somewhat confused as well. Would recommend for the math, the notations are weird and confusing sometimes but it is understandable for most parts.
By Artem M•
This is a very interesting course with good explanations, which give a brief but sufficient introduction to sequential models like GRU and LSTM. One star is dropped because the CNN course (#4) is still better than this one in terms of explanations, while course #2 is better in terms of relevant material and pace (to me).
By Pascal P Z Z•
Although I really, really, really love this series, and although I have always given 5 stars, I think the quality of this last module is noticeably lower than that of the previous ones. I think convolution was way more difficult, but the explanation was awesome. Unfortunately, I think the explanations in this module are a little sloppy.