The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.
Very good. I have no complaints. I thought the instruction was very clear. Assignments were very helpful and challenging enough that I learned something, but not so challenging that I got stuck too often.
By Evan J•
Week 4 is impossible to follow, both in the lectures and in the assignments, which is more than likely intentional, as this is the last assignment before you can stop paying their monthly fee.
By Moses O•
The unit tests in the programming assignments are poorly implemented. They will fail you if your code is not exactly as expected, even when it runs and returns the correct output.
By Saksham G•
TensorFlow and Keras basics are not covered, yet the course states no prerequisites. This was really disappointing.
By Harshvardhan B•
Not as good as the other 4 courses of the specialization.
By Yuri C•
What can I say after spending all these weeks in the digital company of Andrew? :) Well, I have no words but to thank Andrew and the team for making the effort, putting all this together, and pulling it off the way they did! I had already done the NLP Specialization by DeepLearning.AI and wanted to learn more about the inner workings of neural nets, and I was not disappointed by my decision for a single minute! Extremely well done overall! In particular, the course on Sequence Models is very enjoyable and will give you all the tools and intuition necessary to understand the background and basis of such models and how they work! You even have a last week on Attention-based models, which is a little introductory, given that Attention as a concept exploded after this course was developed. Nevertheless, it is very well executed! My only critique is that the whole specialization was developed before TensorFlow 2.0, and later on the assignments were ported and Keras is used. But during the lectures one will not find any introduction to Keras; all this content is left for the assignments. This requires more effort from the student to understand and use Keras, which can be a bit frustrating, because the Keras API also evolved during this time. I would suggest adding a couple of videos or ungraded labs in order to teach the student a bit better how to use the framework. But well, this is a minor issue. The 5 stars are for the overall amazing presentation of what is indeed important: the Deep Learning fundamentals. Moreover, Andrew's last video is very inspiring and a powerful message that everyone in the field should come in contact with! Congrats! I hope the team plans to release an updated version of the specialization soon, in order to incorporate the new advances and adapt it to the newest DL framework. Apart from that, the theory videos are 5 out of 5. Clear recommendation!
By Alejandro A•
A year ago I was basically a blank slate in regard to Machine Learning.
I started "my journey" into ML about 9 months ago with a textbook I got on Amazon called "Data Mining: Practical Machine Learning Tools and Techniques". Self-taught, I read, transcribed, did some math, and covered half of it. But I needed something more practical to speed up, so I also tried the courses from the "Super Data Science" team on Udemy, but found them too focused on practice rather than on deep reasoning (I might be wrong, but that's the impression I had). So I needed something more formal, university-like.
I decided to try out Andrew's first course on Machine Learning (with Matlab), which gave me a much greater view and understanding. It had my head melting, especially in weeks 4-6, but after finishing the course I felt I finally knew what ML was! Still, there was "a lot missing", given that the course was already a bit old and the technology had developed greatly since then.
Fortunately for me, I found out about this specialisation right after I finished the first course, and I signed up immediately. Today (14.4.2018) I finished the second specialisation, after 6 months of continuous dedication: the first 3-month course, plus this 3-month specialisation.
Homework in Matlab and Python was my next challenge, even though I have been a developer for 15 years (C# / Java, C). Combining a lot of new theory with a new language made it harder but also satisfying.
I'm the kind of person that needs to understand why things work the way they work; that might be my weakness but also my strength. It's not enough for me to drive the car; I need to know how to tune it. I must say that, for example, a video/lecture of 15 minutes usually meant 60 minutes of work for me: transcribing, doing the math, etc. That made my 6 months particularly long.
By Artem B•
This is again a fantastic course and what a nice way to finish the Deep Learning Specialization. It is certainly the most difficult one of the whole specialization and has taken me a lot longer than I planned. This is partially due to the fact that the focus is shifted a bit more towards the programming assignments, and concepts that are only briefly mentioned in the lectures turn out to be crucial for the assignments. The forum helps a lot; without it I would not have been able to crack the first week, especially the optional parts of the assignments. There were also a few errors in derivation formulas that set me back, but in the end I understood the concepts a lot better and found some nice complementary resources online. And RNNs are more complex and seem more variable than other network architectures, so it is OK that this course is more difficult. Now I feel that I finally have a good grasp of Deep Learning concepts and a nice set of skills. And the assignments are super fun and very useful. Thank you, Andrew Ng and your team, for making such wonderful content. I teach at the university level and I can only imagine how much effort goes into preparing such a course at such a high level of expertise. I encourage everyone to take this specialization; it is the main gem on Coursera, in my opinion.
By John Y•
It is apparent how much thought and effort has been put into creating these courses. Dr. Ng introduces you to state-of-the-art CNN and Sequence models which are quite complex. But he expertly presents it to you so that you can focus on the essential aspects and not the details. In courses 1-3, you might feel like you're being spoon-fed in the assignments but it is really a great approach to ease you into the deep learning field. In courses 4 and 5, there is less guidance so that you can become more independent and be able to figure things out on your own. After all, this is how it will be in our future jobs - no more TA's then.
One thing I really appreciated in this specialization was the use of good notation. For me this was very important because it made it easier to apply theory in practice (via the assignments). Another thing is the amazing selection of CNN and sequence model topics that were covered. Because of this, I now have a good idea where to focus my future projects/work. I also loved the assignments because they helped me understand the concepts much better.
For future students, please note that there are mini tutorials for Python (in Course 1), TensorFlow (in Course 2), and Keras (in Course 4). Keras is used a lot in Course 5 but there is no Keras tutorial in that course.
By Damon L•
From my perspective as a learner, there are two big "disasters" one may encounter:
1. A boring lesson to take.
2. Tons of confusing questions hanging in mind with no way out.
Coursera prevents those two biggest obstacles of learning from happening. Here's my experience:
1. While taking the DLS course lessons, I burst out laughing more than a few times.
The laughter comes from the joy of meeting something interesting, from the surprise of finding something very powerful, from the happiness of mastering something that can create everything: like a pencil to draw a boundless, dreamy, beautiful world, a piano to compose infinite wonderful music, magic to perform impossible miracles.
2. The discussion forum turns the potential disaster of lingering questions into the joy of exploring and the rewards of problem solving.
First, mentors respond to questions fast.
Second, they analyze questions very patiently and carefully, no matter how big or small they seem.
Last but not least, they give a lot of encouragement, which cultivates a free environment to ask, explore, experiment, and share.
Here I give all my credit to the lesson team and the forum team; they're the best dancing partners for each other. And now we learners are joining this wonderful dance floor! There is surely lots of joy and gain.
By Maksym P•
I really enjoyed the course. As usual, Andrew and his team of dedicated professionals did a wonderful job of explaining otherwise very hard material in an accessible way. The distinction of Andrew's classes is that they really give the *intuition* about why a particular approach works. Sure, I may forget which particular regularization methods exist, but I will remember *why* and *when* to use regularization. The details can always be looked up elsewhere.
I can't imagine how much effort it took to create high-quality slides, transcripts, and WELL-DOCUMENTED CODE(!) in the notebooks. Being a software engineer, I can't stress the importance of good documentation enough.
Since the notebooks already propose a well-designed NN architecture which gets the job done, what I'd like to see is maybe some reasoning about why *this* particular design was chosen, and not some other one. There are some explanations already, but even more explanations would not hurt :)
That said, it is an amazing course, so I can't recommend it enough! Thank you!
By Shibhikkiran D•
First of all, I thank Professor Andrew Ng for offering this high-quality "Deep Learning" specialization. This specialization helped me gain solid fundamentals and strong intuition about the building blocks of Neural Networks. I'm looking forward to a next-level course on top of this track. Thanks again, Sir!
I strongly recommend this specialization for anyone who wishes to get their hands dirty and wants to understand, with some curiosity, what really happens under the hood of Neural networks.
Some of the key factors that differentiate this specialization from other specializations:
1. Concepts are laid out from the ground up (i.e., you get to build models using basic numpy/pandas/python and then work all the way up to TensorFlow, Keras, etc.)
2. Programming assignments at the end of each week of every course.
3. References to influential research papers on each topic, with guidance provided to study those articles.
4. Motivational talks from a few great leaders and scientists in the Deep Learning field/community.
By Justin H•
This review applies to all of the courses in the Deep Learning Specialization. First, I want to thank Professor Ng so much!!! This Deep Learning Specialization was fantastic!! I feel more proud after completing this than I did after finishing the CPA exam!
I took Professor Ng's Machine Learning course as a prerequisite, which I would recommend to everyone before diving into the Deep Learning Specialization. The switch from Octave to Python can be a little tricky, but stick with it. Octave allows you to gain a deeper understanding of the Linear Algebra aspects and matrix multiplication than Python does (for me it did anyway).
The entire line up of courses prepares you so well to develop an eye for deep learning use cases and gives you the skills necessary to dive in and start applying deep learning solutions to real world scenarios.
I'm so proud to have completed this specialization and I cannot wait to start building my own models and come up with ideas to benefit society! :D
By Kevin M•
A terrific set of courses that builds deep learning skills in neural networks. The course guides the student through various time-based models that address speech recognition, music generation, sentiment classification, machine translation, video activity recognition, and named entity recognition.
The journey includes Recurrent Neural Networks (RNNs), language models and sequence generation for NLP tasks, Gated Recurrent Units (GRU), Long Short-Term Memory (LSTM), bidirectional RNNs (BRNN), deep RNNs, word embeddings for NLP, analogies, GloVe, sentiment analysis, and de-biasing. The final week covers sequence models with attention, beam search, the BLEU score, speech recognition, and finally trigger word detection.
The course takes work: attention to detail, patience with the programming exercises, and diligence in completing the videos, quizzes, and coding work. I highly recommend this course for the intermediate-level ML practitioner who has a Python background and wants an introduction to TensorFlow and Keras.
By Cyrille K•
Dear Prof. Andrew,
it is with great gratitude that I leave you this message. After following your Deep Learning specialization, I have finally reached the level that will allow me to achieve the goals of my projects, something I thought would take 5 years, but I did it in a 2-month interval. Your Deep Learning specialization is, in my opinion, the raw material for taking off in AI. Each of your 5 courses is like a meal that you never finish, even if you eat it all your life. I hope I'm not the only one of your students who has this enthusiasm; in any case, you have already received many testimonials about your courses on Coursera, of which you are a founder. Thank you so much for giving me a meal for which the appetite never ends; thank you for covering 80% of the subjects that are my goals. Thank you for Coursera. Every time I start watching one of your course videos, I want to stay there for as long as possible. Thank you for making me love AI again and again. May God bless you infinitely.
In the beginning, I found the instructor a little difficult to understand, even though he is very good at explaining complicated concepts simply. I am sure part of the reason is that I was unfamiliar with the technical terms. Once I switched on the captioning option, my comprehension improved; however, I noticed an average of at least one transcription error per video. These seemed to be caused by the instructor's accent and were sometimes very interesting errors. So, I guess the system could use a little more training on the specific AI vocabulary and/or some adjustment of the context error settings for the subject matter.
However, once I had the captioning on, it was harder to follow the notes because sometimes the important information was right under the captions. What was really helpful was when he summarized with typed versions for two reasons. One, it was clearer to read and understand. Second, it was higher on the screen and did not overlap with the captioning.
By Ryan M•
This is definitely a top-flight course and supremely useful! I learned many new things about practical applications of recurrent neural networks in this class and found the natural language emphasis to be very useful, particularly for certain problems I have been working on for some time! Professor Ng's lectures are very well-organized and clear and follow a very logical sequence. The assignments, especially the programming assignments, are well designed and do a very good job of building upon what is taught in the lectures and add a great deal of value to this class. I especially like the fact that we worked so much with Keras, which is an important framework for building Deep Learning systems and which is so widely used (it is the framework I often use in my own projects), and I acquired a lot of new knowledge about Keras thanks to this course. Overall, it was a superb learning experience, and I will be recommending this to both friends and colleagues.
By Sean O•
Good set of courses on Deep Learning. Some small complaints / recommendations:
- Courses don't teach enough Keras & Tensorflow syntax to be completely stand-alone. If you take this course, you won't really be able to build your own DNN's unless you also take a separate Keras / Tensorflow course.
- Links to Keras documentation are broken -- they now take you to the general Keras homepage, not the specific command's page.
- In later courses, Andrew Ng's lectures are not edited. Starting around the 4th course, you start hearing Dr. Ng stop and repeat portions of the lecture, presumably intending the first attempt to be edited out in the future. Usually this is easy to ignore, but in some cases he repeats 30-60 seconds of lecture, which can be confusing.
- In the last course (sequence models), the text captions of Dr. Ng's lectures have a lot of mistakes, which is a little ironic for a course on speech-to-text.
By Diego M•
During the past couple of months, I worked on this Deep Learning Specialization by deeplearning.ai (through Coursera). I think it is a great course for everyone interested in learning more about this topic, not only the theoretical aspects but also from a practical point of view. Andrew Ng does excellent work going through the theory and then leaving time for the practical exercises, which are the best and, at the same time, the most challenging part of this specialization. These exercises start with very basic material but quickly turn into interesting problems related to convolutional neural networks and face recognition, and end up with sequence algorithms for natural language processing.
If you are interested in building your own NN algorithms, learning about Keras and TensorFlow, and spending some time on applied exercises, then I recommend this course!
By Zeyad O•
I'm Zeyad, an undergraduate of Computer Engineering at Alexandria University in Egypt.
Taking this course really helped me learn and study this field and also implement what I learned. It helped me advance my knowledge. This course helped me define the Deep Learning field, understand how Deep Learning could potentially impact our business and industry, and write a thought-leadership piece on use cases and the industry potential of Machine Learning.
This specialization helped me identify which aspects of the Deep Learning field seem most important and relevant to us; apparently, they were all important. I walked away with a strong foundation in where Deep Learning is going, what it does, and how to prepare for it.
The Deep Learning specialization helped me achieve good learning and knowledge of the field.
Thank you so much for offering such a wonderful piece of art.
By Taras P•
It was an amazing course. From beginning to end, Andrew Ng has laid out all of the parts of the course extremely well. Of course, given the nature of RNNs and their complexity, it will also take effort on your part to make sure that you understand what he is talking about. Another note about the assignments: previous reviews have mentioned some of the problems and how the earlier courses had better-structured assignments. I think that the deeplearning.ai team has done a tremendous job of improving the content of this course's assignments. At moments it feels like you are lost, but deep explanations make sure that you understand everything and are able to implement all of the parts of the system. Please take this course!
By Arpad H•
I like the way Andrew introduces the topic, moving from the easier cases to the more difficult ones.
It would be better to use @ instead of np.dot. I like it better.
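For 2-D arrays the two are indeed interchangeable; a minimal sketch of the equivalence (with example matrices of my own choosing, not from the course):

```python
import numpy as np

# Two small 2-D matrices to compare the operators on.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# For 2-D inputs, the @ operator (np.matmul) and np.dot compute
# the same matrix product; @ just reads more like the math on the slides.
assert np.allclose(A @ B, np.dot(A, B))
print(A @ B)  # [[19. 22.] [43. 50.]]
```

Note that the two differ for arrays with more than two dimensions: `np.matmul` broadcasts over the leading (batch) axes, while `np.dot` takes a sum-product over the last axis of the first array and the second-to-last axis of the second.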
It would be nice to have a simpler method to download the notebooks with all the datasets, images, and helper modules, and also a description of what an assignment needs in order to run on my own computer.
Thanks for the opportunity to learn deep learning with Python. I am curious whether Julia, which is a kind of mixture of Python and MATLAB with parallel computing, will gain popularity.
As a Linux desktop user, I find the attached pptx files sometimes hard to read; there is no PowerPoint, just LibreOffice, on my laptop. I preferred the Machine Learning course's PDF files. But the notebooks are great.
By Glenn B•
Great topics and discussion; however, the lectures started to gloss over the details of implementation, which were left entirely to the exercises.
I started to get the basic hang of TensorFlow and Keras by this point in the series; however, it was a bit of cut-and-paste from previous exercises, thus still requiring a lot of forum review to sort out syntax issues.
I get the dynamic aspect of writing the lecture notes in the videos; however, the lecture notes should be "cleaned up" in the downloadable files (i.e., typos corrected and typed up). Additionally, the notes written in the videos could be written and organized more clearly (e.g., uniform directional flow across the page/screen rather than fit randomly wherever on the page).
By Nishant M K•
Great introduction to sequence models! Andrew, as usual, goes into detail on some seminal architectures that have shaped deep learning sequence models over the past ~decade. One piece of feedback: I wish there were a week dedicated to doing just backprop and gradient computations for plain RNNs as well as LSTM-cell-based RNNs. The latter is covered to some degree in the programming assignment as an ungraded part, but I don't think that does the topic enough justice. Another piece of feedback: sometimes keeping track of the dimensions of the various entities in play was very difficult for me. Perhaps 10-15 minutes dedicated to explaining just that would be helpful to students. All in all, a fantastic course!
By Adrian N K•
It was an unbelievable journey through this Deep Learning Specialization! I really felt the power of the tools I obtained during the past 3 weeks that it took me to pass all 5 courses of the specialization. Many of the Programming Assignments are demanding and in the end I could be extremely satisfied that I succeeded in taking them all. Thanks a lot to Andrew Ng and all involved for making this sequence of courses accessible to people like me, and presenting it in such an understandable and interesting way! Now, I can start thinking of the vast potential for using Deep Neural Networks not only in Research and Space Sciences, where my interests are, but also in my daily life. Very many thanks again! AJ
By Maurice M•
The whole series was excellent, but in particular this last course on RNNs. Thank you for not skipping the mathematical details, for letting us figure out backpropagation through time and how Adam works under the hood, and for explaining LSTMs and Attention so well. There was even a notebook on Attention! And the dinosaurus notebook was cool, but the jazz improvisation really blew me away: the music actually sounded really nice! :) Also, thank you for pre-training the models to save us time and for teaching us how to resume training from learned weights! The quizzes were helpful in developing an intuition, and the price point was more than fair. Perfect series, Andrew, thanks a lot!