If you want to break into cutting-edge AI, this course will help you do so. Deep learning engineers are highly sought after, and mastering deep learning will give you numerous new career opportunities. Deep learning is also a new "superpower" that will let you build AI systems that just weren't possible a few years ago.
In this course, you will learn the foundations of deep learning. When you finish this class, you will:
- Understand the major technology trends driving Deep Learning
- Be able to build, train and apply fully connected deep neural networks
- Know how to implement efficient (vectorized) neural networks
- Understand the key parameters in a neural network's architecture
This course also teaches you how Deep Learning actually works, rather than presenting only a cursory or surface-level description. So after completing it, you will be able to apply deep learning to your own applications. If you are looking for a job in AI, after this course you will also be able to answer basic interview questions.
This is the first course of the Deep Learning Specialization....

Jan 04, 2020

First, I want to thank the course teacher and everyone else for providing us such a wonderful course. The way the professor teaches is really very helpful. Thank you all again and keep it up.

Jun 01, 2020

It's really quite an amazing course where we get to learn the mathematics behind the Neural Networks. It is great to learn such core basics which will help us further in developing our own algorithms.


By Long H N

Dec 10, 2017

N/A

By Amit P W

Sep 30, 2018

Hello Andrew Ng Sir & Coursera Team,

Tell your instructors about yourself.

My name is Amit Wadhe. I am a software engineer working at Walmart in Bangalore, India, with 4 years of experience. Before Walmart I worked for Morgan Stanley. I did my Bachelor of Technology in Computer Science and Engineering. I was always passionate about computers from my school days. Out of curiosity I took my first C language class in 10th standard (school), which meant a daily 180 km round trip by train for one month from my hometown to Akola city; at that time no computer courses were offered in my hometown. After school, I decided to pursue engineering in the Computer branch. I think that is enough about me in short.

Why did you take the course? How has it helped you?

I have been working mainly on Java applications for the last 4 years professionally. In the last couple of years I realised that it's not something that excites me, not something I wanted to work on. I wasn't sure what I wanted to work on or what excited me. I had been hearing bits and pieces about Machine Learning and Artificial Intelligence from friends and colleagues for a long time. My perception of AI was that it's something big, something like rocket science, something not for a normal professional. But the real trigger came when I saw a video about a self-driving car in Silicon Valley. At that moment I felt: yes, I want to work on something like this, something that can be useful in real, day-to-day life. I started searching for ML courses on Google and saw multiple courses on Udemy and Coursera. I read feedback about some of them. I started with some Udemy courses on ML for beginners, but they focused only on how to code rather than how things work internally. I was more interested in how ML models work internally and the mathematics behind them; as a Java developer I knew the coding was not a big deal, and I had been interested in mathematics since my school days, though I didn't score at the top. Then I started with ML by Andrew Ng on Coursera. After completing that course, I felt: yes, this is what I was looking for. My curiosity about deep learning deepened, and I started looking for more courses by Andrew Ng on Deep Learning.

This course helped me clarify my understanding of how a neural network works mathematically. I knew bits and pieces about neural network steps like forward propagation and backward propagation, but that was partial knowledge. After completing the course I got that satisfying feeling: yes, I know it now; I understand it inside and out.

What did you love about the course? Tell them!

"I loved the bottom-up approach of Andrew Ng Sir explaining concepts and unveiling the treasure."

Irrespective of background, I think anyone can understand the course with some knowledge of matrices and linear algebra. The required knowledge from previous lectures is briefly recalled before diving into each concept. The pace of the course also helps you grasp the concepts easily, and the very intuitive examples help you understand them faster. The example I liked most is the neural network model for housing price prediction, where Andrew Sir gave an intuition for the hidden layers that really connects to real-life examples.

By Sarah R

Dec 29, 2018

This course was insanely clear and meticulously constructed. As someone who does data science work professionally, I so appreciated the thought that went into the design of the videos and the programming assignments. You are seeing really exemplary code and also really sophisticated use of the Jupyter notebook! Also, the test cases are so well-constructed. You really get to *see* all of this stuff working or not with the carefully designed helper functions that allow you to visualize the decision boundaries and view training examples. Of course, the writing of these helper functions is no small feat. IT WILL NOT BE LIKE THIS WHEN YOU CAST OUT ON YOUR OWN. But, what this course does for folks (like me) who didn't have the benefit of a course like this in their formal schooling (perhaps they are too old and this stuff only got well-organized and codified more recently) is provide exemplars. Will your code always look like this for everything you build? No. But it shows you, using the exact technology that you are likely to employ professionally (TensorFlow is coming up in the next course), what is possible. I look forward to the rest of the specialization.

A note on the pacing: Perhaps because I am already very familiar with python, numpy, and Jupyter notebooks, I was able to complete this course in about two days (rather less than 4 weeks). However, I still got a ton out of it. I think it is paced the way it is so as to be viewed as more accessible by everyone, and also not with the assumption that you want to dedicate the majority of a weekend to it. Probably also there is something to the psychology of completing it so very ahead of schedule that the designers of this specialization are not altogether unaware of. But, if you, like me, know that you want a refresher on neural nets that is going to be practical and useful, in that it will help you both implement them AND understand what you're doing, this is a quick and effective way to jump back in.

Finally, since this is such a quick course, I really recommend NOT skipping it, even if you want to get to the more advanced topics in the rest of the specialization quickly. The course is so thoughtfully designed and concepts are introduced in a very specific and intentional way to make sure you understand each step before the course progresses. Based on having experienced this careful design, I expect the notational and programming conventions established in this course will make the next courses in the specialization more accessible.

In conclusion, this is I think the best online course with integrated programming exercises I've ever taken. I think it might be a standard-bearer for the whole field. Well done!

By Jeremy W G

Apr 25, 2018

Copied and pasted from a survey I wrote earlier.

In 2012, I graduated with a statistics degree (BS) from the Midwest, where many companies hire data scientists to do simple analytics work. With my dream of doing more predictive modeling work, I decided to go to the West Coast and join the University of Washington to study statistics in the master's program. One reason was that UW offered a great statistics program in which most students chose to continue to the Ph.D. program. The other reason was that Seattle had a few great high-tech companies where I could explore opportunities. However, although the MS program gave me a strong background in statistics theory, I found the industry moved so fast that my knowledge was falling behind industry needs. In 2013-2014, I took Andrew's ML course on YouTube, and Amazon hired me as a data scientist in the marketing department of its cloud computing division (AWS). I figured that as a stats major I didn't have knowledge of cloud computing or marketing, so in 2015 I took Coursera's Big Data specialization from UC San Diego and the Digital Marketing specialization from UIUC. Later, I found another ML job at Amazon, using a lot of big data tools (Hadoop, Spark, etc.) on AWS. After a year of settling down in San Francisco, this year I decided to pick up knowledge of deep learning. The first course of DL was fundamental but contained so much information that sometimes I needed to review it several times, because I had forgotten many of the statistical theories from school. I thought it would be a very hard course, but Andrew did a great job designing the curriculum, balancing theory and application well for working people like me to start with. The amount of homework was much easier than I anticipated. I think students who want the real challenge of coding should hide Andrew's hints and write their own functions. Overall, I like the Coursera courses and will continue to learn.

By AEAM

Jun 11, 2019

This course is great! I wish they would release a new version where the math is explained visually instead of just handwritten by Dr. Ng. I think having to work with a small tablet really hampered his ability to develop the ideas, as he was always trying to pack a lot of information onto one iPad screen. I would think he could just stand in front of a whiteboard and write on it, and maybe hire a sound technician this time, because despite the really high-quality content of this course the audio is terrible, and with the iPad screen not really doing justice to the writing, it really takes multiple viewings to figure out what's going on.

I would also suggest that Dr. Ng really should explain which one is which when he is using Y vs y and X vs x... I'm sure it's crystal clear in his mind, but for newbies like me it can be confusing at times when they write x but mean X (and vice versa)...

I still think this course is brilliant and it really cleared up many concepts in my mind. It answered a lot of questions I'd had after watching the fast.ai course. So if you're doing the fast.ai courses, you should definitely at least audit the deeplearning.ai specialization courses, and tbh, $50/mo is a steal for the calibre of information on offer (video/audio and iPad issues notwithstanding).

Work through it and you will find it extremely rewarding! Don't give up, keep going, and if you feel frustrated, take a break and rewatch the videos the next day after a good night's sleep. It really helped me that I watched and rewatched the video lectures, did the quiz, failed, and came back to understand why I couldn't answer the quiz questions. Good luck to all, and thank you to Dr. Ng for making this available to us free of charge (if we wish to audit). I would buy the specialization, though, since it is worth every penny and then some!

By Dave J

Feb 09, 2020

Good introduction to implementing shallow and deep neural networks in Python. If you have no knowledge of neural networks or Python, I'd suggest doing a little preparatory study first so that you know what a neural network is and feel comfortable writing short Python programs.

Theory: the course is not heavy on machine learning theory. I had covered the theoretical parts previously in other courses. This course provided a useful summary of these and left me feeling confident that I had a good overview.

Maths: this course doesn't place great emphasis on the mathematics. It shows you the relevant equations, with the emphasis on understanding the underlying concepts rather than going through detailed derivations. Sometimes there's an optional extra video going through the equations in a little more depth. A frequent message is: don't worry if you don't understand all the mathematical detail, you can still learn to implement neural networks effectively.

Implementation: the course uses the Python NumPy library throughout. It does not go into deep learning frameworks such as TensorFlow or PyTorch. From the outset, you are taught to use NumPy in an efficient ("vectorized") way. The programming exercises are well thought through and I found that they all worked smoothly, a pleasant change from some other courses elsewhere.

Overall I found this to be a gentle but satisfying introductory course to the Deep Learning specialisation. Andrew Ng is an excellent teacher. His manner is both calm and enthusiastic and he clearly cares about equipping students with the skills that they need and doing so in an accessible way. The optional "Heroes of Deep Learning" interviews were particularly interesting, full of gems and hints about what could lie ahead if you decide to go more deeply into the field.

By Dejan Đ

Nov 06, 2017

TL;DR: Very much worth taking if you're looking to get into the field and develop a (much) deeper understanding of the underlying theory and the necessary infrastructure.

I first gave it 4 stars and then changed to 5; let me tell you why. If you're reading this review, you are most likely considering taking this course, and you very likely have some idea of what Deep Learning is supposed to be. You're also probably aware of the "black magic" stigma surrounding the field, and that it is going to take some time to get used to the way of thinking, even if you have some experience in "conventional" machine learning. Well, this course (read: its creators) also understands all of those points extremely well. With that in mind, the course caters to people who are making their first steps in the field of DL, people who are not expected to have a high degree of expertise in dealing with DL models, and especially not in creating them. Students are expected to understand about 85% of the underlying theory in order to get the models working (the rest is mostly the calculus needed for deriving certain more difficult gradients), and the coding assignments include a considerable amount of hand-holding. That fact made me want to say the course was trivialized in a certain way, and it really is (but don't let this discourage you; you will still need to implement all of the key parts, and do take your time to really understand what they do), but then I thought about it again and concluded that I most likely would have struggled to complete the course otherwise. Andrew Ng and the deeplearning.ai team had a wonderful approach to teaching this course; it kept me coming back for more, and I cannot wait to start the following courses in the specialization.

By Kevin M

Apr 10, 2020

Terrific course with 4 solid weeks of learning. The journey includes logistic regression for classification, shallow neural networks, deep neural networks, and building your own picture classification NN.

The virtual classroom lectures, quizzes, and programming assignments test your knowledge every week.

You cover NN initialization, forward propagation, the cost function, loss, backward propagation, gradient descent, and prediction using your trained model to classify pictures (in this case, cats).

The Calculus and Linear Algebra underlying the algorithms are explained in a way that builds a solid foundation without requiring deep knowledge of the math behind the activation functions (sigmoid, relu, and tanh). A good understanding of matrix math, especially matrix multiplication, is a benefit that helps you navigate the course.
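For reference, the three activation functions named above can be sketched in a few lines of NumPy (a minimal illustration, not the course's own code):

```python
import numpy as np

def sigmoid(z):
    # Squashes values into (0, 1); typical for a binary-classification output layer.
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Element-wise max(0, z); a common default for hidden layers.
    return np.maximum(0.0, z)

def tanh(z):
    # Squashes values into (-1, 1); a zero-centered alternative to sigmoid.
    return np.tanh(z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values in (0, 1), with sigmoid(0) = 0.5
print(relu(z))     # [0. 0. 2.]
print(tanh(z))     # values in (-1, 1), with tanh(0) = 0.0
```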

The programming assignments use Python with NumPy and are conducted in a Coursera-hosted Jupyter notebook. I strongly recommend beginners take a Python tutorial first, as syntax challenges can burn a lot of effort and take away from the NN learning experience. Also be mindful of stability issues that can cause erroneous results (requiring kernel restarts) and lost work due to failed auto-saves.

The volunteers that help on the message board are quite good! Thanks Paul!

Finally, Professor Andrew Ng truly knows his stuff, presents it in an understandable way, and communicates his excitement for the topic. Having taken a previous machine learning course (Stanford's Machine Learning, also offered on Coursera), I can say Professor Ng is a world-class instructor and data scientist.

Best of luck!

By Sebastian S

Dec 15, 2017

I found it very helpful, as it confirmed most of the things I had already learned by doing deep learning projects on my own, browsing additional literature on machine learning / deep learning, and doing internships where I had to apply these things. So for me personally, this course did not teach me anything new, but it organised and structured the knowledge in my head nicely by summarizing it very neatly. Also, some of the hints on implementation were helpful (like the NumPy reshaping issue with arrays of shape (n,) as opposed to (n,1)). One thing I found is that deep learning can only really be understood if the coverage of backpropagation includes the low-level derivatives and chain-rule discussions; otherwise, you don't really "understand" what's going on. I appreciate that the course (just like the original "Machine Learning" one, which was excellent) tries to reach a broad audience that does not necessarily know analysis to the extent required for backprop, but maybe it would be a nice idea to include a "mathematician's point of view" on backprop as an optional part. I found in my personal studies that looking at backprop from a pure analysis point of view helped me a lot in "demystifying" deep learning and seeing it for the optimization approach that it is. Having said that, I found the course very nicely structured, with very clear explanations and relatable applications. Thanks to Coursera and Andrew for providing this great source of knowledge for free; I really appreciate these efforts! Sebastian
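The reshaping pitfall mentioned here is easy to demonstrate (a small sketch of the issue, not course material):

```python
import numpy as np

a = np.random.randn(5)   # rank-1 array: shape (5,), neither row nor column vector
print(a.shape)           # (5,)
print(a.T.shape)         # (5,) -- transposing a rank-1 array changes nothing

b = a.reshape(5, 1)      # explicit column vector
print(b.T.shape)         # (1, 5) -- transpose now behaves as expected

# The habit this suggests: assert shapes to catch the mismatch early.
assert b.shape == (5, 1)
```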

PS: I gave it 4/5 stars, but for some reason the rating keeps getting stuck on 5.

By Ryan F

Jan 01, 2018

This was a very well-thought-out course for beginners in Neural Networks / Deep Learning. Andrew Ng sets a good pace; I was able to complete each week's lecture videos and assignments in less than 10 hours. Lectures were always clear and often went over things which would not directly be needed for assignments, but which will be useful to anyone planning to do work in this field. Andrew Ng was also very good about explaining where the mathematical equations came from, while stressing that it's not super-important to understand fully where they come from, as long as you're able to implement them.

I should add that I'm probably not the typical audience for this class --- I have an extensive math background but only just started programming a few months ago. The Python code was scaffolded and commented in such a way that even a newbie to programming can follow and complete the coursework, and I can say I've not only learned about NN/DL algorithms but also a good deal about programming in Python. One major topic that still blows me away is the speed boost we get from avoiding for loops and using vectorization instead.
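The speedup described here is easy to reproduce; a rough timing sketch (exact numbers will vary by machine):

```python
import time
import numpy as np

n = 1_000_000
a, b = np.random.randn(n), np.random.randn(n)

t0 = time.perf_counter()
dot_loop = 0.0
for i in range(n):            # explicit Python loop, one multiply-add at a time
    dot_loop += a[i] * b[i]
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
dot_vec = np.dot(a, b)        # one vectorized call into optimized C/BLAS code
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s, vectorized: {t_vec:.5f}s")  # typically orders of magnitude apart
```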

The post-assignment interview videos were also interesting. Andrew Ng would interview a guest 'powerhouse' at the end of each week, and the topics covered there often went way beyond the scope of this individual course, and gave a much more broad overview of where we are now and where we seem to be heading in the near and long term.

By Dilip R

Mar 16, 2020

This is a wonderful course. I had been reading passively for about a year on resources related to ML and DL, but never fully grasped the concepts the way Prof. Andrew explained them. The quizzes were entertaining and insightful, as were the programming examples.

I completed this 4-week course in about 2 days straight; some of the quizzes were 70/100 on my first try but got to 100/100 after 1-2 tries. On the programming assignments I got 100/100 on the first try (except for the first one, which didn't register my last 3 code answers even though I typed and ran them correctly; I had to restart the kernel, launch it again in incognito mode, and, once done, re-run all the snippets one last time just to be sure).

The hardest part of the course for me was understanding the derivatives and the overall calculus development and factorization, because I took the relevant classes a few years ago, and honestly I wasn't very good at them back then either.

One thing I would suggest is improving the audio quality, as well as editing the videos instead of providing a warning message before a video with errors, because sometimes it's hard enough to follow the course, let alone spot the error itself.

Again, I would like to thank Professor Andrew Ng and the deeplearning.ai team, as well as the Coursera platform, for providing such great real-time capabilities as the Jupyter notebook and the automatic grading system.

By Baili O

Jan 23, 2018

This is a great course which covers some popular machine learning techniques such as regression, and then moves to deep learning with neural networks via the techniques of forward and backward propagation. It is a good course for beginners, and the homework is fairly easy. However, it still leaves some unanswered questions which might be covered in future courses: how to select hyperparameters, why we choose this specific cost function, whether there are deep learning frameworks other than the neural network structure, and whether there are applications other than image recognition. In addition, for those who have some background in machine learning, the interview section, which touched on GANs, is a bonus.

There are some things that are not very enjoyable as well. For example, the notebooks cannot be downloaded, so people have to copy them down; otherwise, once the course expires, it is quite hard to get the course material. The lecture notes are badly organized: you have to download every slide deck one by one (or I just didn't find the right place to download them). Thirdly, this course didn't talk much about the techniques the industry is using. What I am trying to say is that I don't know whether the techniques in this course are applicable in industry, or whether they are too simple or too old, etc.

Overall, it is a very fun and educational course. I can't wait to jump into the next one.

By Felipe P C N

Jul 10, 2020

The course combines an accessible approach, starting from intuitive and elementary concepts, with respectable depth, carefully building up more advanced techniques. The lectures are extremely didactic and well constructed, which makes getting familiar with the concepts of neural networks (and the mathematics behind them) more natural. The assessments have both a theoretical component (quizzes, aimed mostly at checking knowledge) and a practical one (programming exercises). As positive as the practical component is, I would say it could be a little less "guided": in practice, what they ask is for the student to "fill in the blanks" in a program already (very well) structured, and practically completed, by the instructors. I felt the need to supplement these exercises with attempts of my own, programming the algorithms from scratch, to truly make sure I can implement what I learned. This step is noticeably harder than the course's programs (after all, implementing a project from scratch is inherently challenging). Even so, I see this kind of "initiative", taking the material learned into some application or practice beyond what the course asks, as something that should ideally be done when studying any subject.

Excellent course.

By Zhenwei Z

Feb 28, 2020

This course goes from the basics to the advanced, leading us to understand deep learning. From the initial logistic regression, to the shallow neural network, and finally to the deep neural network, we gradually learned the neural network representation and calculation process, and finally implemented a binary classifier for cat image recognition. The course is very clear and logical, eliminating tedious mathematical derivation while still allowing us to understand all the mathematical details, including calculation and vectorization. The assignments are done step by step, starting from basic functions and gradually encapsulating them until they constitute a complete neural network, which enables students to develop a deep understanding of neural networks and master the knowledge through practice. It is worth mentioning that the course's difficulty gradient is reasonable: details that are hard to understand in an earlier lesson do not need to be understood all at once. Later lessons repeatedly deepen your understanding of the earlier material and, thanks to the foreshadowing of other knowledge points, provide a more complete and comprehensive supplement to it. Looking forward to series two.

By Xiao G

Oct 07, 2017

This is my first completed course on Coursera.org!!! And I earned a certificate with nearly full marks! I was very bad at coding... although I'm still not good at it, this course convinced me that even someone poor at coding like me can finish it and build a neural network! It truly boosted my confidence! Thanks, Andrew Ng.

One thing to note for this course... maybe it can improve on this front later. I felt quite easy and comfortable with all the mathematical deductions... however, when I coded backward propagation in week 4, I nearly got lost. I spent the whole afternoon and night solving it and then finished assignment 2 of week 4 (from 2 pm till 10:03 pm). I bet many people get stuck in that area; it's really easy to get lost and puzzled there. I guess there may be some points that could be made easier to figure out. Just a little advice.

Anyway, I love this course. It's a trigger for my coding journey; though I'm a physics student now in electrical engineering, I still feel very comfortable with this course.

Thanks, Andrew Ng!!! I don't know how to express my gratitude to you! If I had not taken this course, I might never have attempted Deep Learning, such a complex and advanced thing.

I will continue to study~~ see you

By Luca C

Jan 26, 2019

Pros: + You will understand clearly how things work and why they work

+ Provides mathematical insights for those who are interested (really a big plus with respect to other courses)

+ Overall a simple introduction to Neural Nets. Even those who already have experience can benefit from this brush up (even at a fast pace: you can complete it in 2-4 days)

Cons: - Since it is quite basic material, those who are already accustomed to NN might want to jump to the second course of the specialization.

- You learn Python by doing, but you will not get a deep understanding of Python. I would suggest getting a little familiar with it in other ways (however, this is not a requirement to master the course).

A clear, quick overview of the basics of Neural Networks. It even provides some mathematical justification, though fully understanding it is really not a requirement for successfully completing the course. However, IMHO it is always good to have at least some insight into the mathematics behind it.

This course sets well the basics, but to be really able to work on your own projects I think it is a must to take the second course of this specialization.

By Amit R B

Nov 27, 2019

This course is truly deserving of its high ratings. Prof. Andrew Ng's extensive breakdown of the structure and function of neural networks is unparalleled. For me personally this course has been of great help. The theory lectures made me understand just how these networks "learn". This course is a great beginning and, I think, prepares the student well to learn more in-depth and advanced concepts of deep learning.

However, if you are looking to get hands-on experience building and training deep learning models, I would recommend checking out some free resources on YouTube for the Keras framework. I played around with Keras following the YouTube channel Sentdex's Keras tutorials, then took this course to get a more mathematical and theoretical understanding. Some students might find themselves a bit unprepared for the coding exercises, since the lectures are more focused on theory and math, showing little to no code. This is why I think this is a great (if not the best) second course, but maybe not as helpful as a first introduction.

For a free first introduction, check out the channel 3blue1brown's videos on Neural Networks to get your feet wet, before diving further deep. ;P

By charles

Jun 09, 2020

I see that people are complaining in the top reviews, saying the course is bad because there aren't pauses for you to think, or because the code is relatively simple to write.

What I want to say is that this is, in general, how you learn in school, and this is the exact style in which school courses are taught. The professor isn't going to pause here and there for you unless you raise your hand to ask questions. In my machine learning class there wasn't even a quiz to check your understanding. The only thing missing in this course is a PROJECT, which is something an online course can't easily provide; there is no way the tutors could mark everyone's projects.

As for the complaints about the lab assignments: this is exactly the type of lab assignment you get in school, except that in school the variables aren't even given (but that is because tutors are within range and you can ask them any time you want; that isn't the case for online learning, so they have to make it doable for everyone).

For something that is free, I can't believe people are complaining about the quality; this is on par with my school's standard, and my school isn't bad at all.

By Randall S

Oct 05, 2017

Dr. Andrew Ng is brilliant and it is so amazing to have access to this type of knowledge for less than I spend on Starbucks in two to three weeks. I am taking some online courses at a big name university (to the tune of $4,000 per course), and for the money, this is a real bargain and just as good if not better!

The thing I liked most about this particular course is that it showed us what's happening under the hood; it is not just a course on how to use tools, nor is it all theory. Dr. Ng also introduced us to Geoffrey Hinton, a pioneer of backpropagation, which was worth the price of admission alone.

That said, it was not so tough that I couldn't keep up. I would say that having some exposure to calculus helps, but it is not required. Also, it helps to be more than just familiar with Python; but if you can spend a few extra hours per week on the course, you can work your way through it with just a basic familiarity.

It has challenged me to keep going to the next level and complete the specialization -- AI is not rocket science -- at least not at the level of applying this knowledge. Being at Dr. Ng's level might be a different story.

Highly recommended!

By Anand R

Jan 29, 2018

To set the context, I have a PhD in Computer Engineering from the University of Texas at Austin. I am a working professional (13+ years), but just getting into the field of ML and AI.

I completed Dr. Ng's course on Machine Learning on Coursera first. I recommend that students of this course should first complete that course (or an equivalent one). This course was an excellent review of the basic concepts of Neural Networks. The lectures were well presented and the maths/equations were explained intuitively. The problem solving assignments were in Python (as opposed to Matlab). As before, Dr. Ng walked us through the assignments: hand-holding us through the solution. The quizzes were fairly challenging and helped me reinforce the concepts quite well.

I wish there were a few open problems (Kaggle style) at the end of the course so that students could compete with each other; that would be a good addition. I would appreciate more real-world examples throughout the course as well.

I look forward to completing the remaining courses! Thank you, Dr. Ng. Thank you, teaching assistants. Thank you, Coursera. This is truly a wonderful course.

By MD A

•Jul 18, 2019

Thorough and simple explanations that help internalize the deep learning concepts. The video lectures are very helpful; listen more than once to clarify concepts. The jupyter notebook exercises, with solutions, provide very useful knowledge reinforcement. The vectorized form of the deep learning neural network equations enables clutter-free and faster, scalable solutions. Before taking the course, refresh your knowledge of linear algebra, especially basic matrix operations such as matrix size and transpose, and their implementation in Python via numpy, such as numpy.dot for matrix multiplication and numpy.multiply for element-wise multiplication. Also make sure you are familiar with Python's key:value dictionary data structure and retrieval of values via keys. This knowledge will build confidence to code the functions and methods for forward propagation, back propagation, and gradient descent to update weights and biases. Finally, pay some attention to how indices in square brackets are used to identify matrices for inputs, outputs, parameters (weights and biases), activation values/models, the various layers of a neural network, and the nodes in a particular layer (all explained well in the lectures).
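
The review's preparation advice can be rehearsed in a few lines of numpy. This is a minimal sketch with arbitrary shapes and values of my own choosing, not code from the course, showing numpy.dot versus numpy.multiply and retrieving parameters from a dictionary by key:

```python
import numpy as np

# Weight matrix for a layer with 3 units fed by 2 inputs: shape (3, 2)
W = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
# A batch of 4 training examples, one example per column: shape (2, 4)
X = np.ones((2, 4))
# One bias per unit: shape (3, 1)
b = np.zeros((3, 1))

# Matrix multiplication: (3, 2) dot (2, 4) -> (3, 4); b broadcasts to every column
Z = np.dot(W, X) + b

# Element-wise multiplication needs matching (or broadcastable) shapes
A = np.multiply(Z, Z)   # squares each entry of Z

# Parameters are commonly kept in a dict and retrieved by key
params = {"W1": W, "b1": b}
print(Z.shape, A.shape, params["W1"].shape)   # (3, 4) (3, 4) (3, 2)
```

The same dictionary pattern ("W1", "b1", "W2", ...) is what makes it easy to loop over the layers of a deeper network later.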

By Alexander M

•Oct 15, 2018

I've been impossibly busy and at first thought this was something I could play in the background while I did other work. It quickly became apparent that data I had been used to, with M.shape = (user/observation/etc, feature), was now transposed. A few simple examples on paper convinced me why this is a superior notation for D/RNN architectures given numpy notation. I also at first thought that the bias should be folded into W, X for greater expressiveness of the relationship y = g(WX) and for the backprop updates that require 'estimating' W.T*g^-1(y) and g^-1(y)*X.T (where y is understood as the general activation after layer l and X is the general output of the previous layer), but now I see why separating the bias is useful: it estimates the 'scale' of all the data at the output layer at once (estimating the imbalance in the marginal distribution, for example), whereas the other gradients come from estimating the perturbative deformation in the input layer, so they are slightly different from the perspective of forward/backward distributional learning. Bravo, and thank you!
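
The shape convention the reviewer describes can be made concrete with a small numpy sketch. All sizes and names here are my own illustration, not from the course: the course's (features, examples) layout lets a layer's forward step be written without transposes, while the (examples, features) layout the reviewer was used to needs them:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n_x, n_h = 5, 3, 4   # examples, input features, hidden units

# Course-style layout: rows are features, columns are examples
X = rng.standard_normal((n_x, m))      # (n_x, m)
W = rng.standard_normal((n_h, n_x))    # (n_h, n_x)
b = rng.standard_normal((n_h, 1))      # one bias per unit, broadcast over examples

Z = W @ X + b                          # (n_h, m): each column is one example

# Equivalent "rows are examples" layout requires transposing both operands
X_rows = X.T                           # (m, n_x)
Z_rows = X_rows @ W.T + b.T            # (m, n_h)

print(np.allclose(Z_rows, Z.T))        # True: same numbers, transposed layout
```

Keeping the bias as a separate (n_h, 1) column, rather than folding it into W, is exactly what lets numpy broadcasting add it across all m example columns in one step.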

By Mahesh G

•Aug 29, 2017

Thanks for the course. The background maths behind neural networks is very neatly explained, and the course helps you understand, step by step, what happens within the network. The step-by-step procedure the Professor explains is great, and he has repeatedly stressed the important steps to make them clear. Along with explaining the formulas, the assignments help in implementing them step by step and assembling the whole thing into a neural network model, which is a great way to learn. One of the important things covered at the beginning of the course is vectorization and python broadcasting, which are key for neural networks.
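
The vectorization and broadcasting the review singles out can be seen in a tiny sketch (the values and the normalization task are my own example, not taken from the course): a single vectorized expression replaces an explicit loop over examples, with broadcasting stretching a (1, m) row across all rows of a matrix.

```python
import numpy as np

# A (3, 4) matrix of non-negative scores: 3 classes, 4 examples (one per column)
Z = np.array([[1.0, 2.0, 3.0, 4.0],
              [1.0, 0.0, 1.0, 0.0],
              [2.0, 2.0, 2.0, 2.0]])

# Vectorized column-wise normalization via broadcasting:
# col_sums has shape (1, 4) and is stretched across the 3 rows during the divide.
col_sums = Z.sum(axis=0, keepdims=True)
P = Z / col_sums

print(P.sum(axis=0))   # each column now sums to 1
```

No Python loop over the 4 examples is needed; that is the whole point of the vectorized style the course teaches.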

The pace at which the Professor explains the concepts is good and easy to follow, and the structure of the course is well laid out, which helps beginners.

One thing that could be better is the assignments: the current ones are definitely helpful for beginners like me, but there could be some more assignments that increase in complexity (maybe those appear in subsequent courses).

Overall, a very good course that helped me a lot.

By Krishna k N

•May 18, 2019

I admire Professor Andrew Ng's patience in helping the students take baby steps, painting a big picture from each small pixel, much as a neural network itself is built.

This course has given me great exposure to how neural networks work, although I realize I need to take a Python course to type code more freely and easily.

I'm going to do that next and then come back to the remaining courses in this specialization.

Feedback: it's really hard to visualize some of these matrices and their dimensions in a large neural network with so many parameters, such as n_x features, m training examples, n iterations, and L layers with (nL, nL-1) weights and (nL, 1) biases. I understand it's hard to show these matrices in writing as they are very large. I wish someone would develop a more animated way of illustrating these matrices that would make the intuition stronger. For example, calculating forward_activation for all layers and all neurons across these layers by just passing X and the parameters is a massive operation, and intuition stumbles purely because of the scale of such a matrix operation.
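
Short of animation, one way to keep the reviewer's (nL, nL-1) weight and (nL, 1) bias shapes straight is to assert them layer by layer in code. This is a minimal sketch with hypothetical layer sizes of my own choosing, not the course's assignment code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: n_x input features, then two hidden layers and one output unit
layer_dims = [5, 4, 3, 1]   # [n_x, n1, n2, n3]
m = 10                      # training examples

X = rng.standard_normal((layer_dims[0], m))

# W[l] has shape (n_l, n_{l-1}); b[l] has shape (n_l, 1)
params = {}
for l in range(1, len(layer_dims)):
    params["W" + str(l)] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
    params["b" + str(l)] = np.zeros((layer_dims[l], 1))

# Forward pass: ReLU on hidden layers, sigmoid on the output layer
A = X
for l in range(1, len(layer_dims)):
    Z = params["W" + str(l)] @ A + params["b" + str(l)]   # (n_l, m)
    if l < len(layer_dims) - 1:
        A = np.maximum(0, Z)            # ReLU
    else:
        A = 1 / (1 + np.exp(-Z))        # sigmoid
    # Every layer's activations keep the (n_l, m) shape pattern
    assert A.shape == (layer_dims[l], m)

print(A.shape)   # (1, 10): one prediction per example
```

Each shape assertion is exactly the (nL, nL-1) times (nL-1, m) bookkeeping the review finds hard to visualize; printing or asserting shapes inside the loop is a cheap substitute for an animated view.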