
Learner reviews and feedback for Bayesian Methods for Machine Learning, offered by National Research University Higher School of Economics

495 ratings
138 reviews


People apply Bayesian methods in many areas, from game development to drug discovery. They give superpowers to many machine learning algorithms: handling missing data, extracting much more information from small datasets. Bayesian methods also allow us to estimate uncertainty in predictions, which is a desirable feature for fields like medicine. When applied to deep learning, Bayesian methods allow you to compress your models a hundredfold and automatically tune hyperparameters, saving you time and money. In six weeks we will discuss the basics of Bayesian methods: from how to define a probabilistic model to how to make predictions from it. We will see how one can automate this workflow and how to speed it up using some advanced techniques. We will also see applications of Bayesian methods to deep learning and how to generate new images with them. We will see how new drugs that cure severe diseases can be found with Bayesian methods.





Reviews 1 - 25 of 132 for Bayesian Methods for Machine Learning

By Daniel

Sep 14, 2018

The topic covered is great, but the course could be improved. I understand that it can be difficult for a non-native speaker to lecture in English, but that doesn't help with understanding a rather technical course. Besides, the formulas are given as is, with little intuitive explanation. An example to follow is A. Ng's ML/AI course, which strikes a good tradeoff between rigour and intuition. Plus, I had to purchase other offline material, "Pattern Recognition and Machine Learning" by C. Bishop (which is excellent), to better understand many concepts.

Generally, proper reading material of a couple of pages per lesson should be given. Neither slides nor audio transcripts, which are less rigorous, are enough to cover such difficult and technical topics.

Also, the peer review is cumbersome; for me it doesn't add value and it slows down the certification process. Automatic grading or AI grading would be great!

By Adam C

May 15, 2018

Good attempt, but rough around the edges. The instruction doesn't cover all of the content in the quizzes. There are "tricks" in the quizzes, and the answers are non-obvious at times, or there are caveats unknown to you; but you get the answers once you fail and read the reasoning. Unfortunately, the notation is a little sloppy and inconsistent at times throughout the lectures, and the examples could be carried further. This is a senior undergraduate or graduate-level course, and without accompanying reading material you have to take a lot of notes during the lectures, pausing the video often. If you're new to this material, the time spent on this course is much greater than on other Coursera courses due to its high level. I have a PhD in physics, so I have the mathematical capabilities, but I'm relatively new to Bayesian statistics. This course seems to cover material from Bishop's "Pattern Recognition and Machine Learning" text.

By Karishma D

Mar 25, 2019

Lots of maths! :). Assignments were very interesting as well.

But overall, this has been my favourite course so far. I like how in depth the lectures went into the maths (made me feel like I was back at uni). However, if I did not have a maths + stats background (from university), I think I would have struggled to keep up with the content.

A couple of comments, though:

1) For the MCMC week, it would have helped my understanding if we had to fit a Bayesian model to a dataset from scratch via our own implementation of Metropolis-Hastings, for example, in addition to using the pymc3 library.

2) For the Gaussian Processes week, it would have helped my understanding if we had to fit a GP to some data via our own implementation in addition to using the GPy library.
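The kind of from-scratch exercise this reviewer asks for is quite compact. As a minimal sketch (not part of the course materials; the toy model and tolerances are my own assumptions), a random-walk Metropolis-Hastings sampler for a one-dimensional posterior can be written in plain NumPy:

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings for a 1-D log-posterior."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if np.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal
        samples[i] = x
    return samples

# Toy model: N(0, 1) prior on the mean mu, N(mu, 1) likelihood.
data = np.array([1.2, 0.8, 1.5, 0.9, 1.1])

def log_post(mu):
    log_prior = -0.5 * mu**2                   # N(0, 1) prior
    log_lik = -0.5 * np.sum((data - mu)**2)    # N(mu, 1) likelihood
    return log_prior + log_lik

samples = metropolis_hastings(log_post, x0=0.0)
# After burn-in, the sample mean should approach the analytic
# posterior mean sum(data) / (n + 1) ~ 0.92 for this conjugate model.
print(samples[1000:].mean())
```

This conjugate Gaussian setup is convenient precisely because the analytic posterior is known, so a hand-rolled sampler can be checked against it before trusting it on a real dataset.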

By Jayaganesh G

Nov 18, 2017

This course is a little difficult, but I found it very helpful.

Also, I didn't find a better course on Bayesian methods anywhere on the net, so I will recommend this to anyone who wants to dive into Bayesian methods.

By Peter K

Sep 25, 2018

This course was really good: it started from easy things for beginners and ended with an awesome application of Bayesian neural networks. Since I have a master's in probability and statistics I was familiar with most of the material, and I must thank you for the mathematics and some of the proofs. It's hard to find such nice math proofs in today's courses, so it is good for non-mathematicians to see the science behind these methods.

Most of the lectures were quite good, and for a beginner who is willing to study a lot on their own, it is good. But I must say that some quizzes had questions whose answers you couldn't find in the lectures. I recommend adding some more reading material, mainly for beginners.

By Samuel Y

Mar 26, 2018

Many more theoretical formulas and derivations than in the previous courses of the specialization, which might require quite a bit of probability theory knowledge. But it is really helpful for understanding EM and VAE in depth, as well as for using the GPy/GPyOpt tools in practice. It would be better to have detailed explanations for some quizzes.

By Mark Z

Jun 04, 2019

This course teaches you a lot of useful math. It might be hard to understand at times, but you will get through it. The assignments are good for getting to know the Python tools that implement the mathematical concepts described in the lectures. Overall the best course I've taken so far.

By Luke B

Jun 07, 2019

Excellent course! The perfect balance of clear and relevant material and challenging but reasonable exercises. My only critique would be that one of the lecturers sounds very sleepy.

By Vaibhav O

Apr 03, 2019

Great introduction to Bayesian methods, with quite good hands on assignments. This course will definitely be the first step towards a rigorous study of the field.

By Yu Z

Mar 30, 2018

Clear instruction and great insights into the algorithms; I love it. I really regret not having the time to finish all the programming assignments.

By Radosław B

Dec 31, 2018

Great mix of theory and practice, without the unnecessary tutorial-like stuff everyone can look up in their search engine of choice.

By Zixu Z

Dec 02, 2018

Course content is excellent. However I hope it could have had more about MCMC. That part was pretty thin.

By Wei X

Aug 27, 2018

I appreciate the balance between introducing Bayesian statistics and applying it to machine learning.

By Yanting H

Sep 18, 2018

A very detailed course for someone who wants to strengthen their statistical background.

By Голубев К О

Oct 19, 2018

Great course with fine lecturers and deep immersion in Bayesian methods

By Dongxiao Z

Oct 11, 2018

Learned a lot from this course. Thanks!

By Alexander R

Nov 12, 2018

super helpful and very applicable!

By Anmol G

Dec 06, 2018

One of the best in-depth courses.

By SagarSrinivas

Sep 29, 2018

Awesome. Worth it!

By Max P Z

Apr 02, 2018

Tough but useful!

By Ertan T

Apr 26, 2018

Superb Course

By Daniel T

Aug 06, 2019

The material is good and a lot of effort went into designing this course. Nonetheless, it feels neglected and could use an update.

The presentations are somewhat muddled by notational abuse. It's customary to shorthand every distribution as "p" and let the arguments remind you which variable it came from; e.g., p(x|y) is the conditional density of the variable X at x given that Y = y. But then "p(a|b)" could be a completely different function, corresponding to random variables A and B; moreover, a = x and b = y could be vectors, which amplifies the confusion. And when many variables with different ranges are involved and there's no consistency between the labels for the variables and the labels for their values, one has to spend extra time deciphering the material. Keeping track of the random variables and adopting more suggestive notation would go a long way. Also, in a Bayesian context it helps to avoid the word "parameter" (other than hyperparameter, maybe); e.g., the weights w themselves are just values of a random variable, no different from the data-generating process or the latent variables.

The programming assignments contain a lot of missing or inconsistent instructions. Be prepared to sift through the forums to find what is really expected or how to fix the issues in the supplied code.

Overall, I get the impression the course is now maintained by the students. It would be nice to see a revision from the instructors.

By Ehsan M K

Nov 25, 2017

In terms of the quality of the material, this is one of the best courses I've taken on Coursera! Bear in mind that it is an advanced course and the requirements are high, so if your math skills are at graduate-student level, you can benefit from this course. The topics are very important and applicable. I really liked all the explicit and detailed calculations done step by step, though I can guess many would find them boring.

However, in terms of TA support and assignment design, it's one of the worst courses I've seen on Coursera! The instructors and TAs barely respond, given the few registrations in this release. The assignments are missing a lot of things and become increasingly frustrating to work on!

By Jean M A S

Jun 03, 2018

Well, this course is really good: very demanding and rigorous. The main disadvantage is the forum; chances are that nobody will answer your questions, so be prepared for a raw learning experience. But if you are serious, you will eventually finish the course and learn a lot.

By Maciej

Mar 24, 2019

Overall it's good. My problem is that most of this material is better suited to lecture notes than to video; they're forcing it into video since it's Coursera. I couldn't get through a lot of the lectures and used a textbook instead.