By Rohit D•
Bayesian Statistics: Mixture Models (BS3 for short)
As of June 2020, BS3 is a new class; it appears to have come to Coursera circa April 2020.
The class creators (Prof. Abel Rodriguez and others) have done an excellent job of pulling together the requisite theory (video lectures) and practice (assignments in R).
For most people, including those with a modest amount of training in statistics or computer science, this class will feel like an advanced class. To reasonably comprehend the material, one needs to be familiar with Monte Carlo simulation (specifically Gibbs sampling) and a broad spectrum of probability distributions (Poisson, Beta, Gamma, Inverse-Gamma, Log-Normal, Dirichlet) used in Bayesian statistics. The first two Bayesian Statistics classes cover most of these prerequisites well.
BS3 delves into two ways of estimating mixtures, namely Expectation-Maximization (EM) and Gibbs sampling, and compares results from the two approaches. BS3 does not stop at a Gaussian mixture of two univariate distributions. Through its assignments, this class motivates the need for other mixture models such as the zero-inflated Poisson distribution, a mixture of exponential and log-normal distributions, and a mixture of multivariate Gaussian distributions.
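To give a flavor of the EM approach the review mentions, here is a minimal, self-contained sketch of EM for a two-component univariate Gaussian mixture. This is not course code (the class uses R); the function name, initialization, and tolerances are illustrative choices.

```python
import math
import random

def em_gmm(x, iters=200):
    """Illustrative EM for a two-component univariate Gaussian mixture."""
    xs = sorted(x)
    n = len(xs)
    # Crude initialization: means at the quartiles, shared overall variance.
    mu = [xs[n // 4], xs[3 * n // 4]]
    m = sum(x) / n
    var = [max(1e-6, sum((v - m) ** 2 for v in x) / n)] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for v in x:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(v - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: responsibility-weighted maximum likelihood updates.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / n
            mu[k] = sum(r[k] * v for r, v in zip(resp, x)) / nk
            var[k] = max(1e-6, sum(r[k] * (v - mu[k]) ** 2
                                   for r, v in zip(resp, x)) / nk)
    return w, mu, var

# Simulated data from two well-separated components.
rng = random.Random(1)
data = [rng.gauss(0, 1) for _ in range(300)] + [rng.gauss(5, 1) for _ in range(300)]
w, mu, var = em_gmm(data)
```

On well-separated simulated data like this, the recovered means land near 0 and 5 and the weights near 0.5 each; the Gibbs-sampling alternative covered in the course targets the same posterior quantities via simulation rather than point estimation.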
Some assignments require manipulating hierarchical probability distributions with multiple techniques simultaneously - maximum likelihood estimation, identifying conjugate priors, and simulation. Since the manipulations are coded in R and must produce a numerical result, there is little tolerance for typos and algebraic errors.
The class organizers chose to have the graded assignments (six in all) peer-reviewed. The peer-review requirement can feel like a constraint for a class that is relatively new and advanced, and thus has low enrollment.
It took me ~60 hours to complete this class over approximately two weeks. Ideally, I would have preferred to spread the course over the recommended five weeks; life constraints dictated otherwise. Even so, the effort is well worth it. I am walking away with a much better appreciation of Bayesian statistics in general and mixture models in particular.
By zj s•
This seems to be a new course, opened only in recent months; compared with other courses on Coursera, it has not attracted much attention yet. The course mainly introduces (Bayesian) generative mixture models and methods of parameter inference (EM/MCMC; variational inference is not covered). Strictly speaking, it is a small branch of machine learning.
This is a course that combines principles and practice. It mainly uses Gaussian mixture and zero-inflated mixture models as examples to explain the principles and derivations, supplemented by demonstration code to aid understanding. There are also corresponding homework assignments to help learners better grasp the related concepts. If the homework is done carefully, most of the knowledge points in this course should be mastered.
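For readers unfamiliar with the zero-inflated mixture example this review refers to, here is a small illustrative sketch (not course code; the function name and parameter values are made up for the example). A zero-inflated Poisson mixes a point mass at zero with an ordinary Poisson, producing more zeros than Poisson(λ) alone would.

```python
import math
import random

def sample_zip(pi, lam, n, seed=0):
    """Draw n samples from a zero-inflated Poisson: with probability pi emit a
    structural zero, otherwise draw from Poisson(lam)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if rng.random() < pi:
            out.append(0)  # structural zero from the point-mass component
        else:
            # Knuth's multiplication method for Poisson sampling.
            L, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    break
                k += 1
            out.append(k)
    return out

samples = sample_zip(pi=0.3, lam=4.0, n=5000, seed=1)
zero_frac = samples.count(0) / len(samples)
mean = sum(samples) / len(samples)
```

The overall zero probability is pi + (1 - pi) * exp(-lam) and the mean is (1 - pi) * lam, which is why a plain Poisson fit to such data underestimates the zero count - the motivation for treating it as a mixture.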
The difficulty of the course is moderate for me. According to Coursera, it is advanced level, and the official estimate is about 21 hours. Some prior knowledge is, of course, required.
By Rajendra A•
Excellent course material, video lectures, and programming assignments. I learnt EM and MCMC. I initially thought the programming assignments would be difficult, but after following the videos and instructions, I started gaining confidence. I highly recommend this course.
By Marcelo B•
Good course; it goes deep whenever needed. Great lectures. The exercises follow the lectures closely. I have enjoyed the course and would recommend it to anybody who wants to fill in some gaps in Bayesian statistics.
By Suraj M•
I learned a lot about Bayesian mixture models, expectation maximization, and MCMC algorithms, and their use in classification and clustering problems. I highly recommend this course.
By Rick S•
Just enough theory and practice. Great class.
By Dziem N•
This course is the best of the series from UC Santa Cruz. The lecturer explains the rather complicated concepts with clarity. I find the examples really helpful for further grasping the concepts.
By Chow K M•
Definitely quite mathematical in nature. A good way to learn about the expectation-maximisation algorithm.
By Rahul S•
Very good course.