Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems.
Stanford University, officially Leland Stanford Junior University, is a private American research university on an 8,180-acre (3,310 ha) campus in Stanford, California, near Palo Alto.
Top reviews from PROBABILISTIC GRAPHICAL MODELS 3: LEARNING
Very good course for learning PGMs and the machine learning concepts behind them. However, some of the descriptions in the final-exam quiz are unclear, which leads to a bit of confusion.
Great course! Very informative course videos and challenging yet rewarding programming assignments. I hope the mentors can respond to questions in a more timely manner.
Great course, especially the programming assignments. Textbook is pretty much necessary for some quizzes, definitely for the final one.
Great course! It is pretty difficult - be prepared to study. Leave plenty of time before the final exam.
About the Probabilistic Graphical Models Specialization
Learning Outcomes: By the end of this course, you will be able to
Compute the sufficient statistics of a data set that are necessary for learning a PGM from data
Implement both maximum likelihood and Bayesian parameter estimation for Bayesian networks
Implement maximum likelihood and MAP parameter estimation for Markov networks
Formulate a structure learning problem as a combinatorial optimization task over a space of network structures, and evaluate which scoring function is appropriate for a given situation
Utilize PGM inference algorithms in ways that support more effective parameter estimation for PGMs
Implement the Expectation Maximization (EM) algorithm for Bayesian networks
Honors track learners will get hands-on experience implementing both EM and structure learning for tree-structured networks, and will apply them to real-world tasks
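To make the first two outcomes concrete, here is a minimal, hedged sketch of parameter estimation for one discrete CPD in a Bayesian network: the sufficient statistics are simply the counts of each (child value, parent assignment) pair, from which both the maximum likelihood estimate and a Bayesian estimate (Dirichlet prior, expressed as a pseudocount `alpha`) follow directly. The variable names and the tiny `Season`/`Rain` dataset are invented for illustration and are not from the course materials.

```python
from collections import Counter

def sufficient_stats(data, child, parents):
    """Sufficient statistics for a discrete CPD: counts N[x, u] of each
    (child value x, parent assignment u) pair in the data."""
    counts = Counter()
    for row in data:
        u = tuple(row[p] for p in parents)
        counts[(row[child], u)] += 1
    return counts

def estimate_cpd(counts, child_vals, parent_assignments, alpha=0.0):
    """Estimate P(child | parents) from counts.
    alpha = 0 gives the maximum likelihood estimate;
    alpha > 0 gives the Bayesian posterior mean under a Dirichlet prior
    with pseudocount alpha per child value."""
    cpd = {}
    for u in parent_assignments:
        total = sum(counts[(x, u)] for x in child_vals) + alpha * len(child_vals)
        for x in child_vals:
            # Fall back to a uniform distribution for unseen parent assignments.
            cpd[(x, u)] = ((counts[(x, u)] + alpha) / total
                           if total > 0 else 1.0 / len(child_vals))
    return cpd

# Tiny invented dataset for P(Rain | Season).
data = [
    {"Season": "wet", "Rain": 1},
    {"Season": "wet", "Rain": 1},
    {"Season": "wet", "Rain": 0},
    {"Season": "dry", "Rain": 0},
]
counts = sufficient_stats(data, "Rain", ["Season"])
mle = estimate_cpd(counts, [0, 1], [("wet",), ("dry",)])
bayes = estimate_cpd(counts, [0, 1], [("wet",), ("dry",)], alpha=1.0)
print(mle[(1, ("wet",))])    # MLE: 2/3
print(bayes[(1, ("wet",))])  # Bayesian: (2 + 1) / (3 + 2) = 0.6
```

The same count-then-normalize pattern extends per CPD across a full network; the Bayesian pseudocount is what keeps estimates sensible when some parent assignments are rarely observed.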