Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many more. They are also a foundational tool in formulating many machine learning problems.
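To get a rough sense of why factored representations matter at this scale, here is a back-of-the-envelope parameter count (the variable counts are illustrative, not from the course): a single joint table over n binary variables needs 2^n - 1 free parameters, while a factored model in which each variable has at most k parents needs at most n * 2^k.

```python
# Parameter counting for a joint distribution over binary variables
# (illustrative numbers, not from the course).
n, k = 30, 3                 # 30 variables, at most 3 parents each
full_table = 2**n - 1        # free parameters in one big joint table
factored = n * 2**k          # upper bound for a factored (PGM) representation
print(full_table)            # 1073741823
print(factored)              # 240
```

The exponential gap between the two numbers is exactly what graphical models are designed to exploit.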


## About this Course


#### Shareable Certificate

#### 100% online

#### Course 1 of 3

#### Flexible deadlines

#### Advanced level

#### Approx. 30 hours to complete

#### English

### Offered by

#### Stanford University

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto.

## Syllabus - What you will learn from this course

**1 hour to complete**

## Introduction and Overview

This module provides an overall introduction to probabilistic graphical models, and defines a few of the key concepts that will be used later in the course.


**1 practice exercise**

**10 hours to complete**

## Bayesian Network (Directed Models)

In this module, we define the Bayesian network representation and its semantics. We also analyze the relationship between the graph structure and the independence properties of a distribution represented over that graph. Finally, we give some practical tips on how to model a real-world situation as a Bayesian network.


**15 videos**

**6 readings**

**3 practice exercises**
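The relationship between graph structure and independence can be made concrete with the classic v-structure D → G ← I, checked here by brute-force enumeration (all variables and numbers below are invented for illustration): Difficulty and Intelligence are marginally independent, but observing the Grade couples them, the "explaining away" pattern.

```python
import itertools

# Toy v-structure D -> G <- I with made-up numbers.
P_D = {0: 0.6, 1: 0.4}                      # P(D)
P_I = {0: 0.7, 1: 0.3}                      # P(I)
P_G1 = {(0, 0): 0.3, (0, 1): 0.9,           # P(G=1 | D, I)
        (1, 0): 0.05, (1, 1): 0.5}

def joint(d, i, g):
    pg = P_G1[(d, i)] if g == 1 else 1 - P_G1[(d, i)]
    return P_D[d] * P_I[i] * pg

def prob(query, evidence):
    """P(query | evidence) by brute-force enumeration over (d, i, g)."""
    def total(constraints):
        return sum(joint(d, i, g)
                   for d, i, g in itertools.product((0, 1), repeat=3)
                   if all({"d": d, "i": i, "g": g}[k] == v
                          for k, v in constraints.items()))
    return total({**evidence, **query}) / total(evidence)

print(prob({"i": 1}, {}))                # 0.3
print(prob({"i": 1}, {"d": 1}))          # still 0.3: D alone is uninformative
print(prob({"i": 1}, {"g": 1}))          # shifts once G is observed
print(prob({"i": 1}, {"g": 1, "d": 1}))  # shifts again given D: now coupled
```

Enumeration is exponential and only viable for toy models like this one; the course's inference machinery exists precisely to avoid it.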

**1 hour to complete**

## Template Models for Bayesian Networks

In many cases, we need to model distributions that have a recurring structure. In this module, we describe representations for two such situations. One is temporal scenarios, where we want to model a probabilistic structure that holds constant over time; here, we use Hidden Markov Models, or, more generally, Dynamic Bayesian Networks. The other is aimed at scenarios that involve multiple similar entities, each of whose properties is governed by a similar model; here, we use Plate Models.


**4 videos**

**1 practice exercise**
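The temporal case can be sketched in a few lines with the forward pass of a Hidden Markov Model, which computes the likelihood of an observation sequence by summing out the hidden state chain (states, observations, and all probabilities below are invented for illustration):

```python
import itertools

# Minimal HMM forward algorithm (all numbers illustrative).
# Hidden states: 0 = rainy, 1 = sunny; observations: 0 = umbrella, 1 = none.
init  = [0.5, 0.5]                     # P(X_1)
trans = [[0.7, 0.3], [0.3, 0.7]]       # P(X_t = col | X_{t-1} = row)
emit  = [[0.9, 0.1], [0.2, 0.8]]       # P(O_t = col | X_t = row)

def forward(obs):
    """Likelihood P(obs) via the forward recursion."""
    alpha = [init[s] * emit[s][obs[0]] for s in (0, 1)]
    for o in obs[1:]:
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in (0, 1)) * emit[s][o]
                 for s in (0, 1)]
    return sum(alpha)

# Sanity check: likelihoods over all length-3 sequences must sum to 1.
total = sum(forward(o) for o in itertools.product((0, 1), repeat=3))
print(total)
```

The same two conditional distributions (transition and emission) are reused at every time step, which is the "recurring structure" that template models capture.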

**11 hours to complete**

## Structured CPDs for Bayesian Networks

A table-based representation of a CPD in a Bayesian network has a size that grows exponentially in the number of parents. There are a variety of other forms of CPDs that exploit some type of structure in the dependency model to allow for a much more compact representation. Here we describe a number of those most commonly used in practice.


**4 videos**

**2 practice exercises**
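One of these structured forms, a tree-structured CPD, can be sketched as a decision tree over parent values (the variables and probabilities here are made up): once A = 0, the child ignores B and C entirely, so 4 parameters cover what a full table stores in 2^3 = 8 rows.

```python
import itertools

# Tree-structured CPD for P(Y = 1 | A, B, C) with binary parents
# (made-up numbers). Context-specific independence: given A = 0,
# Y is independent of B and C.
def p_y1(a, b, c):
    if a == 0:
        return 0.1                       # leaf: B and C are irrelevant here
    if b == 1:
        return 0.8                       # leaf
    return 0.6 if c == 1 else 0.3        # leaves splitting on C

# Materialize the equivalent full table: 8 rows, only 4 distinct parameters.
table = {(a, b, c): p_y1(a, b, c)
         for a, b, c in itertools.product((0, 1), repeat=3)}
print(len(table))                        # 8
print(len(set(table.values())))          # 4
```

The saving grows with the number of parents: the tree stays small whenever most contexts render the remaining parents irrelevant.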

**17 hours to complete**

## Markov Networks (Undirected Models)

In this module, we describe Markov networks (also called Markov random fields): probabilistic graphical models based on an undirected graph representation. We discuss the representation of these models and their semantics. We also analyze the independence properties of distributions encoded by these graphs, and their relationship to the graph structure. We compare these independencies to those encoded by a Bayesian network, giving us some insight into which type of model is more suitable for which scenarios.


**7 videos**

**2 practice exercises**
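A tiny concrete instance (the factors are invented): an undirected chain A - B - C whose distribution is the product of edge factors, normalized by the partition function Z. Unlike a Bayesian network's CPDs, the factors are not probabilities themselves, which is why the explicit normalization is needed.

```python
import itertools

# Pairwise Markov network over the chain A - B - C (illustrative factors).
phi_AB = {(0, 0): 30, (0, 1): 5, (1, 0): 1, (1, 1): 10}
phi_BC = {(0, 0): 100, (0, 1): 1, (1, 0): 1, (1, 1): 100}

def unnorm(a, b, c):
    """Unnormalized measure: product of the edge factors."""
    return phi_AB[(a, b)] * phi_BC[(b, c)]

# Partition function: sum over all joint assignments.
Z = sum(unnorm(a, b, c) for a, b, c in itertools.product((0, 1), repeat=3))

def p(a, b, c):
    return unnorm(a, b, c) / Z

print(Z)                                                        # 4646
print(sum(p(*x) for x in itertools.product((0, 1), repeat=3)))  # sums to 1
```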

**21 hours to complete**

## Decision Making

In this module, we discuss the task of decision making under uncertainty. We describe the framework of decision theory, including some aspects of utility functions. We then talk about how decision making scenarios can be encoded as a graphical model called an Influence Diagram, and how such models provide insight both into decision making and the value of information gathering.


**2 practice exercises**
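A stripped-down decision problem in the influence-diagram spirit (the scenario and utilities are invented, not the course's example): one chance node, one decision node, and a utility node. Maximizing expected utility selects the action, and comparing against a perfectly informed decision maker yields the value of information.

```python
# Invented umbrella problem: chance node Weather, decision node Take,
# utility U(Weather, Take).
P_rain = 0.3
U = {("rain", "take"): 20, ("rain", "leave"): -100,
     ("sun",  "take"): 70, ("sun",  "leave"): 100}

def expected_utility(decision):
    return (P_rain * U[("rain", decision)]
            + (1 - P_rain) * U[("sun", decision)])

best = max(("take", "leave"), key=expected_utility)
print(best, expected_utility(best))            # take 55.0

# Value of perfect information: decide after observing the weather.
eu_informed = (P_rain * max(U[("rain", d)] for d in ("take", "leave"))
               + (1 - P_rain) * max(U[("sun", d)] for d in ("take", "leave")))
print(eu_informed - expected_utility(best))    # VPI = 21.0
```

The positive VPI quantifies how much a rational agent should pay for a perfect weather forecast before deciding.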

### Reviews

#### 4.7

##### Top reviews from PROBABILISTIC GRAPHICAL MODELS 1: REPRESENTATION

Overall very good quality content. PAs are useful but some questions/tests leave too much to interpretation and can be frustrating for students. Audio quality for the classes could also be improved.

The lectures were a bit too compact and unsystematic. However, if you also do a lot of reading of the textbook, you can learn a lot. Besides, the quizzes and programming tasks are of high quality.

Prof. Koller did a great job communicating difficult material in an accessible manner. Thanks to her for starting Coursera and offering this advanced course so that we can all learn...Kudos!!

The course was deep, and well-taught. This is not a spoon-feeding course like some others. The only downside were some "mechanical" problems (e.g. code submission didn't work for me).

Excellent course, the effort of the instructor is well reflected in the content and the exercises. A must for every serious student of decision theory or Markov random field tasks.

Superb exposition. Makes me want to continue learning till the very end of this course. Very intuitive explanations. Plan to complete all courses offered in this specialization.

I have actually earned three years of my life (at least) and one possible patent because of this course.

Thank you, Daphne Ma'am. God bless everybody associated with it.

Learned a lot. Lectures were easy to follow, and the textbook was able to more fully explain things when I needed it. Looking forward to the next course in the series.

## About the Probabilistic Graphical Models Specialization

## Frequently Asked Questions

When will I have access to the lectures and assignments?

Once you enroll for a Certificate, you get access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page, from which you can print it or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

Is financial aid available?

Learning Outcomes: By the end of this course, you will be able to

Apply the basic process of representing a scenario as a Bayesian network or a Markov network

Analyze the independence properties implied by a PGM, and determine whether they are a good match for your distribution

Decide which family of PGMs is more appropriate for your task

Utilize extra structure in the local distribution for a Bayesian network to allow for a more compact representation, including tree-structured CPDs, logistic CPDs, and linear Gaussian CPDs

Represent a Markov network in terms of features, via a log-linear model

Encode temporal models as a Hidden Markov Model (HMM) or as a Dynamic Bayesian Network (DBN)

Encode domains with repeating structure via a plate model

Represent a decision making problem as an influence diagram, and be able to use that model to compute optimal decision strategies and information gathering strategies

Honors track learners will be able to apply these ideas to complex, real-world problems
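The log-linear representation mentioned in the outcomes above can be sketched in a few lines (the features and weights below are invented): P(x) is proportional to exp(sum_k w_k * f_k(x)), so a Markov network is specified by a feature set and a weight vector.

```python
import itertools
import math

# Log-linear model over two binary variables (invented features/weights):
# P(a, b) proportional to exp(sum_k w_k * f_k(a, b)).
features = [
    lambda a, b: 1.0 if a == b else 0.0,   # agreement feature
    lambda a, b: float(a),                 # bias feature on a
]
weights = [1.5, -0.5]

def score(a, b):
    return math.exp(sum(w * f(a, b) for w, f in zip(weights, features)))

Z = sum(score(a, b) for a, b in itertools.product((0, 1), repeat=2))
probs = {(a, b): score(a, b) / Z
         for a, b in itertools.product((0, 1), repeat=2)}
print(probs)  # the positive agreement weight favors assignments with a == b
```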

Still have questions? Visit the Learner Help Center.