After you have come to terms with the basic probability rules and become acquainted with the probability distribution, it's time to meet its alter ego, the cumulative probability distribution. In this video, I'll explain how the two distributions relate to each other and, especially, how you can make probability statements or find important values of a random variable by using the cumulative probability distribution. Let's jump right into it.

Consider this simple discrete distribution. Can you answer the question: what would the probability be that X takes a value of either 2 or 3? The answer is obtained by adding up the probabilities that X is 2 and X is 3, because this is the probability of a union of disjoint events. So that is 0.7. All the probabilities listed in the table, or along the x-axis of the probability mass function, are disjoint, so the probability of any union of these events is simply the sum of their probability values. Similarly, the probability that X is greater than 1 is equal to one minus the probability that X is 1, which is 0.9 by the complement rule.

Now let's step up to the next idea. Based on a probability distribution, we can easily calculate the probabilities for values that are less than or equal to a given value. For example, the probability that X is less than or equal to 1 is 0.1. The probability that X is less than or equal to 2 is 0.1 plus 0.3, which is 0.4, and so on. The resulting probabilities are called cumulative probabilities. The list of all cumulative probabilities is called the cumulative probability distribution, or cumulative distribution function. A probability histogram for this cumulative probability distribution can be made as well, and looks as follows. Probability density functions, the probability distributions of continuous random variables, also have a corresponding cumulative distribution. Consider, for example, this probability density function. The corresponding cumulative distribution is given here.
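The calculations above can be sketched in a few lines of Python. The exact distribution shown in the video isn't fully specified here, so this uses a hypothetical pmf that matches the stated values (P(X=1)=0.1, P(X=2)=0.3, and P(X≤2)=0.4); the remaining probabilities are assumptions for illustration.

```python
# Hypothetical discrete distribution, consistent with the values in the video;
# P(X=3)=0.4 and P(X=4)=0.2 are assumed so the probabilities sum to one.
pmf = {1: 0.1, 2: 0.3, 3: 0.4, 4: 0.2}

# Union of disjoint events: P(X=2 or X=3) = P(X=2) + P(X=3)
p_2_or_3 = pmf[2] + pmf[3]  # 0.7

# Complement rule: P(X > 1) = 1 - P(X = 1)
p_gt_1 = 1 - pmf[1]  # 0.9

# Cumulative distribution: running sum of the probabilities,
# from the smallest value of X upward
cdf = {}
total = 0.0
for x in sorted(pmf):
    total += pmf[x]
    cdf[x] = total
# cdf is now {1: 0.1, 2: 0.4, 3: 0.8, 4: 1.0}
```

Note that the last cumulative value is always one, since all probabilities together must sum to one.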
An interesting aspect of this step is that the y variable changes from a probability density to a probability, because in a cumulative distribution it is the area under the probability density function, from the smallest value of x up to the value of interest, that is put on the y-axis. As you see, cumulative probability functions have continuously increasing values, starting at zero and rising to a maximum of one. The sum of the probabilities for all the values that the random variable can take is one.

The cumulative distribution, especially its graphical form, is very convenient because it can answer two questions. You can select a certain value of the random variable at the x-axis and then find which fraction of the observations will be lower than or equal to this value at the y-axis. Or, reversely, you can select a fraction at the y-axis and then find the corresponding threshold value at the x-axis. There is in fact a shorter way of saying that a fraction of the values falls below a threshold value, using the term quantile. For example, a threshold value below which 0.1 of the values are found is called the 0.1 quantile. So the cumulative probability distribution is in fact showing the quantiles of a random variable. You find, for example, at the cumulative probability of 0.5 the median value of the random variable, and at 0.25 the lower quartile. It is noteworthy that for symmetric probability distributions the median coincides with the mean. So for symmetric distributions, the mean is also found at the cumulative probability of 0.5.

Let me summarize what I hope you understood from this video. A cumulative probability of a random variable is the probability of obtaining a value lower than or equal to a threshold value. Considered in the other direction, a cumulative probability specifies a quantile of a random variable. For example, at the cumulative probability of 0.5, the median of the random variable is found.
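Reading the cumulative distribution in the reverse direction, from fraction to threshold value, can be sketched as a small lookup: the q-quantile of a discrete random variable is the smallest value whose cumulative probability reaches q. This uses the same hypothetical cumulative distribution as before, so the particular values are assumptions.

```python
# Hypothetical discrete CDF (cumulative probabilities per value of X)
cdf = {1: 0.1, 2: 0.4, 3: 0.8, 4: 1.0}

def quantile(cdf, q):
    """Smallest value x whose cumulative probability F(x) reaches q."""
    for x in sorted(cdf):
        if cdf[x] >= q:
            return x
    raise ValueError("q must lie between 0 and 1")

median = quantile(cdf, 0.5)           # 3, since F(2)=0.4 < 0.5 <= F(3)=0.8
lower_quartile = quantile(cdf, 0.25)  # 2, since F(1)=0.1 < 0.25 <= F(2)=0.4
```

This is exactly what you do visually when you select a fraction on the y-axis of the cumulative distribution and read off the threshold value on the x-axis.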
Just like a probability distribution, the cumulative probability distribution can exist in the form of a table, a graph, or an equation. It can be obtained by calculating the cumulative sum of the probabilities from the smallest up to the largest value of the random variable. And it is continuously increasing with increasing values of the random variable, starting at zero and rising to a probability of one. At the cumulative probability of 0.5, the median of the random variable is found, which for symmetric distributions coincides with the expected value, or mean.
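For continuous random variables, the cumulative probability is the accumulated area under the density up to the value of interest. A minimal sketch of that idea, assuming a uniform density on [0, 1] for illustration (not the density shown in the video), approximates this area with a simple Riemann sum:

```python
def density(x):
    # Illustrative uniform density: f(x) = 1 on [0, 1], zero elsewhere
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def cdf(x, steps=100_000):
    """Approximate the area under the density from 0 up to x."""
    if x <= 0.0:
        return 0.0
    dx = x / steps
    return sum(density(i * dx) * dx for i in range(steps))

# For this density the exact CDF is F(x) = x on [0, 1],
# so the median (cumulative probability 0.5) sits at x = 0.5
```

Because the uniform density is symmetric, the median found at cumulative probability 0.5 is also the mean, matching the summary above.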