Hello and welcome to our next lecture, which is about Markov chains. Markov chains, as well as renewal processes, are two classical examples of discrete-time stochastic processes. So, this is a model of a system which changes over discrete time according to some random mechanism. As you already know from our previous lectures, observations of a stochastic process are typically dependent. This property of dependence appears to be the most essential difference between the theory of stochastic processes and the other fields of stochastics which you learned before, I mean probability theory and statistics. In fact, in probability theory and statistics, data are assumed to be drawn from independent identically distributed random variables, and this property of independence gives rise to a lot of tasks, like studying limit laws for sums or constructing asymptotic confidence intervals. Markov chains are widely used for modeling one special type of dependence.

Let me start with a formal definition. A Markov chain is an integer-time process S_n, n = 0, 1, 2, and so on, which takes values in some space S (I will write this calligraphic S), known as the state space. This space should be countable. And for this Markov chain, the so-called Markov property should be fulfilled. Let me write it down:

P(S_n = j | S_{n-1} = i_{n-1}, ..., S_0 = i_0) = P(S_n = j | S_{n-1} = i_{n-1}).

This is the so-called Markov property, and it should be fulfilled for any states i_0, i_1, ..., i_{n-1}, j from S such that these conditional probabilities are defined correctly. This means it should hold for all states such that P(S_{n-1} = i_{n-1}, ..., S_0 = i_0) is not equal to zero.

So, S is known as the state space, and at any time moment n the system is in one of the states from this space. Normally we denote the states by integer numbers one, two, three, and so on, but in some situations it will be more convenient to use other notation for the elements of this space.

Let me now explain what kind of dependence is modeled by a Markov chain. Let me first mention that if the Markov property is fulfilled, then

P(S_n = i_n, S_{n-1} = i_{n-1}, ..., S_0 = i_0) = P(S_n = i_n | S_{n-1} = i_{n-1}, ..., S_0 = i_0) * P(S_{n-1} = i_{n-1}, ..., S_0 = i_0).

Now we apply the Markov property. It yields that the first conditional probability is equal to the same probability where only the condition S_{n-1} = i_{n-1} is kept, so we can cross out all the other conditions. Next we can apply the same argument to the remaining joint probability many times, and finally we get

P(S_n = i_n, ..., S_0 = i_0) = P(S_n = i_n | S_{n-1} = i_{n-1}) * P(S_{n-1} = i_{n-1} | S_{n-2} = i_{n-2}) * ... * P(S_1 = i_1 | S_0 = i_0) * P(S_0 = i_0).
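To make this factorization concrete, here is a minimal Python sketch; the three-state chain, its initial distribution mu, and the transition matrix P are all made up for illustration and do not come from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([0.5, 0.3, 0.2])      # initial distribution of S_0 (assumed)
P = np.array([[0.1, 0.6, 0.3],      # P[i, j] = P(S_n = j | S_{n-1} = i)
              [0.4, 0.4, 0.2],
              [0.3, 0.3, 0.4]])

def path_probability(path):
    """P(S_0 = i_0, ..., S_n = i_n) via the Markov factorization above."""
    prob = mu[path[0]]
    for i, j in zip(path[:-1], path[1:]):
        prob *= P[i, j]
    return prob

def simulate(n):
    """Draw one trajectory S_0, ..., S_n of the chain."""
    s = [rng.choice(3, p=mu)]
    for _ in range(n):
        s.append(rng.choice(3, p=P[s[-1]]))
    return s

print(path_probability([0, 1, 2]))  # mu[0] * P[0, 1] * P[1, 2] = 0.06
print(simulate(5))                  # one random trajectory of length 6
```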
Well, what does this equality mean? It means that the joint probability is equal to the product of such one-step conditional probabilities. Let me first mention that if S_n, S_{n-1}, ..., S_0 were independent, then this equality would also be true, but with all the conditional probabilities being in fact unconditional. So, this property basically means that the values S_n, S_{n-1}, ..., S_0 exhibit something like pairwise dependence: the joint distribution is built from pairs of consecutive random variables, so it is a kind of independence when you consider them pairwise. And this is exactly a kind of generalization of the independence property.

Let me also mention the meaning of the Markov property here. Basically it means that if we associate n with the present, n-1 with the most recent past event, n-2 with the second most recent past event, and so on, we get that the distribution at present, given the history of this process, is completely determined by the most recent event. So, only the most recent past event determines the present event. This is exactly the meaning of this property, and once more, this interesting property gives rise to a special kind of dependence, something like pairwise dependence. And it turns out this kind of dependence appears in many situations, both mathematical and real-life ones.

Let me now provide a couple of examples of Markov chains. Our first example is the so-called random walk, a very classical stochastic process. The random walk is defined as follows: at time moment zero, S_0 = 0, and then it is defined recursively at time point n as S_n = S_{n-1} + Psi_n, where Psi_1, Psi_2, and so on is a sequence of independent identically distributed random variables with the following distribution: Psi_n is equal to one with probability p and to minus one with probability 1 - p. The trajectory of this process is very simple: at any time moment one, two, three, and so on, it either increases by one or decreases by one.

Well, let me now show that the random walk is a Markov chain. In fact, let me consider the probability P(S_n = j | S_{n-1} = i_{n-1}). This probability is equal to p if j = i_{n-1} + 1, it is equal to 1 - p if j = i_{n-1} - 1, and it is equal to zero otherwise. If you now add here some more conditions, for instance S_{n-2} = i_{n-2} and so on down to S_0 = i_0, then you will get exactly the same answer. So the present is determined only by the most recent past event, and therefore the random walk is a Markov chain.

By the way, any renewal process is also a Markov chain by the same argument, and in that case we get an interesting Markov chain, because it passes through each state at most once. The random walk, on the other hand, is not a renewal process, because it can take negative values. So, once more, we conclude that the random walk is a Markov chain, and moreover, any renewal process is also a Markov chain, though an interesting one.
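Before moving on, here is a minimal Python sketch of the random walk; the chosen value of p and the empirical check of the transition probability are illustrative assumptions, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.5  # assumed step-up probability

def random_walk(n):
    """S_0 = 0; each step Psi is +1 with probability p, -1 with probability 1 - p."""
    steps = rng.choice([1, -1], size=n, p=[p, 1 - p])
    return np.concatenate(([0], np.cumsum(steps)))

walk = random_walk(100_000)

# Empirical check of the one-step transition probability: among all times
# with S_{n-1} = 0, the fraction of moves up to +1 should be close to p.
at_zero = walk[:-1] == 0
print((walk[1:][at_zero] == 1).mean())
```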
Well, let me provide some further examples. The second example is also a kind of classic: taxis at the airport. Let us imagine the following situation. There is a taxi station at the airport, and at every time moment exactly one taxi arrives at the station. We will assume that at moment zero there are no taxis and no passengers at all. If nobody is waiting for a taxi, then the taxi immediately leaves. If somebody is waiting for a taxi, then let me assume for simplicity that a taxi takes only one passenger and also immediately leaves. Let me denote by X_k the number of people waiting for a taxi at time moment k, and by Y_k the number of new people arriving at moment k.

Let me now show that the process X_k is in fact a Markov chain. In fact, if you consider the relation between X_{k-1} and X_k, you get the following formula:

X_k = Y_k + (X_{k-1} - 1)^+,

that is, X_k = Y_k if X_{k-1} = 0, and X_k = Y_k + X_{k-1} - 1 if X_{k-1} > 0. So you see, X_k is defined via X_{k-1} by this formula, and since the new arrivals Y_k are assumed independent of the past, this basically means that the value at present depends on the past only through the most recent event. Therefore X_k is a Markov chain, and so we get a second example of a Markov chain (a short simulation sketch of this queue is given below).

Let me provide one more example. This example shows that Markov chains appear much more often than you might think. The example is the following one. Assume that the process X_n is such that

P(X_n = j | X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_n = j | X_{n-1} = i_{n-1}, ..., X_{n-m} = i_{n-m}).

Here I slightly change the Markov property: I write the condition not only on the most recent past event but on some fixed number of past events, namely m past events, where m is some fixed natural number. So you have something like a window: the value at present depends on the m previous events. Okay, what can we say about this situation? There is no doubt that X_n itself is, in general, not a Markov chain, but what we can show in this case is that the vector process

S_n = (X_n, X_{n-1}, ..., X_{n-m+1}), defined for n = m - 1, m, and so on,

is in fact a Markov chain (a sketch of this construction also follows below). So, even in the case when the present depends not only on the most recent event but on the m most recent events, we can simply combine the values of this process and get a Markov chain. Therefore, Markov chains appear much more often than you might think.

Let me now discuss some basic properties of Markov chains. First of all, I would like to discuss how a Markov chain can be characterized in terms of matrices.
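As promised, here is a minimal Python sketch of the taxi-queue recursion X_k = Y_k + (X_{k-1} - 1)^+; the Poisson distribution for the arrivals Y_k is purely an assumption for illustration, since the lecture does not specify one.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_queue(n, rate=0.9):
    """X_0 = 0; at each step Y_k new passengers arrive and one taxi departs."""
    x = 0
    history = [x]
    for _ in range(n):
        y = rng.poisson(rate)      # Y_k: new arrivals (Poisson is an assumption)
        x = y + max(x - 1, 0)      # X_k = Y_k + (X_{k-1} - 1)^+
        history.append(x)
    return history

print(simulate_queue(10))
```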
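And here is a sketch of the window construction; the order-2 update rule g below is entirely hypothetical, chosen only to show how the vectors S_n = (X_n, ..., X_{n-m+1}) are formed from the original process.

```python
import numpy as np

rng = np.random.default_rng(3)
m = 2  # size of the window: the present depends on the m most recent values

def g(window):
    """Hypothetical order-m dynamic: the next value depends on the whole window."""
    return (sum(window) + rng.integers(0, 2)) % 2

x = [0, 1]                         # initial values X_0, ..., X_{m-1}
for n in range(m, 12):
    x.append(g(x[-m:]))            # X_n depends on X_{n-1}, ..., X_{n-m}

# Augmented states S_n = (X_n, ..., X_{n-m+1}), defined for n >= m - 1;
# this vector process is a first-order Markov chain.
s = [tuple(reversed(x[n - m + 1:n + 1])) for n in range(m - 1, len(x))]
print(s)
```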