So let me briefly summarize what was in this lecture. We have discussed that any Markov chain can be characterized by two types of representations: the matrix representation and the graphic representation. The most important objects in the matrix representation are, of course, the transition matrix P and the probability distribution p_k, which gives the probabilities that the Markov chain is in state 1, 2, and so on up to M at step number k. There is also a generalization of the transition matrix, namely the m-step transition matrix, and we have shown that the m-step transition matrix is equal to the matrix P multiplied by itself m times. As for the graphic representation, the most important point is that all states of a Markov chain can be clustered into classes of equivalence. Within one class of equivalence, there are two common characteristics: first, the states are either all recurrent or all transient; second, all states in the class share the same period. So the matrix representation is ideally suited for algebraic questions, for computing probabilities and conditional probabilities, while the graphic representation works very well if we would like to classify states and find the essential differences between various Markov chains. Nevertheless, the connection between the matrix and graphic representations is not clear at the moment. So let me consider one special type of Markov chain which is very important from both the theoretical and the practical point of view: the so-called ergodic Markov chain. Let me first give a definition in terms of the graphic representation. We say that a Markov chain is ergodic if it consists of only one class of equivalence and all elements of this class are recurrent and aperiodic.
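To make the matrix representation concrete, here is a minimal numpy sketch of the two computations mentioned above: the m-step transition matrix as P multiplied by itself m times, and the step-k distribution obtained from an initial distribution. The particular 3x3 matrix and the starting state are made up purely for illustration.

```python
import numpy as np

# An illustrative 3-state transition matrix (rows sum to 1); the numbers
# here are assumptions for the sketch, not from the lecture.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# m-step transition matrix: P(m) = P multiplied by itself m times.
m = 4
P_m = np.linalg.matrix_power(P, m)

# Distribution at step k (row-vector convention): p_k = p_0 @ P^k.
p0 = np.array([1.0, 0.0, 0.0])  # start in state 1 with certainty
p_k = p0 @ np.linalg.matrix_power(P, m)

print(P_m)
print(p_k)
```

Note that with p_0 concentrated on state 1, p_k is simply the first row of the m-step matrix, which is a quick sanity check on the row-vector convention.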
For instance, consider a Markov chain with six elements such that only transitions of the following type are possible: from each state you can only go to the next state. With no doubt, there is only one class of equivalence in this Markov chain, and all elements are recurrent, but the period of any state is equal to six, and therefore this chain is not ergodic. Nevertheless, if you add one more arc to this picture, for instance this arc, then the period of each state becomes equal to one. Indeed, if you take, for instance, this element, then you can return to it in six steps (one, two, three, four, five, six) and also in five steps (one, two, three, four, five). The greatest common divisor of five and six is equal to one, so this chain is now ergodic. If we now transfer this definition into the matrix representation, it turns out that the following proposition holds: a Markov chain is ergodic if and only if there exists some natural number m such that p_ij(m) is not equal to zero for any i and j from the state space. This proposition gives us a relation between the matrix and graphic representations: we defined the ergodic Markov chain only using the graphic representation, but we can transfer this definition to the matrix representation. An interesting point is that if a Markov chain is ergodic, then this property (let me denote it by star) is fulfilled for any m larger than or equal to (M − 1)² + 1, where capital M is the number of elements of the state space. Let me now consider the properties of ergodic Markov chains in more detail.
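The six-state example and the matrix criterion above can be sketched in a few lines of numpy. Since the transcript does not say exactly where the extra arc is drawn, the placement below (a shortcut from the fifth state back to the first, creating the five-step loop) is an assumption for illustration; the ergodicity test itself just checks whether some power of P up to (M − 1)² + 1 has all entries positive.

```python
import numpy as np

M = 6

# Pure 6-cycle: from each state you can only go to the next state.
P_cycle = np.zeros((M, M))
for i in range(M):
    P_cycle[i, (i + 1) % M] = 1.0

# Add one extra arc (assumed placement: a shortcut from state index 4
# back to state index 0), splitting that row's probability so it still
# sums to 1. This creates a 5-step loop alongside the 6-step cycle.
P_ergodic = P_cycle.copy()
P_ergodic[4, 5] = 0.5
P_ergodic[4, 0] = 0.5

def is_ergodic(P):
    """Check the matrix criterion: some P^m has all entries > 0,
    searching up to the bound (M - 1)^2 + 1 stated in the lecture."""
    n = P.shape[0]
    max_m = (n - 1) ** 2 + 1
    Pm = np.eye(n)
    for _ in range(max_m):
        Pm = Pm @ P
        if np.all(Pm > 0):
            return True
    return False

print(is_ergodic(P_cycle))    # the pure cycle has period 6
print(is_ergodic(P_ergodic))  # with the extra arc, gcd(5, 6) = 1
```

For the pure cycle, every power of P is a permutation matrix and never has all entries positive, so the test correctly rejects it; with the extra arc the chain is irreducible and aperiodic, and the bound (M − 1)² + 1 = 26 guarantees a strictly positive power is found.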