In the theory of stochastic processes there are several types of continuity. First of all, there is continuity of trajectories; we will discuss this when we look at Brownian motion. Secondly, there is stochastic continuity, and we will speak about this type when we discuss Lévy processes. But now I would like to show another type, which is called continuity in the mean-squared sense. The introduction of this type is mainly motivated by the fact that it can be easily checked by looking at the covariance function. So, the topic of this chapter is continuity in the mean-squared sense.

What does it mean? We say that a process Xt is continuous at the moment t0 in the mean-squared sense if Xt converges to Xt0 in the mean-squared sense as t goes to t0, that is, the mathematical expectation of (Xt − Xt0)² goes to zero as t goes to t0. This is quite a simple definition.

There is one essential fact which allows us to immediately check whether a process is continuous in this sense or not. Let me assume, mainly for simplicity, that the mathematical expectation of Xt is equal to zero. Then the following proposition holds. The first part: if the covariance function is continuous at the point (t0, t0), then the process Xt is continuous in the mean-squared sense at the point t = t0. On the other side: if the process Xt is continuous in the mean-squared sense at the points t = t0 and t = s0, then the covariance function is continuous at the point (t0, s0).

Let me show the first part. We know that the covariance function is continuous at (t0, t0), and we shall show that Xt is continuous in the mean-squared sense at the point t = t0. How to prove it? Let me consider the mathematical expectation of (Xt − Xt0)². I can represent this expectation as the expectation of Xt² minus twice the expectation of Xt·Xt0 plus the expectation of Xt0². The first term in this expression is equal to K(t, t), the second one is 2K(t, t0), and the third one is K(t0, t0). Since the function K is continuous at the point (t0, t0), we conclude that this sum converges to zero as t goes to t0, and therefore the process Xt is continuous in the mean-squared sense at the point t0.
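Since formulas are easy to lose in spoken form, here is the same definition and computation written out in LaTeX (my transcription of the argument above; nothing is assumed beyond E[Xt] = 0, as in the lecture):

```latex
% Definition of mean-square continuity at t0 and the computation from part one.
% Throughout, E[X_t] = 0 is assumed (as in the lecture), so K(t,s) = E[X_t X_s].
\[
  X_t \text{ is continuous at } t_0 \text{ in the mean-squared sense}
  \iff
  \mathbb{E}\bigl[(X_t - X_{t_0})^2\bigr] \to 0 \text{ as } t \to t_0 .
\]
\[
  \mathbb{E}\bigl[(X_t - X_{t_0})^2\bigr]
  = \mathbb{E}[X_t^2] - 2\,\mathbb{E}[X_t X_{t_0}] + \mathbb{E}[X_{t_0}^2]
  = K(t,t) - 2K(t,t_0) + K(t_0,t_0),
\]
% and the right-hand side tends to K(t_0,t_0) - 2K(t_0,t_0) + K(t_0,t_0) = 0
% whenever K is continuous at the point (t_0, t_0).
```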
Okay, now to the second item. Let me consider the increment of the function K at the point (t0, s0), that is, K(t, s) minus K(t0, s0). Here I will use the following method: I will add and subtract the same expression, namely K at the point (t0, s), and divide the result into two summands. The first summand is K(t, s) minus K(t0, s), and the second summand is K(t0, s) minus K(t0, s0). I claim that both of these expressions tend to zero as (t, s) goes to (t0, s0). Why is this so? Let me consider just the first expression, K(t, s) minus K(t0, s). This is the mathematical expectation of (Xt − Xt0) multiplied by Xs; here I used the assumption that the mathematical expectation of Xt is equal to zero. Now I would like to apply the well-known Cauchy–Schwarz inequality: if I consider the absolute value of this mathematical expectation, I get that it is less than or equal to the square root of the variance of the first random variable multiplied by the square root of the variance of the second. Therefore we finally get that it is less than or equal to the square root of the mathematical expectation of (Xt − Xt0)² multiplied by the square root of the mathematical expectation of Xs². Now I apply the fact that the process Xt is continuous in the mean-squared sense at the point t0, and therefore the first factor of this product tends to zero as t goes to t0. The same is true for the second summand, because there we use that the process Xt is continuous in the mean-squared sense at the point s0. Finally, we get that both of these expressions tend to zero, and therefore the proposition is proven.

This proposition has a very interesting corollary, which gives us a rather unusual property of the covariance function. The corollary is the following one: it turns out that the covariance function is continuous at every point (t0, s0) if and only if it is continuous on the diagonal, that is, at every point of the form (t0, t0). This is a rather strange property for a function of two real variables; in fact, it holds for any covariance function, but it is not so simple to provide another example where such a situation appears. The proof of this corollary is very simple: we just have to look at the proposition. Indeed, let me assume that the function K(t, s) is continuous on the diagonal. Then, according to item number one, the process Xt is continuous in the mean-squared sense at every time point t = t0. But then we can apply the second item of the proposition and get that in this case the function K is continuous at every point (t0, s0). Therefore, from the statement that K is continuous on the diagonal, we immediately get that K is continuous everywhere. The inverse statement is obviously also true. So, we have proven this corollary.

This is basically all that I would like to tell you today, and I invite you to attend our next lecture.
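A short supplement, not part of the lecture itself: a minimal numerical sketch of how the criterion from part one is used, taking standard Brownian motion (mentioned at the beginning) as the example process; its mean is zero and its covariance function is K(t, s) = min(t, s), which is clearly continuous on the diagonal. The helper name ms_increment below is my own, introduced only for this illustration.

```python
import numpy as np

# Covariance function of standard Brownian motion (mean zero): K(t, s) = min(t, s).
def K(t, s):
    return np.minimum(t, s)

# E[(X_t - X_{t0})^2] expressed purely through the covariance function,
# exactly as in the proof of part one of the proposition.
def ms_increment(t, t0):
    return K(t, t) - 2 * K(t, t0) + K(t0, t0)

t0 = 1.0
for h in [0.1, 0.01, 0.001, 1e-4]:
    # For Brownian motion this quantity equals h, so it tends to 0 as t -> t0:
    # the process is continuous at t0 in the mean-squared sense.
    print(h, ms_increment(t0 + h, t0))
```

By the corollary, the same diagonal check also tells us that K(t, s) = min(t, s) is continuous at every point (t0, s0), not only on the diagonal.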