So we have proven the formula for the density of the first inter-arrival time xi_1: it is equal to lambda(t)*exp(-Lambda(t)), where I use t for the argument just to stress that we are speaking about time. Now I am going to prove the following formula, which gives the conditional density of the random variable xi_2 given xi_1. I claim that this conditional density is equal to lambda(t+s)*exp(-Lambda(t+s) + Lambda(s)). Why is this formula true? Let me box it.

To show this formula, we should first find the joint density of xi_1 and xi_2, and then divide this joint density by the density of xi_1. So let me start with the joint density of xi_1 and xi_2, and for this let me find a closed form for the joint distribution function: F_{xi_1, xi_2}(s, t) is the probability that xi_1 <= s and xi_2 <= t.

I will use a kind of total probability law. More precisely, I will write this probability as the integral from 0 to s of the probability that xi_1 <= s and xi_2 <= t, given that xi_1 is equal to some fixed number y, multiplied by the density of xi_1 at the point y, dy. Of course, we can simply cross out the event {xi_1 <= s}, because y is smaller than s. Now we can express the remaining conditional probability as the probability that the increment of the process N from y to t+y is at least one; formally, I should still write it given that xi_1 = y, and multiply by the density of xi_1. In this conditional probability, the event depends on the increment of N after time y, while the condition is related to the behaviour of the process N before time y. Therefore these two events are independent, and the conditional probability is equal to the unconditional one.
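The steps above can be written compactly in display form (a sketch in my notation, with p_{xi_1} denoting the density of xi_1; the last equality uses the independence of increments of N):

```latex
\begin{align*}
F_{\xi_1,\xi_2}(s,t)
  &= \mathbb{P}(\xi_1 \le s,\ \xi_2 \le t)
   = \int_0^s \mathbb{P}(\xi_2 \le t \mid \xi_1 = y)\, p_{\xi_1}(y)\, dy,\\
\mathbb{P}(\xi_2 \le t \mid \xi_1 = y)
  &= \mathbb{P}(N_{t+y} - N_y \ge 1 \mid \xi_1 = y)
   = \mathbb{P}(N_{t+y} - N_y \ge 1).
\end{align*}
```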
Finally, we can substitute the exact form of this probability: it is equal to 1 - exp(-Lambda(t+y) + Lambda(y)). And we can also plug in the formula which is already proven for the density of xi_1: lambda(y)*exp(-Lambda(y)). So the joint distribution function is the integral from 0 to s of (1 - exp(-Lambda(t+y) + Lambda(y))) * lambda(y)*exp(-Lambda(y)) dy.

From this formula we can definitely find the joint density of xi_1 and xi_2. To do this, we should take two derivatives: one with respect to t and another with respect to s. It is simpler to take first the derivative with respect to s, because s appears as the upper limit of this integral, and according to a well-known fact this derivative is equal to the integrand with s substituted for y. The first derivative is therefore (1 - exp(-Lambda(t+s) + Lambda(s))) * lambda(s)*exp(-Lambda(s)).

Now we should take the derivative with respect to t of this expression. Here we have a difference of two functions, and the first one does not depend on t at all, so its derivative is equal to zero. Differentiating the second one with respect to t, we get lambda(t+s)*exp(-Lambda(t+s) + Lambda(s)), multiplied by lambda(s)*exp(-Lambda(s)); this is the joint density.

Now, if we divide this expression by the density of xi_1 at the point s, which is lambda(s)*exp(-Lambda(s)), we get exactly the product lambda(t+s)*exp(-Lambda(t+s) + Lambda(s)), and therefore the boxed formula is proven. This observation completes the proof.
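As a small sanity check (not part of the lecture), the boxed formula can be verified numerically for a concrete intensity. The sketch below assumes lambda(t) = 2t, so Lambda(t) = t^2, and simulates the inhomogeneous Poisson process by the standard time change T_k = Lambda^{-1}(S_k), where S_k are the arrival times of a rate-1 homogeneous process. It then compares the empirical conditional survival probability P(xi_2 > t | xi_1 near s) with the value exp(-Lambda(t+s) + Lambda(s)) predicted by the boxed formula; the particular numbers s, t, and the bin width are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Intensity lambda(t) = 2t, cumulative intensity Lambda(t) = t^2,
# so the inverse time change is Lambda^{-1}(u) = sqrt(u).
n = 200_000
e1 = rng.exponential(size=n)      # rate-1 gap S_1
e2 = rng.exponential(size=n)      # rate-1 gap S_2 - S_1
t1 = np.sqrt(e1)                  # first arrival  T_1 = Lambda^{-1}(S_1)
t2 = np.sqrt(e1 + e2)             # second arrival T_2 = Lambda^{-1}(S_2)
xi1, xi2 = t1, t2 - t1            # inter-arrival times xi_1, xi_2

# Condition on xi_1 falling in a narrow bin around s (approximates xi_1 = s).
s, delta, t = 0.5, 0.02, 0.5
in_bin = (xi1 >= s) & (xi1 < s + delta)
empirical = np.mean(xi2[in_bin] > t)

# The boxed formula gives P(xi_2 > t | xi_1 = s) = exp(-Lambda(t+s) + Lambda(s)).
predicted = np.exp(-(t + s) ** 2 + s ** 2)
print(f"empirical {empirical:.3f} vs predicted {predicted:.3f}")
```

The two numbers agree up to Monte Carlo noise and the small bias introduced by the finite bin width delta.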