0:00

Hello, and welcome back. In this video we're going to show a different technique of image inpainting, one based on the calculus of variations. Instead of starting with a partial differential equation from the very beginning, we're going to start with an energy formulation that, through the Euler-Lagrange equation, is transformed into a differential equation.

Now, we're going to keep the same concepts that we learned before. If we have a region of missing information, we're going to try to mold in the continuation of the edges, as we represent here, and then fill in the color.

We'll do both of them at the same time, as we are going to see in a second, but we want to keep that important concept of continuing edges and then letting the color flow in, as we say, like water. Actually, why do I say like water?

Because some of these equations are very related to the same type of mathematical equations that model the motion of fluids and gases in nature. So this analogy of letting the pixel values flow like water is not an artificial analogy: these are the same type of equations, and what we are doing here are transport equations. So we are going to derive a variational formulation that does this type of process.

Now, let us start by assuming that somebody came and already told us the boundaries.

Somebody basically said: hey, I don't really know what happened in there, but here are the boundaries, I'm going to extend them for you. And basically, they have given us the normalized gradient at every single pixel inside the region that we need to inpaint. Remember the normalized gradient: we basically say that theta is the gradient divided by its magnitude. Now, we have to be a bit careful when the magnitude is zero; that says there is no gradient. So we basically leave that aside for a second.
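As a quick illustration, here is a minimal numpy sketch of that definition; the helper name and the small eps used to sidestep the zero-magnitude (flat) pixels are my own choices, not from the lecture.

```python
import numpy as np

def normalized_gradient(I, eps=1e-8):
    """theta = grad(I) / |grad(I)|; the small eps sidesteps the
    zero-magnitude (flat) pixels mentioned in the lecture."""
    Iy, Ix = np.gradient(I.astype(float))  # axis 0 is y, axis 1 is x
    mag = np.sqrt(Ix**2 + Iy**2)
    return Ix / (mag + eps), Iy / (mag + eps)

# On a ramp that grows left to right, theta is (1, 0) everywhere:
I = np.tile(np.arange(8.0), (8, 1))
tx, ty = normalized_gradient(I)
```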

So, this is what we were given: somebody gave us basically the gradient. And if you plug that in here, you get the gradient squared divided by its magnitude, which equals the gradient magnitude. The goal now is to find the image inside the region to be inpainted that is consistent with this gradient, and we do that with a variational formulation. We basically are going to optimize for the image in such a way that it is consistent with the gradient that we were given. That consistency is exactly what we have here.

When the image satisfies this, this becomes zero, and then what's inside here is as small as possible. Once again, the basic idea is: you find an image such that, when you take its gradient and normalize it, you get what was given to you. So you look for consistency. And the integral is over the region to be inpainted and a small band around it. From that band we basically take the information to propagate in; in that band we have the real image, and the given gradients match those of the real image. And that's why, as I said before, you basically optimize in the region to be inpainted and a small band around it.

So, this is your equation. You're looking for consistency of the image with, basically, the gradients that you were given. The Euler-Lagrange of this is what we have here. That's the Euler-Lagrange of this equation, and, as we have seen many times, we basically deform the image in time according to the Euler-Lagrange and then we go to steady state. When we get to steady state, we basically have that the divergence (remember, this is the divergence of this vector) of the normalized gradient of the image is equal to the divergence of the normalized gradient that we were given. And then we get an image that is consistent with the information that we were given. That's exactly what we wanted.
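That descent-to-steady-state step can be sketched as follows. This is a crude explicit scheme under assumed step size, iteration count, and eps (none of which come from the lecture), not the solver used in the actual work.

```python
import numpy as np

def divergence(fx, fy):
    # div f = d(fx)/dx + d(fy)/dy (axis 0 is y, axis 1 is x)
    return np.gradient(fx, axis=1) + np.gradient(fy, axis=0)

def inpaint_given_theta(I, mask, theta_x, theta_y,
                        n_iter=500, dt=0.1, eps=1e-8):
    """Evolve I_t = div(grad I / |grad I|) - div(theta) inside the mask;
    at steady state the divergence of the normalized gradient of the
    image matches the divergence of the given theta."""
    I = I.astype(float).copy()
    div_theta = divergence(theta_x, theta_y)
    for _ in range(n_iter):
        Iy, Ix = np.gradient(I)
        mag = np.sqrt(Ix**2 + Iy**2) + eps
        update = divergence(Ix / mag, Iy / mag) - div_theta
        I[mask] += dt * update[mask]  # only the hole evolves
    return I

# Tiny demo: a ramp with a punched-out hole, and the ramp's true theta (1, 0).
ramp = np.tile(np.arange(10.0), (10, 1))
hole = np.zeros((10, 10), dtype=bool)
hole[4:6, 4:6] = True
corrupted = ramp.copy()
corrupted[hole] = 0.0
restored = inpaint_given_theta(corrupted, hole,
                               np.ones((10, 10)), np.zeros((10, 10)))
```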

Now, if we do that, for example, for this image: we basically remove the I. There's no I here; we block it. But remember, somebody gave us the gradients, the normalized gradients, which is equivalent to giving us basically the edges. And then we can recover. The gray values are not exactly as we want them; we are going to repair that very soon. But you can observe the basic geometry here, and that's because somebody gave us theta. Somebody gave us the normalized gradient. That's a first step. Now, we need to basically obtain those normalized gradients ourselves, because in a real scenario we don't have anything in there. And here is the complete equation.

Here is what we had before: somebody gave us theta, and we recover an image that is consistent with theta. That is this term. What this term is doing is exactly helping us propagate the gradients inside the region to be inpainted; I'm going to explain that more in just one second. Again, the integral is as before, over this same region, and a, b, and c are just parameters that control the weight between these two terms. And we are minimizing not only over I, as in the previous slide; we also minimize over theta. If we had theta, this would be all we need, but here we're building the edges and the image consistent with the edges at the same time. Here is the consistency, and here is building the edges. And here, what we have is a smooth continuation of edges, as we learned from professional restorers.

Let us look at this term for a second. If theta is the normalized gradient.

Â 6:54

As we write it here, what is the divergence of this? We saw that in the previous week. What's the divergence of the normalized gradient? Let's think for a second. That's the curvature. This term is the curvature. Of what? Of the level lines of my image, which are kind of edges. So we're saying: propagate the edges inside in such a way that the curvature doesn't go crazy. If the curvature basically goes crazy, then we're going to be paying a lot of penalty for that. p is a parameter that says how much penalty we want to pay for that. So basically, we're doing a smooth propagation of edges in such a way that the curvature is relatively mild inside, and at the same time, we're recovering an image that is consistent with those edges that we're propagating.
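A small numerical check of that fact: for an image whose level lines are circles of radius r, the divergence of the normalized gradient should come out close to 1/r. The grid size and the radii probed are my own choices for the demo.

```python
import numpy as np

# Level lines of the distance-to-center image are circles of radius r,
# and a circle of radius r has curvature 1/r, so div(grad I / |grad I|)
# should be close to 1/r away from the singular center pixel.
y, x = np.mgrid[-32:33, -32:33].astype(float)
I = np.sqrt(x**2 + y**2)

Iy, Ix = np.gradient(I)
mag = np.sqrt(Ix**2 + Iy**2) + 1e-8
kappa = np.gradient(Ix / mag, axis=1) + np.gradient(Iy / mag, axis=0)

print(kappa[32, 32 + 20])  # on the circle of radius 20: about 1/20
```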

This term is there to make everything even smoother and more regular, and it has a mathematical justification. With this term, for the complete formulation you can prove a lot of beautiful math; without it, things become a bit more difficult and the problem is what is called ill-posed: it's not very well defined. So it's a technical term that we add to make this equation actually behave well. Now, how do we solve this?

Euler-Lagrange equations. But now we have two unknowns, theta and the image. It's exactly the same: you keep I constant and compute the Euler-Lagrange for theta, which gives you one equation; then you keep theta constant and compute the Euler-Lagrange for I, which gives you the second equation. So instead of one equation, as before when we knew theta, you get two equations: one that is evolving I, the other that is evolving theta. And you solve both of them at the same time. So it's like fixing one for a short time and solving for the other; then fixing the other and solving for the first; and then you iterate. And you get this.
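The alternation can be sketched roughly as below. This only illustrates the fix-one-evolve-the-other structure: the theta refresh used here (smooth and renormalize the gradient of the current image) is a stand-in I chose, not the actual Euler-Lagrange equation for theta, and all the numerical parameters are assumptions.

```python
import numpy as np

def box_blur(f):
    # 3x3 periodic average; a stand-in for the smoothness pressure on theta
    return sum(np.roll(np.roll(f, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

def alternating_inpaint(I, mask, n_outer=50, n_inner=10, dt=0.1, eps=1e-8):
    I = I.astype(float).copy()
    Iy, Ix = np.gradient(I)
    mag = np.sqrt(Ix**2 + Iy**2) + eps
    tx, ty = Ix / mag, Iy / mag
    for _ in range(n_outer):
        # Freeze theta, take a few descent steps in I (consistency term).
        div_theta = np.gradient(tx, axis=1) + np.gradient(ty, axis=0)
        for _ in range(n_inner):
            Iy, Ix = np.gradient(I)
            mag = np.sqrt(Ix**2 + Iy**2) + eps
            upd = (np.gradient(Ix / mag, axis=1)
                   + np.gradient(Iy / mag, axis=0) - div_theta)
            I[mask] += dt * upd[mask]
        # Freeze I, refresh theta inside the hole: smoothed, renormalized
        # gradient of the current image (mild curvature by construction).
        Iy, Ix = np.gradient(I)
        sx, sy = box_blur(Ix), box_blur(Iy)
        mag = np.sqrt(sx**2 + sy**2) + eps
        tx[mask], ty[mask] = (sx / mag)[mask], (sy / mag)[mask]
    return I

ramp = np.tile(np.arange(12.0), (12, 1))
hole = np.zeros((12, 12), dtype=bool)
hole[5:7, 5:7] = True
corrupted = ramp.copy()
corrupted[hole] = 0.0
filled = alternating_inpaint(corrupted, hole)
```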

It's a coupled partial differential equation in both I and theta, but it is very elegant, because we can think about what we want and then design a variational formulation that achieves it for us. We want the image to be consistent with the edges: we designed for that. We want the edges to continue smoothly, and we know smoothness is curvature: we designed for that too. We put that into an energy that penalizes for not achieving what we want, then we compute the Euler-Lagrange and we solve that equation. Let us see some examples.

Again, we start with artificial examples. We block this line, and this is what we get. By blocking it and then running this equation, we smoothly continue it: a very nice continuation. A similar example to what we had before: we block this and we get a smooth and nice continuation of it. We do the same here: we block this cross and we get a smooth continuation. Now you may ask: why did we get a smooth continuation of the bright region and not the dark one? This is a deterministic algorithm; when you implement it, it basically chooses one over the other. There's no way for the algorithm to know which one is the one that you might want.

Here we have the famous @ symbol. We block it and, look, we don't get the @ symbol back; we get this. Now, this is an example like the chair I showed at the very beginning of this week. The computer doesn't know the @ symbol. The computer is just looking for the minimal-energy completion. And the completion, the filling-in, the inpainting, is perfectly fine; it's just not the @ symbol we started from. But remember, image inpainting is trying to give you back something that looks okay, and there's no way for the computer to know that what you wanted was to get back the @ symbol rather than this result. So you get a perfectly fine completion; actually, this has lower energy, according to the energy that we just defined, than the @ symbol, and that's why it went to that one. It's perfectly fine, but it lacks the higher-level information about the @ symbol, which cannot be achieved with these types of local algorithms for inpainting. It works nicely for artificial examples.

Let's see if it works for real ones. Here is a nice picture of Groucho Marx, and we have regions that we want to inpaint. Here we kind of see the evolution, and this is the result: a nice recovery, filling in along those regions. Another example, as we have seen before: we have overlaid letters and we get the letters removed. So basically the image pixel values are flowing in by solving this variational formulation. Now, those are simple examples.

And I want to conclude these examples by going back to the very beginning, when I said we are inpainting structure and colors. What about the noise, the granularity of the image? This is one way of doing that, and even going beyond it. You start from an image, and these are the regions that we want to inpaint. You first take this image and decompose it into two parts (I'm going to say in a second how we do that; let me just explain the diagram first). One part is smooth: it has no granularity, no texture at all. And the other part is where all the texture has gone.

Okay? Now you inpaint the structure part with the techniques I just showed you: you basically continue the boundaries and continue these flat colors inside, and you get a very beautiful continuation. For the texture, the granularity, you use a technique that is designed to inpaint granularity; we're going to explain that technique in the next video. So see, for example, how nicely the texture has been continued here. And then you add those two parts back together, and you get this great reconstruction that has the structure continued, the colors continued, and also the texture and the granularity continued. Here are the structure and the colors; here is the texture. How do we do this decomposition?

There are a few techniques out there that follow something developed by Yves Meyer, just to give one example, and we get this decomposition into something which is piecewise smooth, so it does not have a lot of oscillations and variations, and something that has all the oscillations and variations. You inpaint here with the techniques that I just showed you: either the partial-differential-equation-based one or the variational one, for example. You inpaint the texture, the granularity, with something we are going to discuss next, which is kind of a cut-and-paste type of technique. And then you add them back, and you get a beautiful reconstruction.
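The decompose-inpaint-add pipeline can be sketched as follows. The repeated box blur is a crude stand-in for a Meyer-style cartoon-plus-texture decomposition, and the inpainting function names in the trailing comments are hypothetical placeholders for the two techniques.

```python
import numpy as np

def smooth(f, n=10):
    # Repeated 3x3 periodic box blur: a crude stand-in for the
    # Meyer-style cartoon (structure) component.
    for _ in range(n):
        f = sum(np.roll(np.roll(f, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return f

def decompose(I, n=10):
    """Structure + texture split; by construction the two parts
    add back exactly to the original image."""
    structure = smooth(I.astype(float), n)
    texture = I - structure
    return structure, texture

# Demo: a smooth ramp plus a checkerboard standing in for texture.
I = np.add.outer(np.arange(16.0), np.arange(16.0))
I = I + 0.5 * (np.indices(I.shape).sum(axis=0) % 2)
structure, texture = decompose(I)

# Pipeline from the lecture (function names hypothetical):
#   structure_filled = variational_inpaint(structure, mask)  # this video
#   texture_filled   = texture_synthesis(texture, mask)      # next video
#   result = structure_filled + texture_filled
```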

And you get everything in one image. So in the next video I'm going to talk a bit more about this type of cut-and-paste technique. You actually already know it, and I'm going to remind you what it is in the next video. I'm looking forward to that. Thank you very much.
