The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).


A course from Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming



From this lesson

Week 1

Two motivating applications; selected review; introduction to greedy algorithms; a scheduling application; Prim's MST algorithm.

- Tim Roughgarden, Professor

Computer Science

So now let's turn our attention to proving the correctness of this greedy algorithm that we devised, which purportedly minimizes the sum of the weighted completion times. Let me remind you of the formal correctness claim. The claim is that our second greedy algorithm, the one which computes for each job the ratio of its weight to its length and sorts the jobs in decreasing order of that ratio, is always correct. For every possible input, it outputs the sequence of jobs which minimizes the weighted sum of completion times. And as promised, it wasn't hard to devise the greedy algorithm, and it's certainly not hard to analyze its running time, which is O(n log n), the same as sorting. But it is quite tricky to prove it correct.
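To make the algorithm and its objective concrete, here is a minimal sketch in Python (my own illustration, not from the lecture), assuming jobs are given as (weight, length) pairs; the function names are hypothetical.

```python
# A sketch of the greedy algorithm under discussion: schedule jobs in
# decreasing order of weight/length, then evaluate the objective,
# the weighted sum of completion times.

def greedy_schedule(jobs):
    """Sort (weight, length) pairs by decreasing weight-to-length ratio."""
    return sorted(jobs, key=lambda job: job[0] / job[1], reverse=True)

def weighted_completion_time(schedule):
    """Sum over jobs of weight * completion time, in the given order."""
    total, elapsed = 0, 0
    for weight, length in schedule:
        elapsed += length          # this job completes at time `elapsed`
        total += weight * elapsed
    return total

jobs = [(3, 5), (1, 2), (4, 1)]                 # (weight, length)
order = greedy_schedule(jobs)                   # sorting dominates: O(n log n)
print(order, weighted_completion_time(order))   # [(4, 1), (3, 5), (1, 2)] 30
```

The sort is the entire algorithm, which is exactly why the running-time analysis is easy and all the work lies in the correctness proof.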

The way we're going to do that is going to be our first example of what's called an exchange argument, which is one of the few recurring principles in the correctness proofs of greedy algorithms. I'm actually going to give you proofs of two different versions of this claim, and both will make use of an exchange argument in slightly different ways. For starters, in this video and the next one, I'm going to make a simplifying assumption: that there are no ties amongst the ratios, that each job has its own distinct ratio of its weight to its length. In this case, we will be able to get away with a proof by contradiction. So on this slide let me give you the high-level plan of how this proof is going to go; we will start delving into the details on the next slide.

So at a high level, we're going to begin by fixing an arbitrary instance, by which I mean just a description of the weights and the lengths. Remember, we have to prove that our algorithm is always correct, so we fix an arbitrary instance and prove correctness on that arbitrary instance. As I said, for the case with no ties, we're going to proceed by contradiction. Remember, this means we assume that what we're trying to prove is false, and from that we derive something which is obviously false or inconsistent. So what would it mean to assume that this claim is false? It means there exists an instance for which this greedy algorithm does not produce an optimal solution, for which there's some other solution, not output by the greedy algorithm, which is better than the greedy algorithm's. So let me just give you some notation to set this up. We're going to let sigma denote the greedy schedule. And if our claim is false, that means sigma is not an optimal schedule; there's some other one which is better, so call this better, optimal schedule sigma star.

So, to complete a proof by contradiction, we need to derive something which is obviously false, and the way we're going to do that here might initially strike you as a little weird, but it turns out to work really well in this context. From this assumption that the greedy algorithm is not optimal, and that there is a better schedule sigma star, we're actually going to exhibit yet another schedule which is even better than sigma star, with a strictly smaller objective function value than sigma star has. Why is that a contradiction? Well, by assumption, sigma star is optimal; so if you show that there is something even better than sigma star, then sigma star is not optimal, and that completes the proof by contradiction.

So now let's start filling in the details of this proof plan and making it rigorous. As I said, in this video and the next we're going to be assuming that all of the ratios are distinct. In general, of course, that need not be true, and I'll give you a separate argument to handle the case of ties. I'm going to make a second assumption, but unlike the first assumption, the second one has no content; it's just an assumption about notation. I'm going to assume, by renaming jobs, that job number one is the one with the highest ratio, job number two is the one with the second highest ratio, and so on, with job n being the one with the smallest ratio. As a consequence of this switch in notation, the greedy schedule is very simple to describe: it just schedules job one first, then job two second, then job three third, and so on, all the way up to job n.
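The renaming step can be pictured in a few lines of Python (my own illustration; the job names are hypothetical): after sorting by decreasing ratio, job 1 is the highest-ratio job and the greedy schedule is simply 1, 2, ..., n.

```python
# Rename jobs so that index order equals decreasing-ratio order.
jobs = {"a": (3, 5), "b": (1, 2), "c": (4, 1)}   # name -> (weight, length)

by_ratio = sorted(jobs, key=lambda name: jobs[name][0] / jobs[name][1],
                  reverse=True)                  # highest ratio first
renamed = {i + 1: jobs[name] for i, name in enumerate(by_ratio)}

# Under the new names, the greedy schedule is just 1, 2, ..., n.
print(list(renamed))   # [1, 2, 3]
```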

Okay, so we have one assumption which is not without loss of generality, and we'll have a separate argument for handling ties. We have a second assumption which is without loss of generality; it's just an agreement amongst friends who want to minimize notation. And now, let's actually derive something with content.

So, given that the greedy schedule is just the jobs in order, one, two, three, all the way up to n, and given our assumption that the greedy solution is not optimal, and that instead there's some other, distinct optimal schedule sigma star, I claim that sigma star must contain consecutive inverted jobs. That is, somewhere in the schedule sigma star, I can isolate a pair of jobs, one executed right after the other, such that the earlier of those two consecutive jobs has the larger index. I'm going to call these jobs i and j, with i being the earlier one. So again, by virtue of the optimal schedule sigma star being something other than the schedule one, two, three, up to n, there must be two jobs somewhere in the schedule, executed in a row, one after the other, such that the earlier job i has a higher index than the subsequent job j. Why is this true? Well, the reasoning is that only one schedule has the property that indices only go up as you go from the earliest job to the latest job: the only way that indices always go up is if you schedule the jobs one, two, three, all the way up to n. There is no other schedule with the property that indices always go up. This is an observation that's going to be important in the rest of the proof, so make sure you pause and give yourself enough time to stare at it and convince yourself that it is in fact true: any schedule other than one, two, three, all the way through n has to have consecutive jobs with the earlier one having a higher index than the later one.
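The observation is also easy to sanity-check by brute force; the following sketch (my own, not from the lecture) verifies it for every schedule of up to six jobs.

```python
# Brute-force check of the observation: every permutation of 1..n other than
# the identity contains an adjacent pair whose earlier entry is larger.

from itertools import permutations

def has_adjacent_inversion(schedule):
    """True if some job is immediately followed by a lower-indexed job."""
    return any(earlier > later
               for earlier, later in zip(schedule, schedule[1:]))

for n in range(1, 7):
    identity = tuple(range(1, n + 1))
    for perm in permutations(identity):
        # only the identity schedule has indices that always go up
        assert has_adjacent_inversion(perm) == (perm != identity)

print("observation holds for all schedules with n up to 6")
```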

So I'm now in a position to explain the exchange in the exchange argument. Let me just distill the two key points from the discussion so far. First of all, we have changed notation so that ratios are decreasing with index, and so the schedule one, two, three, up to n is exactly the schedule that the greedy algorithm will output. And then, assuming that the optimal schedule sigma star is something else, we know it has consecutive jobs with the earlier one having a higher index. Keep in mind our high-level proof plan from the first slide of this video, where we're doing a proof by contradiction: we need to derive a contradiction, and what we're going to do is exhibit a schedule even better than sigma star, thereby contradicting its purported optimality.

So how do we do that? We do that with an exchange. This exchange is going to take place purely as a thought experiment. We're going to take this purportedly optimal schedule sigma star, and we're going to switch the order of just the two jobs i and j, leaving all of the other jobs unchanged. So sigma star consists of various jobs, call them "stuff" collectively; then next comes job i, and immediately after that job j; and then there are possibly some more jobs that get executed after j. And remember, we observed that we can choose i and j so that i has a higher index than j despite being scheduled earlier. Then we execute this exchange.

The stuff before i and j is the same as before, and the more stuff after j and i is the same as before, but the two jobs i and j now occur in the opposite order.

this exchange? What are the costs?

What are the benefits? That's how we'll begin, the next video .