The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).

From the course by Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming

From the lesson

Week 2

Kruskal's MST algorithm and applications to clustering; advanced union-find (optional).

- Tim Roughgarden, Professor, Computer Science

In this video we'll provide, for the first time, concrete evidence that the lazy union approach to the union-find data structure is viable. Specifically, we'll prove that the worst-case running time of both the find and the union operation is logarithmic in n, the number of objects stored in the data structure. We are going to do even better later, once we introduce a second optimization known as path compression. But an important stepping stone is to understand why union by rank alone already gets us to a reasonable running time.

So, a quick review of the lazy union approach to implementing the union-find data structure. With each node we're going to maintain a parent pointer. It's no longer the case that we insist the parent pointer point directly to the leader of a group; rather, we just insist that the collection of parent pointers induces a collection of directed trees. The root of each tree, that is, the node which is its own parent, we're going to define as the leader of that group.

So, given an arbitrary object x, how do you implement find? How do you figure out what the leader vertex is? Well, you just traverse parent pointers until you get to the root of that particular group. So for this implementation of the find operation, the worst-case running time is just the longest path of parent pointers you ever have to traverse to get from an object to some root. The way we're going to quantify that is using these ranks. This is, again, a field that we maintain for each object. For now, before we have path compression, we're going to maintain the invariant that the rank of an object x is exactly the largest number of pointers you have to traverse from some leaf to get to x. (This invariant will break down later, once we add path compression.) As a consequence, the biggest rank of any object is the length of the longest path from any leaf to any root, and that's an upper bound on the worst-case running time of the find operation.
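A minimal sketch of this find operation (hypothetical names, not the lecture's own code; parent pointers are stored in a plain `parent` list, where `parent[i] == i` marks a root):

```python
def find(parent, x):
    """Traverse parent pointers from x until reaching a root.

    A root is a node that is its own parent; it is the leader of
    x's group. The cost is the number of hops from x to its root,
    which is at most the maximum rank in the data structure.
    """
    while parent[x] != x:
        x = parent[x]
    return x

# A chain 2 -> 1 -> 0, where 0 is the root (its own parent).
parent = [0, 0, 1]
print(find(parent, 2))  # -> 0
```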

So let's move on to the union operation. Here, given two objects x and y, you need to fuse their two trees, their two groups. So you find the roots of the two trees: you call find on x, and you call find on y. That gives you their two respective roots, and now you install one as a new child of the other. Now, we saw in a quiz in the last video that if you're not careful about which root you install as a child under the other, you can wind up with long chains and be stuck with a linear worst-case time for both find and union. So instead we have the union by rank optimization, which says: we want to keep our trees from getting scraggly, and the way we're going to do that is, when we have a shallow tree and a deep tree, we install the shallow tree's root under the root of the deep one. That prevents the tree from getting even deeper. Now, there is a situation where the two trees have exactly the same depth, that is, where the two roots have exactly the same rank; in that case we just pick arbitrarily. But when we merge two trees that both had a common rank r, it's important that in the new tree the rank has gone up to r+1, so we need to increment the rank of the new root to reflect that increase.
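Putting the pieces together, the lazy union-find with union by rank might be sketched like this. This is a hypothetical minimal implementation (names like `UnionFind`, `parent`, and `rank` are my own), with no path compression yet:

```python
class UnionFind:
    """Lazy union-find: parent pointers form directed trees.

    Only the union-by-rank optimization; no path compression yet.
    """

    def __init__(self, n):
        self.parent = list(range(n))  # each object starts as its own root
        self.rank = [0] * n           # a singleton tree has rank 0

    def find(self, x):
        # Walk parent pointers up to the root (the node that is its own parent).
        while self.parent[x] != x:
            x = self.parent[x]
        return x

    def union(self, x, y):
        s1, s2 = self.find(x), self.find(y)
        if s1 == s2:
            return  # already in the same group
        # Union by rank: install the shallower root under the deeper one.
        if self.rank[s1] > self.rank[s2]:
            s1, s2 = s2, s1           # ensure rank[s1] <= rank[s2]
        self.parent[s1] = s2
        if self.rank[s1] == self.rank[s2]:
            self.rank[s2] += 1        # ranks were tied: new root's rank goes up

uf = UnionFind(4)
uf.union(0, 1)
uf.union(2, 3)
uf.union(0, 2)
print(uf.find(3) == uf.find(0))  # -> True
```

Note that a root's rank only changes in the tied case, matching the invariant discussed above.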

So that's where we've already been. Where are we going next? Well, the plan for this video is to show that, with the union by rank optimization, the maximum rank of any node is always bounded above by log base 2 of n, where n is the number of objects in the data structure. Now, we just said the worst-case running time of find is governed by the maximum rank, so a logarithmic maximum rank means a logarithmic running time for find. That also carries over to the union operation: remember, union is just two finds plus constant work to rewire one pointer, so we get a logarithmic time bound on both operations. So let's see why that's true.

Let's begin the analysis with a few simple but useful properties that follow immediately from our invariant, that is, from the way we change the ranks of objects as we do finds and unions. For the first simple property, focus on your favorite object x, and watch this object's rank change over the lifetime of the data structure as we do finds and unions. How can it change? Well, when we do a find, we don't change anything; all the ranks stay the same. When we do a union, all the ranks stay the same as well, except there is one case in the union where the rank of a single node gets bumped up by 1. So ranks only go up over time, for all of the objects. That's property one.

The second property is again pretty much trivial, but really, really useful. So what is the situation in which somebody's rank gets bumped up by 1? We take the union of two trees whose roots have a common rank, and whichever of the two roots we pick to be the root of the new, bigger tree, that's the object whose rank gets bumped up by 1: the new root of the fused tree. So in particular, the only type of object that can ever get a rank increase is a root. If you're not a root, your rank will not go up. Furthermore, once you're not a root in this data structure, you will never be a root again in the future. There is no process by which you shed your parent: once you have a parent other than yourself, you will always have exactly that parent.

Putting those two observations together, we find that as soon as an object x becomes a non-root, that is, as soon as it has a parent other than itself, its rank is frozen for the rest of time, forevermore. The third and final simple property follows from a formula we mentioned in the last video for computing ranks. Remember, the rank of a node is, in general, one more than the maximum rank of any of its children. So if you have a child, and some path from a leaf to that child takes 5 hops, then the path through that child to you takes 6 hops. As a consequence, as you go from a leaf up to the root, you will see a strictly increasing sequence of ranks: the rank of a parent is always strictly greater than the ranks of all of its children. So that's it for the immediate properties. Let's go on to a property which is a little less immediate.

Still, this next lemma, which I'm going to call the rank lemma, is the best kind of lemma. On the one hand, it's just not that hard to prove; I'll give you a full proof in the following two slides. On the other hand, it's really powerful. It's going to play a crucial role in the analysis we're doing right now, a logarithmic running-time bound with the union by rank optimization, and we'll keep using it as a workhorse once we introduce path compression and prove better bounds on the operations. So what does the rank lemma say? Well, it controls the population size of objects that have a given rank. We want it to apply at all intermediate stages of our data structure, so we're going to consider an arbitrary sequence of unions. You can throw in some finds as well; I don't care. Finds don't change the data structure, so they're totally irrelevant. So, think about a sequence of unions, a sequence of mergers.

The claim is: for every non-negative integer r, the number of objects that have rank exactly r at this time is at most n/2^r, where n is the total number of objects. So, for example, if the rank is 0, it says that at most n objects have rank 0, which is a trivial statement, because there are only n objects. But at any given time, the number of objects that have rank 1 is at most n/2, the number of objects that have rank 2 is at most n/4, and so on.
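As a quick sanity check, the rank lemma can be tested empirically: do a bunch of random unions with the union-by-rank rule and count how many objects end up with each rank. This is a self-contained experiment (all names are hypothetical, and the union-find here is a re-sketch of the structure described above):

```python
import random
from collections import Counter

def run_random_unions(n, num_unions, seed=0):
    """Do random unions with union by rank; return every object's rank."""
    random.seed(seed)
    parent = list(range(n))
    rank = [0] * n

    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x

    for _ in range(num_unions):
        s1, s2 = find(random.randrange(n)), find(random.randrange(n))
        if s1 == s2:
            continue
        if rank[s1] > rank[s2]:
            s1, s2 = s2, s1
        parent[s1] = s2
        if rank[s1] == rank[s2]:
            rank[s2] += 1
    return rank

n = 1024
ranks = run_random_unions(n, 5000)
counts = Counter(ranks)
# Rank lemma: at most n / 2**r objects have rank exactly r.
for r, count in counts.items():
    assert count <= n / 2**r, (r, count)
print("rank lemma holds; rank populations:", dict(sorted(counts.items())))
```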

And if you think about it, once we succeed in proving the rank lemma, we're pretty much done showing the efficacy of the union by rank optimization. In particular, once you take r in this key lemma to be log base 2 of n, it says that there's at most one object that has rank log2(n), and there can't be any objects with rank strictly larger. That is, this lemma implies that the maximum rank at all times is bounded above by log2(n). And remember, the maximum rank is the longest path of pointer traversals you ever need to get from a leaf to a root, and that means the most work we'll ever do in a find, and therefore in a union, is O(log n). Okay, so I've now teased you with the consequences of the rank lemma, assuming that it's true. But why is it true? Let's turn to the proof.

I'm going to break the proof down into two claims, claim 1 and claim 2; we'll see that the two claims easily imply the rank lemma. Claim 1 asks you to consider two objects, x and y, that have exactly the same rank r, and it asserts that the subtrees of these two objects have to be disjoint: they have no objects in common. Here, by the subtree of an object, I just mean the other objects that can reach this one by following a sequence of parent pointers. So the subtree at x is the set of objects from which you can reach x, and the subtree at y is the set of objects from which you can reach y.

The second claim is that if you look at any object that has rank r and you look at its subtree, that is, at the number of objects that can reach this object x by following parent pointers, there have to be a lot of them: there have to be at least 2 raised to the object's rank r objects in its subtree. Notice that if we prove claim 1 and claim 2, then the rank lemma follows easily. Why? Well, fix a value for r; 2, 10, I don't care what. Look at all the nodes that have this rank r. By claim 2, each of them has at least 2^r objects that can reach it, and by claim 1, these have to be disjoint sets of objects. Well, there are only n objects to go around, and if each of these disjoint sets has at least 2^r of them, there can be at most n/2^r such groups. That is, there are at most n/2^r objects with this rank r.
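In symbols, the counting argument above reads:

```latex
\[
  \bigl(\#\text{ of rank-}r\text{ objects}\bigr) \cdot 2^{r}
  \;\le\; n
  \qquad\Longrightarrow\qquad
  \#\text{ of rank-}r\text{ objects} \;\le\; \frac{n}{2^{r}},
\]
```

since the rank-r objects' subtrees are disjoint (claim 1) and each contains at least 2^r of the n objects (claim 2).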

So we've reduced the proof of the rank lemma to proving claims 1 and 2; I will do them in turn. For claim 1, let me argue by contrapositive: that is, I will assume that the conclusion is false and show that the hypothesis must then also be false. So let's assume that we have two objects x and y whose subtrees are not disjoint. That is, there exists an object z from which you can reach x, and from the same object z you can also reach y, by a sequence of parent pointers. Well, now let's use the fact that we're dealing with directed trees: if you start at an object z, there's only a unique parent pointer to follow each time. That is, all of the objects reachable from z form a directed path leading up to the root of z's group. So the only way for both x and y to be reachable from z is for both of them to lie on this path. And if they're both on this path, then one has to be an ancestor of the other. So now we use the third of our simple properties: on every path to the root, ranks strictly go up. Therefore, whichever of x or y is an ancestor of the other has strictly higher rank, so x and y do not have the same rank. That completes the proof of claim 1.

So let's move on to claim 2. Remember, claim 2 asserts that an object of rank r necessarily has 2^r objects or more in its subtree; that's how many objects can reach this object x by following parent pointers. For this proof we're going to proceed by induction on the number of operations, and again, remember that find operations have no effect on the data structure, so we can ignore them. So it's just by induction on the number of union operations that happen.

For the base case, before we've done any unions whatsoever, every object has a rank of 0, and the subtree size of every object is equal to 1, the object itself, also known as 2 to the 0. Now, for the inductive step, there's an easy case and a hard case. The easy case is where nobody's rank changes: we do a union, and everybody's rank stays exactly the same. In this case, we're golden. Why? Well, when you do a union, subtree sizes only go up; there are only more pointers, so there are only more objects that can reach any given object. So subtree sizes go up and ranks stay the same: if we had the inequality that each subtree size is at least 2^r before, we have it equally well now.

So the interesting case is when somebody's rank actually changes. How can that happen? Well, it happens in only one particular way, which we understand well. Look at a union operation between objects x and y, and suppose the roots of their trees are s1 and s2, respectively. It's only when these two roots have the same rank, call that common rank r, that somebody's rank gets changed. In particular, we're going to break ties as we did in the previous video: s2 will be the root of the fused tree, s1 will become a child of it, and in that case s2's rank gets bumped up by 1. It goes from r to r+1.

Now notice that in this case we do have something to prove. What are we trying to establish? That every subtree size is big as a function of the rank. s2's rank has gone up, and therefore the lower bound, the bar we have to meet for its subtree size, has also gone up; it's doubled. So in this case we actually have to scrutinize s2's new subtree. What is its new subtree? Well, it's really just composed of its old subtree, plus s1 and all of s1's subtree, which it inherits. So we know that s2's new subtree size, the number of objects that can reach it, is just its old subtree size plus the old subtree size of s1.

But then we're in good shape, because we have the inductive hypothesis to rescue us. Remember, before this union, by the inductive hypothesis, every object with a given rank r had at least 2^r objects in its subtree. s1 and s2 both had rank r before this union, so both of their subtrees had size at least 2 to the r. So s2's new subtree size is bounded below by 2^r plus 2^r, a quantity also known as 2^(r+1). Quite conveniently, r+1 is s2's new rank, so even with s2's new, bigger rank, its subtree size still meets the lower bound, the target of 2 raised to its new rank, 2^(r+1).
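In symbols, the inductive step's arithmetic is:

```latex
\[
  \operatorname{size}(s_2') \;=\; \operatorname{size}(s_2) + \operatorname{size}(s_1)
  \;\ge\; 2^{r} + 2^{r} \;=\; 2^{r+1},
\]
```

where s2' denotes the tree rooted at s2 after the union, and r+1 is s2's new rank.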

So that completes the inductive step, and therefore the proof of claim 2, that objects of rank r have subtree size at least 2^r. That in turn completes the proof of the rank lemma, that for every rank r there are at most n/2^r nodes of rank r. And remember, the rank lemma implies that the maximum rank at all times is bounded by log base 2 of n, as long as you're using union by rank. And that implies that, with this first optimization, the worst-case running times of union and find are both O(log n), where n is the number of objects in the data structure.
