0:00

Okay, so we did merge sort: we showed how merge sort works, then we analyzed its running time, and we arrived at this recurrence.

Okay, so this tells us that the running time of merge sort for sorting a list of size n is twice the running time of merge sort on a list of size n over 2, plus some overhead for merging the two sorted halves of the list.

This overhead, in the case of merge sort, is linear, okay?

So given two lists of size n over 2, merging them takes time that is linear in the size of the input.
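To make that linear-time merge concrete, here is a minimal Python sketch (the function name and list-based interface are mine, not from the lecture):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list.

    Each element is inspected once, so the running time is
    linear in len(left) + len(right).
    """
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One list is exhausted; the remainder of the other is already sorted.
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```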

And we said that for the base case, when we have a list of size 1, we have a constant amount of work to do.

And as we said, this is not an explicit function that allows us to understand the running time.

I cannot easily plot it, and I cannot easily imagine how fast or slow it is going to grow.

So the question is: how do we go from this to some explicit function?

How do we get from this to the statement that the running time of merge sort is really O of n log n?

So, dealing with recurrences in general: this is a recurrence that we are seeing from merge sort, but recurrences do not come only from merge sort or from this kind of algorithm.

Recurrences are a very broad mathematical tool for describing functions.

Right? So sometimes we describe a function explicitly.

We say f of n is 2n.

But sometimes we describe a function in this kind of form.

We say f of n equals f of n minus 1 plus f of n minus 2.

All right? So we describe the function in a recursive way, as a recurrence, rather than in an explicit form.

But again, as I said, we would like to understand what this function is, so that we understand how fast or slow merge sort is.
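As a small illustration of the difference, here is a Python sketch of both forms (the base cases for the recurrence are my own choice for illustration; the lecture does not specify them):

```python
def f_explicit(n):
    # Explicit form: f(n) = 2n.  Easy to plot, easy to see how fast it grows.
    return 2 * n


def f_recursive(n):
    # Recurrence form: f(n) = f(n-1) + f(n-2).
    # How fast this grows is much less obvious at a glance.
    if n <= 1:
        return 1  # assumed base cases, just for this sketch
    return f_recursive(n - 1) + f_recursive(n - 2)
```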

1:50

Fortunately for us, the divide-and-conquer algorithms that we are going to see have this form of a recurrence, where we take a problem, divide it into two sub-problems, each of equal size n over 2, and we solve the two sub-problems, or a certain number of the sub-problems, plus some time for the merging process.

When we have a recurrence of this form, the nice thing about it is that there is already a mathematical tool that allows us to go from here to an explicit function.

2:27

In a very simple manner, okay?

And that mathematical tool is called the Master Theorem.

And this brings us to the Master Theorem, which is written here.

So the Master Theorem applies to recurrences of this form.

What is this form, and how do we relate it to algorithms?

Again, think about an algorithm that wants to solve a problem of size n.

I divide the problem into b sub-problems, each of size n over b, of equal size, and I solve a of them.

Okay, so a cannot be greater than b, for example, because I'm only dividing the problem into b sub-problems.

I can solve one, two, three, four, or all b of them.

Plus, I have some time, f of n, for merging the solutions of the sub-problems.

In addition, I also have constant time for dealing with the simplest case, the base case.

So for this general form of a recurrence, I have the Master Theorem.

Notice here that I'm saying a is greater than or equal to 1, b is greater than or equal to 2, and c is greater than 0.

These are not something to memorize; these should be values that make sense.

Remember that a is the number of sub-problems that I solve; in the case of merge sort, I'm solving both of the sub-problems.

Here, I'm saying that in general I'm solving a certain number of sub-problems.

What kind of an algorithm, and what kind of a problem, would it be if I'm solving 0 sub-problems?

I divide the problem into two sub-problems, and I solve none of them?

4:25

We need to divide the problem into sub-problems.

If b is not greater than or equal to 2, the only other option is that b equals 1.

I'm taking the big problem, dividing it into one big problem, and I keep dividing it into one big problem.

What type of divide and conquer is that?

4:42

Of course we have to divide it into at least two sub-problems.

Two is not a magic number.

You can divide the problem into four sub-problems.

In merge sort, you could have taken the original list and divided it into four sub-lists of equal size.

You could have taken the original list, divided it into 100 sub-lists, and repeated the same process.

Okay?

But here, b is the number of sub-problems that we divide the original problem into, so it has to be at least 2.

We have to divide the original problem into at least two sub-problems.

Notice that I'm not allowing much room for playing with the sizes of the sub-problems.

I'm saying that if you divide your problem into b sub-problems, all of them should be of almost equal size, n over b.

Of course, in practice, they don't have to be exactly equal.

I illustrated merge sort on a list of size eight, and I chose eight on purpose because eight is a nice number.

It divides into four and four, four divides into two and two, and two divides into one and one.

But if I have a list of size 13, I have to divide it into six and seven, the seven into four and three, and the three into two and one.

So they are not always going to divide into two lists of exactly equal size, but they will be of almost equal size, okay?

So this is why we divide into b sub-problems of equal length, or almost equal length.
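A quick Python sketch of that almost-equal split (the helper name is mine, not from the lecture):

```python
def split_halves(lst):
    """Split a list into two halves whose sizes differ by at most one."""
    mid = len(lst) // 2
    return lst[:mid], lst[mid:]
```

For a list of size 13, this yields halves of size 6 and 7, exactly as in the example above.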

We solve a of them, we have f of n for the merging, and we have constant time for dealing with the simplest case of the problem.

The Master Theorem applies to this type of recurrence, where the time to do the merging is polynomial in n; that includes the constant case, because if d equals 0 this is O of 1, if d equals 1 this is linear, if d equals 2 this is quadratic, and so on.

The Master Theorem says: if you have a recurrence of this form,

6:44

Then we can say that T of n is one of these three explicit functions: T of n is either O of n to the d, or O of n to the d times log n, or O of n to the power log of a to the base b, depending on these three conditions.

So you have the values of a, b, and d.

Right?

We have the values of a, b, and d.

We can check whether a is smaller than b to the d, equal to it, or greater than it.

Depending on which condition holds, you choose the corresponding function from here.
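The case analysis just described can be captured in a tiny helper (a sketch, assuming the recurrence T(n) = a T(n/b) + O(n^d) with a >= 1, b >= 2, d >= 0; the function name is mine):

```python
import math


def master_theorem(a, b, d):
    """Return the asymptotic class of T(n) = a*T(n/b) + O(n^d),
    by comparing the two numbers a and b**d."""
    if a < b ** d:
        return f"O(n^{d})"
    elif a == b ** d:
        return f"O(n^{d} log n)"
    else:
        # Exponent is log of a to the base b.
        return f"O(n^{math.log(a, b):g})"
```

For merge sort, a = 2, b = 2, d = 1, so a equals b to the d and the middle case applies.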

And this is the Master Theorem.

Of course it's not an axiom of life that this holds.

It is not a magic result; there is a proof for it.

But the nice thing about having it with a proof is that once it is proved correct, we can use it as a tool, so that we simplify our lives and make things much easier by taking a recurrence like this and, in a very straightforward manner, getting the actual function by just comparing two numbers: a and b to the d.

Now let's look at the Master Theorem and at the recurrence that we got from merge sort.

First of all, we have to make sure that the recurrence we got is amenable to the Master Theorem, okay?

Which means: does the recurrence we got have this form?

The answer is yes.

We have a equal to 2, b equal to 2, f of n is O of n, and c is basically 1.

Is f of n of the form n to the d?

The answer is yes: in the case of merge sort, f of n is n to the first power, so d is 1, and a equals b to the d, since 2 equals 2 to the power 1.

9:23

And this is why merge sort takes n log n time, okay?

So in this example of merge sort, we have seen divide and conquer: how we can take a problem, divide it into sub-problems, conquer each one of them, and solve the big problem.

We saw that such an algorithm is naturally implemented in a recursive manner, so that when you divide the problem into sub-problems, you recursively call the original function on the two sub-problems.
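Putting those pieces together, a minimal recursive merge sort in Python might look like this (a sketch, not the lecture's exact code):

```python
def merge_sort(lst):
    """T(n) = 2*T(n/2) + O(n): divide into two halves (b = 2),
    recursively sort both of them (a = 2), merge in linear time (d = 1)."""
    if len(lst) <= 1:              # base case: constant work
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])   # first recursive call
    right = merge_sort(lst[mid:])  # second recursive call
    # Linear-time merge of the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```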

We saw that for recursive algorithms, when we look at the running time, what naturally arises is a recurrence.

And the fourth thing we saw is that for the type of recurrence we are seeing, instead of dealing with the full-blown power and complexity of recurrences and making our lives hard, we have a special form of recurrence that is amenable to the application of a very nice result called the Master Theorem.

Without much thought, I can just compare two values, a and b to the d, and get the actual function.

So I go from merge sort to this recurrence, and from this recurrence, through the Master Theorem, to this result: merge sort takes O of n log n.

Remember that our naive algorithm, which takes elements and copies them to a different list, is O of n squared.

O of n log n is better than O of n squared.

Merge sort is O of n log n.

10:45

The question is: can we still do better than this?

Notice what merge sort does: it compares elements, right?

When we are doing the merging, it's comparing elements.

So it falls into a category of sorting algorithms called comparison-based algorithms: algorithms that, when sorting, have to compare pairs of elements, okay?

One of the powerful mathematical results in the area of algorithms, and in the area of sorting, is that this is the best we can do if we employ a comparison-based sorting approach.

So look at any algorithm for sorting a list of size n, without putting any assumptions or constraints on the elements in the array.

11:31

If it's doing comparisons, the sorting algorithm cannot do better than n log n; if you see a sorting algorithm in the literature whose running time is linear, that is, O of n, then it cannot be a comparison-based algorithm.

Okay? Or it can be wrong, of course.

But if we assume it is correct, it cannot be a comparison-based algorithm, okay?

So consider comparison-based algorithms, algorithms that sort by comparing, like the first one we showed, which was O of n squared, where we always [COUGH] went through the list comparing elements until we got the minimum.

That was n squared.

Merge sort is O of n log n.

We cannot do better than this, as long as we are sticking to comparison-based sorting algorithms.

Okay?

So merge sort takes O of n log n time.

And the Master Theorem allowed us to go from the recurrence to this explicit function, whose growth we understand.

And it grows much better than O of n squared.
