
In the following series of videos, we'll give a formal treatment of asymptotic

notation, in particular big-Oh notation, as well as work through a number of examples.

Big-Oh notation concerns functions defined on the positive integers; we'll call such a function T(n).

We'll pretty much always have the same semantics for T(n): we're going to be concerned with the worst-case running time of an algorithm, as a function of the input size n. So the question I want to answer for you in the rest of this video is: what does it mean when we say a function T(n) is big-Oh of f(n)? Here, f(n) is some basic function, like, for example, n log n. So I'll give you a number of answers, a number of ways to think about what big-Oh notation really means. For starters, let's begin with an English definition. What does it mean for

a function to be big-Oh of f(n)? It means eventually, for all sufficiently large

values of n, it's bounded above by a constant multiple of f(n). Let's think

about it in a couple of other ways. So next I'm going to translate this English definition into a picture, and then I'll translate it into formal mathematics. So

pictorially, you can imagine that perhaps we have T(n) denoted by this blue function here. And perhaps f(n) is denoted by this green function here, which

lies below T(n). But when we double f(n), we get a function that eventually

crosses T(n) and forevermore is larger than it. So in this event, we would say

that T(n) is indeed big-Oh of f(n), the reason being that for all sufficiently large n, once we go far enough out to the right on this graph, a constant multiple of f(n), namely twice f(n), is an upper bound on T(n).
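The picture can be spot-checked numerically. Here's a small Python sketch with a hypothetical pair of functions of my own choosing, not from the lecture: T(n) = n + 100 and f(n) = n. Here f(n) itself lies below T(n) for every n, yet twice f(n) catches up at n = 100 and stays above it from then on.

```python
# Hypothetical functions chosen to match the picture (not from the lecture):
# f(n) lies below T(n) everywhere, but 2*f(n) eventually crosses T(n).

def T(n):
    return n + 100   # the "blue" function

def f(n):
    return n         # the "green" function

# Find the crossing point: the first n where 2*f(n) >= T(n).
crossing = next(n for n in range(1, 10_000) if 2 * f(n) >= T(n))
print(crossing)                                                # prints 100

# Beyond the crossing, 2*f(n) stays above T(n) forevermore
# (here checked only on a finite range).
print(all(2 * f(n) >= T(n) for n in range(crossing, 10_000)))  # prints True
```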

So finally, let me give you an actual mathematical definition that you could use

to do formal proofs. So how do we say, in mathematics, that eventually it should be

bounded above by a constant multiple of f(n)? We say that there exist two constants, which I'll call c and n0, such that T(n) is no more than c times f(n) for all n that exceed or equal n0. So the role of these two constants is to

quantify what we mean by a constant multiple, and what we mean by sufficiently

large, in the English definition. c obviously quantifies the constant multiple

of f(n), and n0 quantifies sufficiently large; that's the threshold beyond which we insist that c times f(n) is an upper bound on T(n). So, going

back to the picture, what are c and n0? Well, c, of course, is just going to be two, and n0 is the crossing point. So we look at where twice f(n) and T(n) cross, and then we drop down to the horizontal axis; that would be the relevant value of n0 in this picture. So that's the formal definition: the way to prove that something is big-Oh of f(n) is to exhibit these two constants, c and n0, and it had better be the case that for all n at least n0, c times f(n) upper-bounds T(n).
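As a concrete, made-up instance of the definition: T(n) = 3n + 10 is big-Oh of n, with witnesses c = 4 and n0 = 10, since 3n + 10 <= 4n exactly when n >= 10. Here's a sketch that spot-checks the inequality (a finite check, not a substitute for a proof):

```python
# Hypothetical example (not from the lecture): T(n) = 3n + 10, f(n) = n.
# Claimed witnesses for T(n) being big-Oh of f(n): c = 4, n0 = 10,
# since 3n + 10 <= 4n exactly when n >= 10.

def T(n):
    return 3 * n + 10

def f(n):
    return n

c, n0 = 4, 10

# Spot-check T(n) <= c * f(n) on a finite range beyond n0.
# (The formal definition requires it for ALL n >= n0.)
print(all(T(n) <= c * f(n) for n in range(n0, 100_000)))   # prints True
```

Note that the check starts at n0: below the threshold the inequality is allowed to fail, and here it does (for example T(5) = 25 > 20 = c * f(5)).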

One way to think about it, if you're trying to establish that something is big-Oh of some function, is that it's like you're playing a game against an opponent: you want to prove that this inequality holds, and your opponent wants to show that it doesn't hold for sufficiently large n. You have to go first: your job is to pick a strategy in the form of a constant c and a constant n0, and your opponent is then allowed to pick any number n larger than n0. So the function is big-Oh of f(n) if and only if you have a winning strategy in this game, that is, if you can, up front, commit to constants c and n0 so that no matter how big an n your opponent picks, this inequality holds.
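To make the game concrete, here's a small Python sketch of the opponent's side, using a hypothetical example of my own: T(n) = n squared and f(n) = n. Here T(n) is not big-Oh of f(n), because whatever c and n0 the prover commits to, the opponent can pick n = max(n0, c + 1) and flip the inequality.

```python
# Hypothetical example (not from the lecture): T(n) = n^2, f(n) = n.
# T(n) is NOT big-Oh of f(n): whatever constants (c, n0) the prover
# commits to, the opponent wins by picking n = max(n0, c + 1),
# since then n > c and so T(n) = n * n > c * n = c * f(n).

def opponent_wins(c, n0):
    n = max(n0, c + 1)     # the opponent's counter-move
    return n * n > c * n   # True when the inequality is flipped

# Spot-check the opponent's strategy against many prover choices.
print(all(opponent_wins(c, n0)
          for c in range(1, 50)
          for n0 in range(1, 50)))   # prints True
```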

If you have no winning strategy, then it's not big-Oh of f(n): no matter what c and n0 you choose, your opponent can always flip this inequality by choosing a suitably large value of n. I want to emphasize one last thing, which is

that these constants, well, what do I mean by constants? I mean they are independent of n. And so, when you apply this definition and you choose your constants c and n0, it had better be that n does not appear anywhere in them. So c should just be something like a thousand or a million, some constant independent of n. So those are a

bunch of ways to think about big-Oh notation. In English: T(n) is eventually bounded above by a constant multiple of f(n) for all sufficiently large n. I've shown you how to translate that into mathematics, given you a pictorial representation, and also a sort of game-theoretic way to think about it. Now, let's move on to a video that explores a

number of examples.