Calculus Two: Sequences and Series is an introduction to sequences, infinite series, convergence tests, and Taylor series. The course emphasizes not just getting answers, but asking the question "why is this true?"


From the course by The Ohio State University

Calculus Two: Sequences and Series



From the lesson

Taylor Series

In this last module, we introduce Taylor series. Instead of starting with a power series and finding a nice description of the function it represents, we will start with a function, and try to find a power series for it. There is no guarantee of success! But incredibly, many of our favorite functions will have power series representations. Sometimes dreams come true. Like many dreams, much will be left unsaid. I hope this brief introduction to Taylor series whets your appetite to learn more calculus.

- Jim Fowler, PhD, Professor

Mathematics

Analyticity.


Suppose f is a function, and suppose that I can compute the derivative of f at 0, the second derivative of f at 0, the third derivative of f at 0, the fourth derivative of f at 0, and so on.

Suppose that I can compute the nth derivative of f at 0, regardless of what n is. So in short, suppose that f is infinitely differentiable at 0, meaning that although it might be really hard, at least in principle I could compute the millionth derivative at 0, the billionth derivative at 0; I can compute any higher derivative that I want at the point 0.

Well, then the power series for f around 0 is the sum, as n goes from 0 to infinity, of the nth derivative of f at 0, divided by n factorial, times x to the n. This is a thing that makes sense, because f is infinitely differentiable. But what I don't know is whether that power series converges to f when x is near but not equal to zero.
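As a concrete sketch of that partial-sum computation (not from the lecture; the function names here are illustrative), consider f = sin, whose derivatives at 0 cycle through 0, 1, 0, -1:

```python
import math

def taylor_partial_sum(nth_deriv_at_0, x, N):
    """Partial sum of the Taylor series at 0:
    sum over n = 0..N of f^(n)(0) / n! * x^n."""
    return sum(nth_deriv_at_0(n) / math.factorial(n) * x**n
               for n in range(N + 1))

# For f = sin, the nth derivative at 0 cycles through 0, 1, 0, -1.
def sin_deriv_at_0(n):
    return [0.0, 1.0, 0.0, -1.0][n % 4]

# The partial sums get very close to sin(0.5) as N grows.
approx = taylor_partial_sum(sin_deriv_at_0, 0.5, 10)
```

For sin, the partial sums really do converge to the function; the lecture's question is exactly whether this happens for a general infinitely differentiable f.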

What I'm asking is whether this Taylor series is actually equal to the function, right? What's the relationship between this Taylor series and the original function that I'm studying?

Sometimes it happens that the function is equal to its Taylor series, so let's give a name to that phenomenon. Here's how we're going to talk about this.

The function f, which I'm assuming to be infinitely differentiable, is said to be real analytic at the point zero if there's some positive number big R so that the function is equal to its Taylor series, maybe not everywhere, but at least when x is within big R of zero.

I can make this a bit more general. Let's think about what this is really talking about, right? What this definition is trying to get at is the idea that the function is equal to its Taylor series around zero. That's what real analytic at zero means: it means the function has a power series representation around zero. We can generalize this and talk about power series around other points.

Alright.

So here's the definition of what it means for a function to be real analytic, not just at zero, but real analytic at some arbitrary point a. Well, it should still mean that there's some big R, which will now measure how far x is from a, instead of from zero.

And here I've written down the Taylor series around zero, but I should write down the Taylor series around a. So instead of differentiating at zero, I'll differentiate f n times and evaluate at a, and then I'll multiply that not just by x, but by x minus a to the nth power. And then I don't want to say that x is close to zero; I want to say that x is within big R of a. So this is saying that at least near a, when you're within big R of a, f can be written as a power series. And that's what it means to say that f is real analytic at a point a.
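The same sketch adapts to an arbitrary center a (again illustrative, not from the lecture). For f = exp, every derivative at a = 1 equals e, which makes the series around a easy to write down:

```python
import math

def taylor_partial_sum_at_a(nth_deriv_at_a, a, x, N):
    """Partial sum of the Taylor series at a:
    sum over n = 0..N of f^(n)(a) / n! * (x - a)^n."""
    return sum(nth_deriv_at_a(n) / math.factorial(n) * (x - a)**n
               for n in range(N + 1))

# Every derivative of exp at a = 1 equals e.
a = 1.0
approx = taylor_partial_sum_at_a(lambda n: math.e, a, 1.5, 12)
```

Here the only change from the series at zero is differentiating and evaluating at a, and replacing x to the n with (x minus a) to the n, exactly as described above.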

Let me draw a diagram to try to convey, you know, what is going on here. Let's suppose that this piece of paper represents all of the functions from the real numbers to the real numbers. And plenty of those functions are just miserable functions. Here's a graph of a terrible-looking function that's got a lot of discontinuities.

Some of these functions, though, are continuous. So within the collection of all functions, I've got the continuous functions. Here's a graph of what looks to be a continuous function. Still not great, because it's still got these spikes, but at least it's continuous. Now, some of these continuous functions are differentiable. A function that is differentiable is necessarily continuous. So within the collection of all continuous functions is a smaller collection of just the differentiable functions; they don't have these terrible spikes.

But just because you're differentiable doesn't mean that you're, say, twice differentiable or three times differentiable. So within the collection of all the differentiable functions, there's an even smaller collection of functions which are infinitely differentiable. These are functions that I can differentiate once, twice, three times; functions I can differentiate as many times as I like. Sometimes people call these functions smooth functions or C-infinity functions.

This is a very restrictive class of functions. But there's an even more restrictive class of functions. Within the smooth functions, there's a smaller collection: the real analytic functions. These functions aren't just smooth, right? It's not just that I can differentiate these functions. These functions also have the property that if I write down their Taylor series around some point, that Taylor series converges near that point to the function. So these real analytic functions are really quite special.

I mean, not every function is real analytic. And yet the surprise is that so many of the functions that we care the most about turn out to be real analytic.
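The lecture doesn't exhibit a smooth function that fails to be real analytic, but a standard counterexample from analysis (added here as an aside, not from the transcript) is the function that is exp(-1/x^2) away from zero and 0 at zero. It is a standard fact that every derivative of this function at 0 is 0, so its Taylor series at 0 is identically zero, yet the function itself is strictly positive away from 0: the series converges everywhere, just not to the function.

```python
import math

def flat_at_zero(x):
    # Smooth everywhere, but its Taylor series at 0 is the zero series,
    # since every derivative at 0 vanishes (a standard analysis fact).
    return 0.0 if x == 0.0 else math.exp(-1.0 / x**2)

# The Taylor series at 0 predicts 0 everywhere, yet the function
# is strictly positive (if astonishingly tiny) at x = 0.1.
value = flat_at_zero(0.1)
```

So smooth does not imply real analytic, which is why the diagram above draws the real analytic functions as a strictly smaller collection inside the smooth ones.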

