Taylor Polynomials

Suppose we have a function $f$ with $n$ derivatives. How could we approximately match it?

Well, one potential way of doing this might be by matching derivatives at a point $x_0$! If we can capture $f$'s entire state at a point, we may be able to approximate the entire function, or at least approximate it at nearby points.

We say two functions $f$ and $g$ have contact of order $n$ at $x_0$ if
$$f^{(k)}(x_0) = g^{(k)}(x_0) \quad \text{for all } k = 0, 1, \dots, n.$$

Example: Contact of Order

Consider, for instance, $f(x) = \sin(x)$ and $g(x) = x$. These two functions have contact of order $2$ at $x_0 = 0$, as
$$f(0) = g(0) = 0, \quad f'(0) = g'(0) = 1, \quad f''(0) = g''(0) = 0, \quad \text{but } f'''(0) = -1 \ne 0 = g'''(0).$$
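As a sanity check, here is a small numerical sketch of this example (assuming the illustrative choice $f(x) = \sin(x)$, $g(x) = x$ above): it approximates the first few derivatives of both functions at $0$ with central differences and confirms they agree up to order $2$ but not at order $3$.

```python
import math

def nth_derivative(fn, x0, n, h=1e-2):
    """Approximate the n-th derivative of fn at x0 via central differences."""
    total = 0.0
    for k in range(n + 1):
        total += (-1) ** k * math.comb(n, k) * fn(x0 + (n / 2 - k) * h)
    return total / h ** n

f = math.sin
g = lambda x: x

for n in range(4):
    df = nth_derivative(f, 0.0, n)
    dg = nth_derivative(g, 0.0, n)
    print(f"order {n}: f^({n})(0) ~ {df:+.4f}, g^({n})(0) ~ {dg:+.4f}")
# Orders 0, 1, 2 agree (values 0, 1, 0), while order 3 differs (-1 vs 0),
# so the contact is of order exactly 2.
```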

With Taylor Polynomials, we are trying to give $f$ and a polynomial $P_n$ the highest order of contact we can! This has a ton of practical applications in the real world.

Notation: Kronecker Delta

For convenience, we will define the notation
$$\delta_{jk} = \begin{cases} 1 & j = k, \\ 0 & j \ne k. \end{cases}$$
This is handy because $\dfrac{d^j}{dx^j}(x - x_0)^k \Big|_{x = x_0} = k!\, \delta_{jk}$, which makes the Taylor coefficients easy to read off.

Theorem: Taylor Polynomial

Let $I \subseteq \mathbb{R}$ be an open interval containing $x_0$, and fix $n \in \mathbb{N}$.

Then, if $f : I \to \mathbb{R}$ is $n$-times differentiable at $x_0$, there exists a unique polynomial of degree at most $n$, the Taylor polynomial
$$P_n(x) = \sum_{k=0}^{n} \frac{f^{(k)}(x_0)}{k!} (x - x_0)^k,$$
such that $P_n$ and $f$ have contact of order $n$ at $x_0$.
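A minimal sketch of this construction, taking $f = \exp$ at $x_0 = 0$ as an assumed example (every derivative of $\exp$ is $\exp$ itself, so the coefficients are easy to write down):

```python
import math

def taylor_poly(derivs_at_x0, x0):
    """Return P_n(x) = sum_k f^(k)(x0) / k! * (x - x0)^k from a list of derivatives."""
    def P(x):
        return sum(d / math.factorial(k) * (x - x0) ** k
                   for k, d in enumerate(derivs_at_x0))
    return P

x0, n = 0.0, 5
derivs = [math.exp(x0)] * (n + 1)   # f^(k)(x0) = e^{x0} for every k when f = exp
P = taylor_poly(derivs, x0)

for x in (0.1, 0.5, 1.0, 2.0):
    print(f"x = {x}: exp(x) = {math.exp(x):.6f}, P_5(x) = {P(x):.6f}")
# Excellent agreement near x0 = 0; the gap widens as x moves away.
```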

What if we extended this for $n \to \infty$? That is, given $f \in C^\infty$, is it true that
$$f(x) = \sum_{k=0}^{\infty} \frac{f^{(k)}(x_0)}{k!} (x - x_0)^k?$$

This is possible, but only in some cases. This is where the remainder theorem comes in!

For any $n$-times differentiable function $f$, we can write it as
$$f(x) = P_n(x) + R_n(x),$$
where $R_n$ is its remainder term. The following theorem gives an explicit form for this remainder term.

Theorem: Lagrange Remainder Theorem

Fix $n \in \mathbb{N}$, and let $x_0 \in I$. Let $f$ be $(n+1)$-times differentiable on $I$. Then, for every $x \in I$, there exists $\xi$ between $x_0$ and $x$ such that
$$R_n(x) = f(x) - P_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} (x - x_0)^{n+1}.$$

Note that $\xi$ depends on $x$ and $n$. Thus, changing $x$ or $n$ may give us a different $\xi$.
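As a concrete sketch (again taking $f = \exp$ at $x_0 = 0$ as an assumed example), every derivative of $\exp$ is $\exp$ itself, so the Lagrange form bounds the remainder by $e^{|x|} |x|^{n+1} / (n+1)!$; the true truncation error indeed sits below this bound.

```python
import math

def taylor_exp(x, n):
    """Degree-n Taylor polynomial of exp centered at 0."""
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x = 2.0
for n in (2, 5, 10, 15):
    actual = abs(math.exp(x) - taylor_exp(x, n))
    # Lagrange bound: |f^(n+1)(xi)| <= e^{|x|} for xi between 0 and x.
    bound = math.exp(abs(x)) * abs(x) ** (n + 1) / math.factorial(n + 1)
    print(f"n = {n:2d}: |R_n(2)| = {actual:.3e}  <=  {bound:.3e}")
```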

Then, for $f$ to equal its Taylor series, we need this remainder to drop to $0$ as $n \to \infty$!

When does this remainder term converge to 0?

Notice that in the denominator, we have a factorial $(n+1)!$, whereas in the numerator we have a power $(x - x_0)^{n+1}$.
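A quick numerical illustration of this tug-of-war, with $R = 20$ as an arbitrary choice: once $n$ passes $R$, the factorial in the denominator wins decisively.

```python
import math

R = 20.0
for n in (5, 10, 20, 40, 80):
    print(f"n = {n:2d}: R^n / n! = {R ** n / math.factorial(n):.3e}")
# The ratio grows at first (while n < R) and then collapses toward 0.
```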

Theorem: Factorials vs Powers

For any fixed $R > 0$,
$$\lim_{n \to \infty} \frac{R^n}{n!} = 0.$$

As seen above, factorials grow faster than powers, so for convergence we need our numerator term to grow like a power or slower! So, if we assume our function's derivatives grow no faster than a power function,
$$|f^{(n)}(x)| \le C R^n \quad \forall x, \forall n,$$
for some $C, R > 0$, then we obtain convergence of our remainder term.

This is given in the below theorem.

Theorem: Convergence of Taylor Polynomial

Let $I$ be an open interval containing $x_0$, and let $f \in C^\infty(I)$. Suppose there exist constants $C, R > 0$ such that
$$|f^{(n)}(x)| \le C R^n \quad \text{for all } x \in I \text{ and all } n \in \mathbb{N}.$$

Note that this condition is sufficient, but not necessary, for convergence.

Then, $f$ is equal to its Taylor series on $I$.

Example: Convergence of Taylor Polynomial

Take, for instance, $f(x) = \sin(x)$ (the same reasoning works for $\cos$). Convergence occurs because every derivative of $\sin$ is bounded between $-1$ and $1$. In other words,
$$|f^{(n)}(x)| \le C R^n \quad \forall x \in \mathbb{R},\ \forall n \in \mathbb{N},$$
where $C = 1$, $R = 1$.
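A sketch of this convergence in practice: the worst-case error of the Taylor polynomials of $\sin$ (centered at $0$) over $[-\pi, \pi]$ drops rapidly with the degree.

```python
import math

def taylor_sin(x, n):
    """Taylor polynomial of sin at 0, truncated at degree n (odd terms only)."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n // 2 + 1) if 2 * k + 1 <= n)

xs = [-math.pi + 2 * math.pi * i / 200 for i in range(201)]
for n in (1, 3, 5, 9, 13):
    err = max(abs(math.sin(x) - taylor_sin(x, n)) for x in xs)
    print(f"degree {n:2d}: max error on [-pi, pi] ~ {err:.2e}")
```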

Analytic Functions

We say $f$ is (real-)analytic if, for every $x_0$ in its domain, there exist coefficients $(a_k)_{k \ge 0}$ and some $\varepsilon > 0$ such that
$$f(x) = \sum_{k=0}^{\infty} a_k (x - x_0)^k$$
for all $x$ with $|x - x_0| < \varepsilon$. In other words, at every point in its domain, the function has a localized power series expansion!

Note that if $f$ is analytic, then it is smooth ($C^\infty$). However, the converse of this is false! See the below example.

Example: Analytic Function Counterexample

Consider the bump function (mollifier)
$$\varphi(x) = \begin{cases} C\, e^{-1 / (1 - x^2)} & |x| < 1, \\ 0 & |x| \ge 1, \end{cases}$$
where $C$ is chosen to make the integral $\int_{\mathbb{R}} \varphi(x)\, dx$ equal to $1$.

This is a smooth function, but it is not analytic: at $x_0 = \pm 1$, every derivative of $\varphi$ vanishes, so its Taylor series there is identically $0$ and cannot agree with $\varphi$ on any neighborhood of $\pm 1$!
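A small sketch of why analyticity fails at the edge of the bump (taking the normalizing constant to be $1$ for simplicity, which only rescales the values): the Taylor series centered at $x_0 = 1$ is identically $0$, yet $\varphi$ itself is strictly positive just inside $|x| < 1$.

```python
import math

def phi(x):
    """Un-normalized bump function: exp(-1 / (1 - x^2)) inside (-1, 1), else 0."""
    return math.exp(-1.0 / (1.0 - x * x)) if abs(x) < 1 else 0.0

for x in (0.0, 0.9, 0.99, 0.999, 1.0, 1.001):
    print(f"phi({x}) = {phi(x):.3e}   (Taylor series at x0 = 1 predicts 0)")
# phi flattens out so fast approaching x = 1 that all of its derivatives
# vanish there, which is exactly why the zero Taylor series cannot recover it.
```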

Theorem: Weierstrass Approximation Theorem

Let $f \in C([a, b])$ be a continuous function on a compact interval. Then, for every $\varepsilon > 0$, there exists a polynomial $p$ of some degree $N$ such that, for all $x \in [a, b]$,
$$|f(x) - p(x)| < \varepsilon.$$

Note that $N$ can be a very high degree!
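One classical construction behind this theorem uses Bernstein polynomials on $[0, 1]$; the sketch below (with $f(x) = |x - 1/2|$ as an assumed target, continuous but not smooth) shows the uniform error shrinking, but only slowly, matching the remark about high degrees.

```python
import math

def bernstein(f, n, x):
    """Degree-n Bernstein polynomial of f on [0, 1], evaluated at x."""
    return sum(f(k / n) * math.comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

f = lambda x: abs(x - 0.5)
xs = [i / 400 for i in range(401)]
for n in (10, 50, 200):
    err = max(abs(f(x) - bernstein(f, n, x)) for x in xs)
    print(f"n = {n:3d}: max |f - B_n(f)| on [0, 1] ~ {err:.4f}")
```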

This theorem is very strong, but it needs all of its hypotheses! See the below counterexample.

Example

Suppose the domain is not compact. Then, we can form a counterexample to this theorem with, for instance, $f(x) = e^{-x^2}$ on $\mathbb{R}$!

Suppose some polynomial $p$ satisfied $|f(x) - p(x)| < \varepsilon$ for all $x \in \mathbb{R}$. Any non-constant polynomial eventually goes to $+\infty$ or $-\infty$, so it cannot stay within $\varepsilon$ of the bounded function $f$; hence $p$ would have to be a constant $c$.

In other words,
$$|p(x)| \le |f(x)| + \varepsilon \quad \forall x \in \mathbb{R}.$$

Since $f(x) \to 0$ as $|x| \to \infty$, this forces $c$ to be within $\varepsilon$ of $0$, while $|c - f(0)| = |c - 1| < \varepsilon$ forces $c$ to be within $\varepsilon$ of $1$. For $\varepsilon < 1/2$, this is impossible, so no polynomial can approximate $f$ uniformly on all of $\mathbb{R}$!
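To see the mechanism numerically (still with the assumed target $f(x) = e^{-x^2}$), take a non-constant polynomial that fits well near $0$, say the degree-$2$ Taylor polynomial $1 - x^2$ of $f$ at $0$: the error is tiny near the origin but blows up as $|x|$ grows, so no fixed $\varepsilon$ can work on all of $\mathbb{R}$.

```python
import math

def f(x):
    return math.exp(-x * x)

# A non-constant polynomial that fits f well near 0: its degree-2 Taylor
# polynomial at 0, p(x) = 1 - x^2.
p = lambda x: 1 - x * x

for x in (0.0, 0.5, 1.0, 3.0, 10.0):
    print(f"x = {x:5.1f}: |f(x) - p(x)| = {abs(f(x) - p(x)):.3e}")
# The gap is small near the origin but grows without bound as |x| increases,
# illustrating why no single polynomial can approximate f uniformly on R.
```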