
The origin of my question is the formula for the error of a Taylor polynomial: if $P_n(x)$ is the $n$th-order Taylor polynomial of $f(x)$ centered at $c$, then

$f(x) - P_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-c)^{n+1}. $

A natural assertion is that the error decreases as we increase $n$, since $(n+1)!$ gets larger and we divide by it. Setting aside the fact that if $|x-c|$ is small then $(x-c)^{n+1}$ is very small as well, can we be sure that the derivative of $f$ won't grow even faster? That is, I'd just like to consider

$\lim_{n\rightarrow \infty}\frac{f^{(n+1)}(\xi)}{(n+1)!} $

I tried some calculations with

$ f(x) = e^{e^x}$

It certainly seems that $f^{(n)}(x) \ge (e^x)^n e^{e^x}$, so I look at, let's say, the term from the $(n-1)$st Taylor polynomial (so we have $n$ rather than $n+1$), holding $x$ fixed. (Also note that $x$ here should be $\xi$, but $\xi$ was almost unreadable in $e^{e^\xi}$.)

$\lim_{n\rightarrow \infty}\frac{ (e^x)^ne^{e^x}}{n!} $

Applying L'Hôpital's rule and using the Gamma function for derivatives of factorials (as in the question Derivative of a factorial), it looks like

$\lim_{n\rightarrow \infty}\frac{ n(e^x)^{n-1}e^{e^x}}{(n-1)!C_1} $

where $C_1$ is that constant $(-\gamma + H_n)$ as in the referenced question. So it seems like the numerator will win out as we keep taking derivatives.
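As a numerical sanity check (not part of the argument above; the choice $x = 1$ is arbitrary), one can tabulate $(e^x)^n e^{e^x}/n!$ for increasing $n$:

```python
import math

# Tabulate (e^x)^n * e^(e^x) / n! at a fixed point (x = 1, an arbitrary choice)
x = 1.0
a = math.exp(x)            # e^x is a constant once x is fixed
c = math.exp(math.exp(x))  # e^(e^x)
ns = (5, 10, 20, 30)
vals = [a**n * c / math.factorial(n) for n in ns]
for n, v in zip(ns, vals):
    print(n, v)
```

(Of course $\xi$ in the remainder varies with $n$, so a tabulation at a fixed point is only suggestive.)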

Is this reasoning right? I'm partly concerned because my intuition is that a Taylor polynomial should nicely approximate a smooth $C^\infty$ function like $e^{e^x}$, but this seems like a counterexample, though admittedly I am ignoring the $(x-c)^n$ factor.

Fractal20
  • Maybe I am missing something, but if (as in your example) $f$ extends to an entire function, then by elementary theory of functions of a complex variable, you have $f(z) = \sum_{n=0}^\infty \frac{f^{(n)}(x_0)}{n!} (z-x_0)^n$ for all $z\in\Bbb{C}$. Since the series converges for all $z$, we get in particular (take $z$ so that $|z-x_0|=1$) $f^{(n)}(x_0)/n! \to 0$ as $n\to \infty$ for all $x_0$. See also this https://math.stackexchange.com/questions/114349/how-is-cauchys-estimate-derived – PhoemueX Feb 02 '18 at 21:41
  • In general this is exactly what happens for smooth non-analytic functions. Mere smoothness without analyticity allows you to arbitrarily apply Taylor's theorem...but it can happen that the resulting error bound deteriorates as you proceed to higher orders. For example, the Maclaurin series of $f(x)=e^{-1/x}$ for $x>0$ and $f(x)=0$ otherwise is just zero. Taylor's theorem still holds for approximating $f(1)$ using this series, but it gives useless bounds because of huge values of the high order derivatives. – Ian Feb 02 '18 at 21:47
  • Ah, I guess that is what I was looking for. Looking up real analytic function on wikipedia gives a definition that is pretty much the Taylor Series if the constants were replaced with the derivatives. Great, thanks to both of you. Either of you want to put it in an answer so I can accept it? – Fractal20 Feb 02 '18 at 22:20

2 Answers


In general this is exactly what happens for smooth non-analytic functions. Mere smoothness without analyticity allows you to apply Taylor's theorem at any order, but it can happen that the resulting error bound deteriorates as you proceed to higher orders. For example, the Maclaurin series of

$$f(x)=\begin{cases} e^{-1/x} & x>0 \\ 0 & x \leq 0 \end{cases}$$

is just zero. Taylor's theorem still holds for approximating $f(1)=e^{-1}$ using this series, but it gives useless bounds because of the huge values of the high-order derivatives.
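To see the blow-up of the derivatives concretely, here is a minimal numerical sketch (not part of the answer). It uses the standard fact that $f^{(n)}(x) = e^{-1/x} P_n(1/x)$ for $x > 0$, where the polynomials satisfy $P_{n+1}(u) = u^2\left(P_n(u) - P_n'(u)\right)$ with $P_0 = 1$:

```python
import math

def deriv_poly(n):
    """Coefficients of P_n(u), where f^(n)(x) = exp(-1/x) * P_n(1/x) for x > 0,
    built from the recurrence P_{n+1}(u) = u^2 * (P_n(u) - P_n'(u)), P_0 = 1."""
    p = {0: 1.0}  # maps exponent of u -> coefficient
    for _ in range(n):
        q = {}
        for e, c in p.items():
            q[e + 2] = q.get(e + 2, 0.0) + c          # u^2 * P_n term
            if e > 0:
                q[e + 1] = q.get(e + 1, 0.0) - e * c  # -u^2 * P_n' term
        p = q
    return p

def f_deriv(n, x):
    """n-th derivative of f(x) = exp(-1/x) at a point x > 0."""
    u = 1.0 / x
    return math.exp(-u) * sum(c * u**e for e, c in deriv_poly(n).items())

# Maximum of |f^(n)| on a grid in (0, 1]: it grows rapidly with n,
# even though every Maclaurin coefficient of f is zero.
grid = [i / 1000 for i in range(50, 1001)]
for n in (2, 4, 6, 8):
    print(n, max(abs(f_deriv(n, x)) for x in grid))
```

The printed maxima grow rapidly with $n$, which is exactly why the Lagrange remainder bound becomes useless here.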

Ian

Hint: Probably the best example of a smooth non-analytic function is the solution of the differential equation $f'=e^{f^{-1}}$, which is presented as a formal power series as shown here and as noted here. If you apply Taylor's theorem, you will see that its derivatives grow faster than its Taylor polynomial error (the coefficients of the formal series grow faster).