
Let $f:[-a,a] \rightarrow \mathbb{R}$ be a function with derivatives up to order $n$ on the open interval $(-a,a)$. The Taylor polynomial of $f$ around $0$ is:

$$P_{n}(x) = \sum_{k=0}^{n} \frac{f^{(k)}(0)}{k!}x^k$$

For example, if $f(t)=e^t$, then:

$$P_{n}(t) = 1 + t + \frac{t^2}{2} + \frac{t^3}{3!} + \cdots + \frac{t^n}{n!}$$

It's possible to substitute $t \rightarrow -x^2$ to get:

$$P_{n}(-x^2) = 1 - x^2 + \frac{x^4}{2} - \frac{x^6}{3!} + \cdots + (-1)^n \frac{x^{2n}}{n!}$$

The antiderivative of this polynomial can then be used to approximate $\int e^{-x^2}\,dx$.
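As a quick numerical check of this idea, here is a small Python sketch (not part of the original question; the name `erf_like` is my own) that integrates the truncated series term by term and compares the result with the exact value $\int_0^x e^{-t^2}\,dt = \frac{\sqrt{\pi}}{2}\operatorname{erf}(x)$, using the standard-library `math.erf`:

```python
import math

def erf_like(x, n):
    """Approximate F(x) = integral of e^{-t^2} from 0 to x by integrating
    P_n(-t^2) term by term:
    sum_{k=0}^{n} (-1)^k x^{2k+1} / ((2k+1) * k!).
    """
    return sum((-1) ** k * x ** (2 * k + 1) / ((2 * k + 1) * math.factorial(k))
               for k in range(n + 1))

# Exact value via the error function: (sqrt(pi)/2) * erf(x).
x = 0.5
exact = math.sqrt(math.pi) / 2 * math.erf(x)
approx = erf_like(x, 6)
print(approx, exact)  # the two agree closely for small x
```

Since the series alternates, the truncation error is bounded by the first omitted term, so convergence is very fast for small $x$.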

However, if we had $g(x) = e^{\sin x}$, the third-order Taylor polynomial would be

$$P_{3}(x) = 1 + x + \frac{x^2}{2}$$ (note: the third derivative of $g$ at $0$ equals $0$)

And obviously, replacing $t$ by $\sin x$ wouldn't work here.

It may sound like a silly question, but when I arrived at this result I started asking myself what a substitution formally is, and when it is and isn't possible to change variables this way.

El Cid

1 Answer


The general rule is that you can compute the Taylor polynomial of $f(g(x))$ by substitution whenever $g(0)=0$. This condition ensures that the high-order terms in the Taylor expansion of $f$ don't contribute any low-order terms to the expansion of $f(g(x))$, so you can feel free to ignore them. In fact, when $f$ and $g$ are analytic, you can do the computation with formal power series, without worrying about convergence at all.

So, as $\sin 0=0$, you actually can compute the Taylor polynomials of $e^{\sin x}$ by substitution. For example, the third-order polynomial is given by

\begin{align} e^{\sin x} &\approx 1 + \sin x + \frac{\sin^2 x}{2!} + \frac{\sin^3 x}{3!}\\ &\approx 1 + \left(x - \frac{x^3}{3!}\right) + \frac{1}{2}\left(x - \frac{x^3}{3!}\right)^2+\frac{1}{6}\left(x - \frac{x^3}{3!}\right)^3\\ &\approx 1 + x + \frac{1}{2}x^2 \end{align} (see this WolframAlpha link to verify the last claim: go to the "expanded form" section and ignore every power of $x$ greater than $3$).
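This computation can also be checked mechanically. Below is a minimal Python sketch (the helpers `poly_mul` and `compose` are my own names, not from the answer) that composes truncated power series represented as coefficient lists; the substitution is valid precisely because $\sin$ has zero constant term, so $g(x)^k$ contributes nothing below degree $k$:

```python
from math import factorial

def poly_mul(p, q, order):
    """Multiply two truncated power series (coefficient lists),
    keeping only terms up to x^order."""
    r = [0.0] * (order + 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            if i + j <= order:
                r[i + j] += a * b
    return r

def compose(f_coeffs, g_coeffs, order):
    """Compute the truncated series of f(g(x)); requires g(0) = 0."""
    assert abs(g_coeffs[0]) < 1e-15, "substitution requires g(0) = 0"
    result = [0.0] * (order + 1)
    g_power = [1.0] + [0.0] * order  # g(x)^0 = 1
    for a in f_coeffs:
        result = [r + a * gp for r, gp in zip(result, g_power)]
        g_power = poly_mul(g_power, g_coeffs, order)
    return result

order = 3
# e^t = sum t^k / k!, and sin x = x - x^3/3! up to order 3
exp_coeffs = [1 / factorial(k) for k in range(order + 1)]
sin_coeffs = [0.0, 1.0, 0.0, -1 / 6]
print(compose(exp_coeffs, sin_coeffs, order))  # [1.0, 1.0, 0.5, 0.0]
```

The output matches the hand computation: the degree-3 coefficient cancels exactly, leaving $1 + x + \frac{1}{2}x^2$.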

But a similar computation for $e^{\cos x}$ would fail: since $\cos 0 \neq 0$, you need all the terms of $e^t$'s Taylor expansion just to compute $e^{\cos 0}$, let alone the derivatives of $e^{\cos x}$ at zero.
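To illustrate the failure numerically (a sketch of my own, not from the answer): if we naively substitute $t = \cos x$ into the degree-$n$ polynomial of $e^t$, the constant term becomes $P_n(\cos 0) = P_n(1) = \sum_{k=0}^{n} \frac{1}{k!}$, which is only an approximation of $e^{\cos 0} = e$. No finite truncation reproduces the exact Taylor coefficients, unlike the $g(0)=0$ case:

```python
import math

# Constant term of the naive substitution t = cos x at x = 0 is
# P_n(1) = sum_{k=0}^{n} 1/k!, which approaches e but never equals it.
for n in (3, 6, 10):
    partial = sum(1 / math.factorial(k) for k in range(n + 1))
    print(n, partial, math.e - partial)  # the gap shrinks but never vanishes
```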

Micah