My professor determined the Taylor expansion of degree 2 of some function $f(x, y, z) = \cos(g(x, y, z))$ around $(x, y, z) = (0, 0, 0)$ by first stating the one-dimensional Taylor expansion of $\cos(x)$, i.e. \begin{align*} \cos\left(x\right)= 1 - \frac{x^{2}}{2} + O(x^{4}) .\end{align*} Then he simply plugged the new argument in, i.e. \begin{align*} T_{2}f(0, 0, 0) = 1 - \frac{1}{2}g(x, y, z)^{2} + O\left(g(x, y, z)^{4}\right) .\end{align*} Is this even allowed? Why can you use the one-dimensional Taylor expansion in such a way?
-
Hm, yeah, that is not right in general, but is $g$ arbitrary or something specific where the equality might hold? – Manatee Pink Jan 21 '22 at 18:17
-
@ManateePink could you give me a counterexample? – Richard Jan 21 '22 at 19:51
-
For example, $\cos(xy)$. Its Taylor expansion around $(0,0)$ doesn't have a quadratic term, but by the approach of your professor, you would have one. – Manatee Pink Jan 21 '22 at 20:37
-
In general, the Taylor expansion of a multivariable function not only has powers of the variables in it, but also mixed terms, which the approach of your professor wouldn't necessarily have; again, see the example with $\cos(xy)$ – Manatee Pink Jan 21 '22 at 20:50
-
@ManateePink but the method of my professor for $\cos(xy)$ yields $1 - 1/2 \cdot (xy)^2$ whereby one evaluates $x$ and $y$ at zero, which yields $1$, as expected – Richard Jan 21 '22 at 22:13
-
No, that isn't consistent. The expansion has already been made around 0. The variables then remain variables; you can of course evaluate them at 0, but you don't have to. Sure, if you plug in 0 for $x$ or $y$, the quadratic term vanishes as well, but that is always the case. In the proper expansion the quadratic term is zero for any $x$ and $y$. – Manatee Pink Jan 21 '22 at 22:20
-
Either that or you haven't given the full information on your professor's method. – Manatee Pink Jan 21 '22 at 22:22
-
I believe it is correct if $g(x, y, z) \to 0$ as $(x, y, z) \to 0$. – Mason Jan 22 '22 at 01:27
-
@Mason could you elaborate why or point me to a source that proves this fact? – Richard Jan 22 '22 at 10:13
1 Answer
Write out what the one-dimensional Taylor theorem says more explicitly: $$\cos(x) = 1 - \frac{x^2}{2} + R(x), \hspace{20pt}R(x) = O(x^4) \text{ as }x \to 0.$$ Here $R(x) = O(x^4)$ as $x \to 0$ means that there is a constant $C > 0$ and some $\delta > 0$ such that for $0 < |x| < \delta$, $|\frac{R(x)}{x^4}| \leq C$. In other words, $|\frac{R(x)}{x^4}| \leq C$ for $|x|$ small.
I assume $g : \mathbb{R}^3 \to \mathbb{R}$. Given $y \in \mathbb{R}^3$, you can plug in $g(y)$ for $x$ in the formula for $\cos(x)$ to get $$\cos(g(y)) = 1 - \frac{g(y)^2}{2} + R(g(y)).$$ Now presumably, what your professor is claiming is that $R(g(y)) = O(g(y)^4)$ as $y \to 0$. So he is claiming that $|\frac{R(g(y))}{g(y)^4}| \leq C$ for $|y|$ small. We know that $|\frac{R(g(y))}{g(y)^4}| \leq C$ when $|g(y)| < \delta$, but we don't know whether we can make $|g(y)| < \delta$ by making $y$ small. One condition we can put on $g$ that allows this to happen is to assume $g(y) \to 0$ as $y \to 0$.
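This argument can be checked symbolically. Here is a minimal SymPy sketch; the inner function $g(x, y) = xy$ is just a hypothetical example choice with $g \to 0$ as the arguments go to $0$:

```python
import sympy as sp

t, x, y = sp.symbols('t x y')

# 1D Taylor data for cos(t): remainder R(t) = cos(t) - (1 - t^2/2)
R = sp.cos(t) - (1 - t**2 / 2)

# The leading term of R confirms R(t) = O(t^4) as t -> 0
lead = sp.series(R, t, 0, 6).removeO()
print(lead)  # t**4/24

# Substitute an inner function g with g -> 0 near the origin
# (g = x*y is a hypothetical example choice)
g = x * y
err = sp.cos(g) - (1 - g**2 / 2)
# The error is exactly R(g), hence O(g^4) = O((x*y)^4);
# e.g. along the line y = 1 its leading term is x^4/24
print(sp.series(err.subs(y, 1), x, 0, 6).removeO())  # x**4/24
```

The point of the last step: the error after substitution is literally $R(g(y))$, so the whole question reduces to whether $|g(y)| < \delta$ can be guaranteed for small $|y|$, which is the condition discussed above.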

-
Thanks! What I was missing is that the Taylor series is unique, so they undoubtedly must be the same. – Richard Jan 22 '22 at 21:39
-
@Richard "Taylor series is unique" is not really true. If $g$ is differentiable, then you can do a true Taylor expansion for $f$ using the multivariable form of Taylor's theorem. But here all you've done is plug in $g(y)$ for $x$ in $\cos(x)$. – Mason Jan 22 '22 at 22:11
-
Doesn't https://math.stackexchange.com/questions/1923624/taylors-polynomial-uniqueness-proof-why-are-these-limits-inferable hold? – Richard Jan 24 '22 at 12:01
-
Because I checked multiple examples, and the above method always seems to work – Richard Jan 24 '22 at 12:02
-
@Richard That doesn't apply here since $y \mapsto 1 - \frac{g(y)^2}{2}$ is not a polynomial. This is a function on $\mathbb{R}^3$. If you Taylor expand $\cos(g(y))$, you will get a different result. Try it with $g(y) = y_1y_2$. – Mason Jan 24 '22 at 18:08
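The suggested experiment with $g(y) = y_1 y_2$ can be sketched in SymPy: compute the true degree-2 multivariate Taylor polynomial of $\cos(y_1 y_2)$ at the origin and compare it with the substituted expression $1 - (y_1 y_2)^2/2$.

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2')
f = sp.cos(y1 * y2)

# True degree-2 multivariate Taylor polynomial of f at (0, 0):
# sum over all partial derivatives of total order <= 2
taylor2 = sum(
    sp.diff(sp.diff(f, y1, i), y2, j).subs({y1: 0, y2: 0})
    * y1**i * y2**j / (sp.factorial(i) * sp.factorial(j))
    for i in range(3) for j in range(3) if i + j <= 2
)
print(taylor2)  # 1 -- no quadratic term at all

# The substitution method instead keeps 1 - (y1*y2)^2/2;
# the discrepancy has total degree 4, so the two agree to order 2
sub = 1 - (y1 * y2)**2 / 2
print(sp.expand(sub - taylor2))  # -y1**2*y2**2/2
```

This illustrates both comments at once: the substituted expression is not the degree-2 Taylor polynomial, but its extra term is of higher order, so it is still a valid approximation with an $O$-error as described in the answer.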