5

I know that a function can admit multiple series representations (according to Eugene Catalan), but I wonder whether there is a proof that each analytic function has a unique Taylor series representation. I know that Taylor series are defined by derivatives of increasing order, and a function has only one derivative of each order. So can this fact be used to prove that each function has only one Taylor series representation?

4 Answers

5

Well, it is possible to have, e.g., $f(x) = \sum a_n x^n = \sum b_n (x-1)^n$ simultaneously, but that probably isn't what you meant. Instead, let's just consider the behavior at one point, say expanding around $x=0$.
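For concreteness, here is a quick sympy check (an illustration of my own, not part of the argument; the choice $f(x)=e^x$ is just an assumed example) showing that the same function gets two different coefficient sequences at two different centers:

```python
# Illustration (assumed example f = exp(x)): one function has different
# coefficient sequences when expanded about different centers.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)

print(sp.series(f, x, 0, 5))   # 1 + x + x**2/2 + x**3/6 + x**4/24 + O(x**5)
print(sp.series(f, x, 1, 5))   # E + E*(x - 1) + E*(x - 1)**2/2 + ... about x = 1
```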

Let's fix notation:

A "power series" (at $x=0$) is any series formally defined by $\sum_{n=0}^\infty a_n x^n$. A "Taylor series" (at $x=0$) for a smooth (i.e. $C^\infty$) function $f$ is the power series formally defined by $\sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!} x^n$.

So any function that is infinitely differentiable (at $x=0$) has a unique Taylor series at $0$ [note that the Taylor series may not converge, and if it converges, it may not converge to $f$]. But I think you are trying to ask whether any "analytic function" (a term I haven't defined yet) is equal, near each point, to a unique power series, which is then its Taylor series. You can first prove the following result, which allows you to define the concept of "analytic function":

Theorem 1. Any power series $\sum_{n=0}^\infty a_n x^n$ that converges at some $x_0$ with $|x_0|=\rho>0$ converges absolutely and locally uniformly on the set $|x|<\rho$, where it defines a $C^\infty$ function $F(x) := \sum_{n=0}^\infty a_n x^n$, and $a_n = \frac{F^{(n)}(0)}{n!}$.

In particular, the power series is the Taylor series of $F$. An "analytic function" (near $x=0$) is defined to be any such function $F$ that can be obtained in this way (i.e. an analytic function is a $C^\infty$ function locally equal to a convergent power series, its Taylor series.)
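As a sanity check of the last conclusion of Theorem 1 (again an illustration of my own; the choice $F(x) = 1/(1-x)$, i.e. the geometric series with $a_n = 1$, is an assumed example), one can compare $F^{(n)}(0)/n!$ with the known coefficients:

```python
# Check (illustration) that F^(n)(0)/n! recovers the coefficients a_n = 1 of
# the geometric series F(x) = sum x^n = 1/(1-x) on |x| < 1.
import sympy as sp

x = sp.symbols('x')
F = 1 / (1 - x)

for n in range(6):
    print(n, sp.diff(F, x, n).subs(x, 0) / sp.factorial(n))   # prints 1 every time
```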

Suppose now that we have $\lim_{N\to\infty}\sum_{n=0}^N a_n x^n = 0$ for $|x|<r$. Then I claim that $a_n = 0$ for all $n$, proving the uniqueness of convergent power series for $f(x) = 0$. This immediately follows from Theorem 1 above, which allows us to talk of the function $F(x) := \sum_{n=0}^\infty a_n x^n$. But by hypothesis, $F$ is actually the zero function, so we have $a_n = \frac{F^{(n)}(0) }{n!} = 0$.

This implies the uniqueness of convergent power series (at $0$) for any analytic function; for if there were two different ones, their difference would be a convergent power series with some nonzero coefficient that sums to $0$ near $0$, which we just showed cannot exist.

I'll sketch the proof of the main result (Theorem 1). We have convergence at $x=x_0$ where $|x_0|=\rho$. Let $0<r<\rho$. Then note (using the fact that if $\sum_{n=0}^\infty d_n$ exists then $d_n \to 0$) that $$a_n x_0^n \xrightarrow[n\to\infty]{} 0 \implies |a_n| |x_0|^n = |a_n|\rho^n \xrightarrow[n\to\infty]{} 0.$$ In particular there exists $M>0$ such that $|a_n| \rho^n < M$ for all $n$. Therefore, for any $x$ such that $|x|\le r$, since $\frac r{\rho}<1$, the geometric series formula gives $$ |a_n x^n| \le |a_n | r^n = |a_n | \rho^n \left(\frac r{\rho} \right)^n \le M \left(\frac r{\rho} \right)^n, \quad\sum_{n=0}^\infty M \left(\frac r{\rho} \right)^n < \infty. $$ So by the Weierstrass M-test, the series in fact converges absolutely and uniformly (and therefore pointwise) on the closed disk $|x|\le r$. It therefore defines a function, which we call $F(x)$.
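If it helps, here is a small numerical illustration of the bound $|a_n| r^n \le M\left(\frac r\rho\right)^n$ (a sketch of my own; the series $a_n = 2^n$ and the values $\rho = 0.4$, $r = 0.3$ are assumptions chosen for concreteness):

```python
# Numerical illustration of |a_n| r^n <= M (r/rho)^n for a_n = 2^n, whose
# series converges for |x| < 1/2; here rho = 0.4 and r = 0.3 (assumed values).
N = 30
rho, r = 0.4, 0.3
a = [2**n for n in range(N)]
M = max(abs(a[n]) * rho**n for n in range(N))      # sup |a_n| rho^n (= 1 here)

for n in range(N):
    assert abs(a[n]) * r**n <= M * (r / rho)**n + 1e-12   # geometric majorant
print("bound holds for n = 0..%d with M = %s" % (N - 1, M))
```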

If the series can be differentiated term by term, then a standard induction argument proves that $a_n = F^{(n)}(0)/n!$. Formally differentiating once, we obtain the series $\sum_{n=1}^\infty n a_n x^{n-1} = \sum_{n=0}^\infty (n+1) a_{n+1} x^n$. Now note that for $|x|\le r<\rho$, using $|a_{n+1}|\rho^{n+1} < M$, $$ (n+1) |a_{n+1}| |x|^{n} \le (n+1) |a_{n+1}| r^{n} \le (n+1) \frac{M}{\rho} \left(\frac{r}{\rho}\right)^n \le \frac{CM}{\rho} \left(\sqrt{\frac{r}{\rho}}\right)^{n}, \\ \sum_{n=0}^\infty \frac{CM}{\rho} \left(\sqrt{\frac{r}{\rho}}\right)^{n} < \infty$$ since there exists $C>0$ such that $n+1 < C \left(\frac{\rho}r\right)^{n/2}$ for all $n$. By the Weierstrass M-test, the formal series obtained by term-by-term differentiation converges absolutely and uniformly to some function $G$ on $|x|\le r$; since the partial sums of the original series converge pointwise and their derivatives converge uniformly, this implies that $F$ is differentiable with $F'=G$. This argument is repeatable (using instead $n^k < C_k \left(\frac{\rho}r\right)^{n/2}$), proving by induction that $F$ is $C^\infty$ and validating the result $a_n = F^{(n)}(0)/n!$.
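Here is a quick symbolic check of the term-by-term differentiation step (an illustration of my own; the concrete choice $F(x) = 1/(1-x)$, so that $F'(x) = 1/(1-x)^2$ and the differentiated series is $\sum (n+1)x^n$, is an assumed example):

```python
# Check (illustration) that the formally differentiated series sum (n+1) x^n
# of F(x) = 1/(1-x) agrees, term by term, with the Taylor series of
# F'(x) = 1/(1-x)^2.
import sympy as sp

x = sp.symbols('x')
termwise = sum((k + 1) * x**k for k in range(10))        # partial sum of (n+1) x^n
direct = sp.series(1 / (1 - x)**2, x, 0, 10).removeO()   # Taylor series of F'
print(sp.expand(termwise - direct))                      # 0
```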

Calvin Khor
  • 34,903
3

You can prove that a power series is differentiable on the interior of its interval of convergence, with the derivative obtained by differentiating term by term. So, if the power series converges to $f$, the coefficient of $x^n$ must be $\frac{f^{(n)}(0)}{n!}$. Hence the coefficients are determined uniquely, and the Taylor series is unique.

11101
  • 179
2

I think this simple proof is sufficient. I'm going to do it in two cases, but really the first case is a special case of the second.

Suppose a function $f(x)$ has two Taylor series representations centered at the same point:

$$f(x)=\sum a_n x^n$$

$$f(x) = \sum b_n x^n$$

We know that $f(x) - f(x) = 0$, so just plug in each of the representations:

$$f(x) - f(x) = \sum b_n x^n - \sum a_n x^n = 0$$

$$\sum (b_n-a_n) x^n = 0$$

The only way the result can be $0$ for all $x$ is if each coefficient is separately $0$, since in general monomials of different degree cannot cancel one another for all $x$.

$$b_n-a_n = 0 $$ $$b_n =a_n $$

Now suppose we center the series at different points for each representation, i.e.

$$f(x)=\sum a_n (x-a)^n$$

$$f(x) = \sum b_n (x-b)^n$$

The binomial theorem is helpful here:

$$f(x)=\sum a_n (x-a)^n = \sum a_n\sum_{k=0}^{n}\binom{n}{k}(-a)^{n-k}x^k =\sum a'_kx^k $$

so $a'_k = \sum_{n\ge k} a_n \binom{n}{k}(-a)^{n-k}$ is just a new constant (after interchanging the order of summation). The same happens with the other representation, just with $a$ replaced by $b$, and by the first case you again get

$$b'_k =a'_k$$

So the Taylor series representation is unique.
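For what it's worth, here is a numerical version of the recentering step above (a sketch of my own, not part of the original answer; the function $e^x$, the center $a = 1$, and the truncation order $N = 40$ are all assumptions, so the agreement is only approximate):

```python
# Recentering check (illustration): take exp's coefficients about a = 1,
# namely a_n = e/n!, re-expand each (x-1)^n with the binomial theorem and
# collect powers of x; the resulting a'_k should approximate exp's Taylor
# coefficients 1/k! about 0.
from math import comb, e, factorial

N = 40                                   # truncation order (assumption)
a = 1.0
a_n = [e / factorial(n) for n in range(N)]

for k in range(8):
    a_prime_k = sum(a_n[n] * comb(n, k) * (-a) ** (n - k) for n in range(k, N))
    print(k, round(a_prime_k, 12), round(1 / factorial(k), 12))
```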

  • "The monomials are linearly independent, so the coefficients all have to be separately equal" this is wrong. Linear algebra doesn't apply here. We are dealing with an infinite series, not a finite linear equation. – freakish Sep 09 '20 at 07:08
  • You're incorrect, linear algebra most definitely applies here. You're obviously not familiar with monomials being a basis for the space of polynomials. An infinite basis in this case. As with all vectors, you add them component-wise.

    I don't even need to go that far though. There is no other way to get cancellation for all values of x without comparing the coefficients of the same monomial.

    – Φίλ λιπ Sep 10 '20 at 08:46
  • A basis may be infinite. But summation has to be finite. Infinite sum (what we are dealing with here) is just a fancy name for a limit of partial sums. Pure linear algebra doesn't deal with limits. Just because the sequence of partial sums converges to 0 doesn't mean that those partial sums are (eventually) 0. And this is what you are trying to apply here. – freakish Sep 10 '20 at 09:25
  • And your example is correct: monomials $\{x^n\}_{n=0}^\infty$ form a (linear) basis for the space of all polynomials. That's exactly what I'm saying: every polynomial is a finite linear combination of monomials. But we are not dealing with polynomials, we are dealing with analytic functions. And monomials do not form a basis for the space of all analytic functions. This is simply incorrect. – freakish Sep 10 '20 at 09:36
  • I don't see how this is correct. An Analytic Function is defined by having a locally convergent power series, which is determined by an infinite sum of monomials and real coefficients. This, in effect, makes every Analytic Function a polynomial with infinite degree within some domain of convergence. It can be written as a linear combination of monomials, and thus the basis is as I've said.

    You're going to have to show me, by counterexample, either that analytic functions have a different basis or that they cannot in general be formed by a linear combination of monomials.

    – Φίλ λιπ Sep 11 '20 at 13:59
  • In terms of finite summation, I have yet to see in the definition of Basis that the sum has to be finite. It only has to span the space and it has to be linearly independent. The basis of monomials is perfectly fine for that when describing the space of Taylor Series representations of Analytic functions. – Φίλ λιπ Sep 11 '20 at 14:06
  • You want to see the definition of basis where summation is finite? Here you go: https://en.wikipedia.org/wiki/Basis_(linear_algebra) The first sentence. If you see an infinite sum in an algebraic context, it is only because all but finitely many coefficients are $0$. What do you think an infinite sum is, in an arbitrary vector space over an arbitrary field (let me note that you didn't use any analytical property in your answer, pure linear algebra)? Concretely, say I take $k=\mathbb{Z}_2$ and $V=k$ as a vector space. What is the result of adding $1$ to itself infinitely many times? Ridiculous. – freakish Sep 11 '20 at 14:17
  • And there is no such thing as a polynomial of infinite degree. And even if there were (i.e. formal power series), monomials do not form a basis of that. Really, dude, grab any book on linear algebra and read it a few times, because I've just lost patience here. – freakish Sep 11 '20 at 14:18
  • Oh, and one last thing. Your definition of analytic function is wrong. "Power series determined by infinite sum of monomials" is just mumbling. That's a circular definition at best. This: "$\sum_{n=1}^\infty c_nx^n$" is just a symbol for this: "$\lim_{m\to\infty}\sum_{n=1}^m c_nx^n$". You claim that if $\lim_{m\to\infty}\sum_{n=1}^m c_nx^n=0$ then all $c_n=0$ with pure algebra, without paying attention to metric/topology. Again, ridiculous. In fact I can change topology (e.g. to antidiscrete) to make that claim false. And yet your "algebraic proof" would still "apply". – freakish Sep 12 '20 at 07:04
  • You need to relax. If you want to cite wikipedia then you can look at the definition of an Analytic Function for yourself: https://en.wikipedia.org/wiki/Analytic_function

    On the other hand, sure wikipedia says "(finite)" but there is such a thing as a Hilbert Space, which has a countably infinite set of basis vectors.

    Your example is irrelevant because I asked about Analytic Functions.

    Here is an example of a book that does recognize infinite basis: https://www.math.ubc.ca/~carrell/NB.pdf

    How it's different

    https://en.wikipedia.org/wiki/Linear_independence#Infinite_dimensions

    – Φίλ λιπ Sep 12 '20 at 18:46
  • No, dude, you need to start reading with understanding and learning maths. For the last time: just because a basis is infinite it doesn't mean that you are allowed to do infinite summing. Don't talk about Hilbert spaces when clearly you don't understand basics of linear algebra. – freakish Sep 12 '20 at 18:54
  • You either completely misunderstand the topic or at best confuse a linear basis with a Schauder basis, because a Hilbert space cannot have countable linear dimension. Yes, a Schauder basis is applicable here, but that is even more sophisticated machinery, beyond your skills. – freakish Sep 12 '20 at 18:55
  • Also don't quote things without reading. Your wiki example: "The family is linearly dependent over $K$ if there exists a non-empty finite subset $J\subseteq I$ (...)". Go back to school and don't waste my time anymore. – freakish Sep 12 '20 at 18:57
  • As far as I'm concerned, your frustration is you taking this personally. Your inability to explain to someone you deem less knowledgeable than you is not really my problem. I'm open to a source explaining the issues and nuances with this, but just going "oh well Wikipedia says so" is quite emblematic of keyboard warriors nowadays. If you want to teach, then go for it and be patient. If you don't want to teach and just want to show me "how wrong I am" then you're just gonna keep getting mad. – Φίλ λιπ Sep 12 '20 at 19:00
  • You are right, this is not your problem. But you failing to understand where you are wrong (and basics of both algebra and analysis) is actually not my problem. It was unwise of me to engage in such a pointless discussion. And so I'm done. Good luck. – freakish Sep 12 '20 at 19:18
  • I'm definitely going back to my books, which seemed to encourage solutions to partial differential equations using the exact method you're taking issue with. In the context of a convergent infinite power series, how is it wrong to say that the monomials are linearly independent? Maybe it's the language? Are you claiming that in the set of Taylor series representations of analytic functions, there's an example where $$\sum(a_n - b_n)x^n = 0$$ does not imply $$a_n - b_n=0$$? That maybe there's some weird combination that involves monomials of different degree? – Φίλ λιπ Sep 12 '20 at 19:54
  • In addition, I took your advice and went back to my Mathematical Physics book and they speak specifically about L2 spaces, which are Hilbert Spaces with countably infinite dimension. Maybe everything about orthonormal bases I learned in physics was wrong? Or maybe there's some semantics between mathematicians and physicists that needs to be sorted out. – Φίλ λιπ Sep 12 '20 at 19:56
  • The equation $\sum (a_n-b_n)x^n=0$ does imply $a_n-b_n=0$. It is your proof that is incorrect, not the claim. Orthonormal basis is the third kind of basis. It is neither linear nor Schauder. But it doesn't apply here because the space of all analytic functions is not a Hilbert space. And again: it is a machinery far beyond pure linear algebra. – freakish Sep 12 '20 at 21:07
  • Plus even if you embed analytic functions into $L^2$ space of square integrable functions (which only makes sense for analytic functions with finite radius of convergence, already big restriction) then monomials are never orthogonal: https://math.stackexchange.com/questions/1311993/is-there-a-representation-of-an-inner-product-where-monomials-are-orthogonal/1312003 – freakish Sep 12 '20 at 21:53
  • Yeah that makes sense. If the taylor series has an infinite radius of convergence then orthogonality would be indeterminate. Despite all the heat in the comments here, I do truly appreciate the insight you've provided for me, so thank you. I'll be looking deeper into this stuff and be more cautious when I provide a "proof." – Φίλ λιπ Sep 13 '20 at 23:08
2

The Taylor series is indeed uniquely defined for any smooth function, regardless of whether it is convergent or of whether it coincides with the function where it converges. And so asking about its uniqueness is a bit pointless; it's like asking about the uniqueness of the derivative. However, the question can be turned into a sensible one if we ask whether $f(x)$ can be represented as a power series uniquely, i.e. if $\sum a_n(x-x_0)^n$ and $\sum b_n(x-x_0)^n$ are both convergent and equal over some open interval, does it follow that $a_n=b_n$ for every $n$?

This can be reduced (by subtracting) to the question: if $\sum c_n(x-x_0)^n=0$ over some open interval, does $c_n=0$ follow for every $n$?

Now assume that $\sum c_n(x-x_0)^n=0$ over some open interval $(a, b)$ with $x_0\in (a,b)$. Since every power series evaluates to $c_0$ at $x=x_0$, we conclude that $c_0=0$. Thus we can write our equation as

$$(x-x_0)\cdot\big(c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots+c_n(x-x_0)^{n-1}+\cdots\big)=0$$

It is tempting to multiply both sides by $(x-x_0)^{-1}$ and conclude that $c_1=0$ (and so, by induction, $c_n=0$), but we cannot do that at $x=x_0$, and it is precisely the $x=x_0$ case we are interested in. Nevertheless we can do it for $x\neq x_0$, and so we conclude that

$$c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots+c_n(x-x_0)^{n-1}+\cdots=0$$

for any $x\in (a,b)\backslash\{x_0\}$. Of course every power series converges at $x=x_0$; the question is whether it equals $0$ there. It does, because every power series is continuous (as a function of $x$) wherever it is convergent (see this). This implies that $c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots=0$ for $x=x_0$ as well, and therefore $c_1=0$ by evaluating at $x=x_0$.

Now we repeat this process and by simple induction we conclude that $c_n=0$ for any $n\in\mathbb{N}$.
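As a small illustration of the continuity step (a sketch of my own; the series $\sin(x - x_0) = (x-x_0) - \frac{(x-x_0)^3}{6} + \cdots$, which has $c_0 = 0$ and $c_1 = 1$, is an assumed example), dividing out $(x - x_0)$ and letting $x \to x_0$ recovers $c_1$:

```python
# Illustration: for sin(x - x0), which has c_0 = 0 and c_1 = 1, the series
# with (x - x0) divided out is continuous at x0 and its limit there is c_1.
import sympy as sp

x, x0 = sp.symbols('x x0')
g = sp.sin(x - x0) / (x - x0)     # c_1 + c_2 (x - x0) + ... away from x0
print(sp.limit(g, x, x0))         # 1  (= c_1)
```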

freakish
  • 42,851