
The assertion in the subject line is an abbreviated form of:

$$\sum_{i=1}^n \alpha_i x^i = 0, \forall \; x \in {\mathbb R} \;\; \Longrightarrow \;\; \alpha_i = 0, \forall\;i \in \{1, \dots , n\}.$$

In words: if for all real $x$, the sum $\sum_{i=1}^n \alpha_i x^i$ is always $0$, then all the coefficients $\alpha_i$ must be $0$.

I can't readily come up with a formal proof of this, though it seems to me "obviously" true: it's an uncountable number of equations against a finite number of unknowns.

A pointer to a nice, illuminating proof would be appreciated.

I often come across unproven assertions that I accept on the basis of exactly that same informal argument ("more equations than degrees of freedom"), but I'd like to be able to think about this stuff a bit more rigorously. So what I'm really after is an approach/technique for proving assertions about "overdetermined" systems of this kind.$^1$

Needless to say, an "elementary" proof, based only on basic algebra, is far preferable to one relying on some other non-obvious result.

Also, since the informal argument given above does not depend on all the properties of ${\mathbb R}$, one expects that the assertion would remain true if we replace ${\mathbb R}$ with a more general (though still infinite) structure. An elementary proof of the theorem in its full generality would be really nice too. I realize that such a proof may run over many pages, hence I don't ask for a proof necessarily, but rather for a pointer to one.

BTW, I didn't know how to Google for this proof because I could not come up with adequate search terms. Does this theorem have a standard name?


$^1$An example of what I mean here would be the common trick of using the sequence $\{2^{-i}\}_{i=1}^\infty$ to construct proofs that need infinitely many terms whose sum nonetheless stays finite. Or Cantor's diagonalization trick. Once one learns them, tricks like these become second nature, and turn some previously challenging proofs into routine ones. I'm hoping to find the trick that makes proving the assertion in the subject line routine.
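(Behind that trick is the geometric series $\sum_{i=1}^\infty 2^{-i} = 1$, which keeps the total bounded no matter how many terms are used.)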

kjo
  • 14,334

5 Answers


$\textbf{Hint}:$ Consider the polynomial $P(t)=\alpha _n t^n + \cdots + \alpha _1t$.

Your hypothesis says that every real number $x$ is a root of this polynomial.

Proceed.

Git Gud
  • 31,356

In any field $K$, a non-zero polynomial $p(x) \in K[x]$ of degree $n$ can have at most $n$ roots. Your polynomial has infinitely many roots, so it must be the zero polynomial.
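For completeness, a sketch of that root bound, by induction on the degree using the factor theorem: if $p(a) = 0$, then

$$p(x) = (x-a)\,q(x), \qquad \deg q = n-1,$$

and since a field has no zero divisors, every root of $p$ other than $a$ must be a root of $q$; by induction $q$ has at most $n-1$ roots, so $p$ has at most $n$.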

Mohan
  • 14,856
  • The OP specifically asked for just a tip. – Git Gud Feb 02 '13 at 20:36
  • @GitGud: sorry that I was not sufficiently clear: what I meant was that, in case an elementary proof would be too long to write out (which I thought likely), a pointer to such a proof would suffice. – kjo Feb 02 '13 at 20:47
  • @kjo No problem. The important thing is that you're happy with the answers. – Git Gud Feb 02 '13 at 20:49

Hint $\rm\ f(\Bbb R)=0\:\Rightarrow\:f(\Bbb N) = 0.\:$ Suppose that there exists a nonzero polynomial with $\rm\,f(\Bbb N) = 0.\:$ Choose one of minimal degree. Then $\rm\:f(0)=0\:$ so $\rm\:f(x) = x\,g(x).\:$ For $\rm\,n\ge 1\!:\,$ $\rm\,f(n) = n g(n) = 0\:$ so $\rm\:g(n) = 0.\:$ Hence $\rm\:g(x\!+\!1)\:$ is zero on $\,\Bbb N,\:$ and $\rm\,deg\ g < deg\ f,\,$ contra minimality of $\rm\,f(x).$

Remark $\ $ Alternatively, one can work over $\rm\,\Bbb R\,$ and use the continuity of polynomials to deduce that the polynomial $\rm\,g(x) = f(x)/x\,$ has $\rm\:g(0) = 0,\,$ e.g. see this thread: Apostol proof divides by zero? See my remarks there for how this is related to the factor theorem.

Generally a polynomial $\ne 0$ with coefficients in a field has no more roots than its degree. For more general rings this fails. It is true iff the coefficient ring is a domain, i.e. $\rm\:cd = 0\:\Rightarrow\:c=0\ or\ d=0.\:$ Note that the above proof used this property: $\rm\: n\ne 0,\ n\,g(n)=0\:\Rightarrow\:g(n)=0.\:$ For non-domain failures note $\rm\:2x = 0\:$ has $2$ roots $\rm\:x = 2,0\:$ mod $4$, and $\rm\:x^2 = 1\:$ has $4$ roots $\rm\:x = \pm1,\pm3\:$ mod $8$.

Math Gems
  • 19,574
  • @Math Gems Why didn't you stay in $\mathbb{R}$? – Git Gud Feb 02 '13 at 21:51
  • @GitGud Because the essence of the matter, that a polynomial has no more roots than its degree, has nothing to do with $\,\Bbb R\,$ (or continuity). Rather, it depends only on the ability to cancel nonzero coefficients, i.e. that the coefficient ring is a domain (e.g. a field). – Math Gems Feb 02 '13 at 21:58
  • @MathGems Yes, so why didn't you stay in $\mathbb{R}$? – Git Gud Feb 02 '13 at 21:59
  • @Git Gud The proof no longer works with $\mathbb{R}$ since we lose $g(0)=0$. It works for $\mathbb{R}^*$. – Julien Feb 02 '13 at 22:00
  • @julien The same way Math Gems took $n\ge 1$, he can take $x\ge 1$ and everything works just the same. – Git Gud Feb 02 '13 at 22:02
  • @GitGud The first proof works over $\,\Bbb R\,$ too! The only advantage of using continuity in the linked Apostol proof is that it eliminates the need to shift $\rm\,x \to x\!+\!1\,$ to get the roots of $\rm\,g(x)\,$ from all $\rm\,n\ge 1\,$ to all $\rm\,n\ge 0$. – Math Gems Feb 02 '13 at 22:04
  • @Git Gud Yes, that's why I mentioned $\mathbb{R}^*$. But indeed $[1,+\infty)$ works equally well. So $\mathbb{N}$, $\mathbb{R}^*$, $[1,+\infty)$... I don't see more here than a matter of taste. As for taste, I believe the most elementary proof of this result is by differentiating and evaluating at $0$ repeatedly. – Julien Feb 02 '13 at 22:07
  • @Julien Yes, that's another standard proof (which requires knowledge of calculus - which the algebraic proofs do not). – Math Gems Feb 02 '13 at 22:20
  • @MathGems Your proof does not, which is why I like it actually better than mine. But for those who use the Fundamental Theorem of Algebra, there is quite a bit of calculus hidden under the rug... – Julien Feb 02 '13 at 22:32
  • @MathGems: I really like your proof, but I don't share your typesetting tastes... I find text like "Then $\rm\:f(0)=0\:$ so $\rm\:f(x) = x\,g(x).\:$ For $\rm\,n\ge 1\!:\,$ $\rm\,f(n) = n g(n) = 0\:$ so $\rm\:g(n) = 0.\:$" noticeably harder to read than "Then $f(0)=0$ so $f(x) = x\,g(x)$. For $n \ge 1$: $f(n) = n\,g(n) = 0$ so $g(n) = 0$." – kjo Feb 03 '13 at 02:20

Hint If $x_1,\dots,x_n$ are pairwise distinct, the following determinant is non-zero:

$$\det\pmatrix{1 &x_1 &x_1^2 &\cdots &x_1^{n-1}\\1 &x_2 &x_2^2 &\cdots &x_2^{n-1}\\ \vdots &\vdots &\vdots &\ddots &\vdots\\1 &x_n &x_n^2 &\cdots &x_n^{n-1}}\neq 0$$

If you don't see why this happens, I can post the closed formula for that determinant.
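For reference, that closed formula is the classical Vandermonde determinant,

$$\det\pmatrix{1 &x_1 &\cdots &x_1^{n-1}\\ \vdots &\vdots &\ddots &\vdots\\1 &x_n &\cdots &x_n^{n-1}} = \prod_{1\le i<j\le n}(x_j - x_i),$$

which is non-zero precisely when the $x_i$ are pairwise distinct. So a polynomial of degree less than $n$ that vanishes at $n$ distinct points yields a homogeneous linear system whose only solution is the zero coefficient vector. A minimal numerical sketch of this idea (the sample points $1,\dots,n$ are an arbitrary choice, and NumPy is assumed):

```python
import numpy as np

n = 5
x = np.arange(1, n + 1, dtype=float)    # n pairwise distinct sample points
V = np.vander(x, increasing=True)       # row k: 1, x_k, x_k^2, ..., x_k^(n-1)

# det(V) = product of (x_j - x_i) over i < j: non-zero for distinct points
print(np.linalg.det(V))                 # about 288 for the points 1..5

# The homogeneous system V @ alpha = 0 has only the trivial solution
print(np.linalg.solve(V, np.zeros(n)))  # [0. 0. 0. 0. 0.]
```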

N. S.
  • 132,525

Assume that the $a_k$'s are not all equal to $0$. Let $m$ be the smallest integer such that $a_m\neq 0$.

Dividing the initial equation by $x^m$ when $x\neq 0$, we get $$ a_m+a_{m+1}x+\ldots+a_{n}x^{n-m}=0\qquad\forall x\neq 0. $$

Now letting $x\rightarrow 0$, we find $a_m=0$, a contradiction.
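In more detail, the limit step uses the continuity of polynomials: the left-hand side is $0$ for every $x\neq 0$, while its limit at $0$ is $a_m$, so

$$a_m = \lim_{x\to 0}\bigl(a_m + a_{m+1}x + \ldots + a_n x^{n-m}\bigr) = 0.$$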

So the $a_k$'s are all $0$.

Julien
  • 44,791