
Why isn't $\frac{1}{x}$ a polynomial?

Does it directly follow from the definition? As far as I know, polynomials over $F$ are expressions of the form $\sum_{i=0}^{n} a_ix^i$, where $a_i\in F$ and $x$ is a symbol.

Or is there a nicer argument involved?

Footnote: $F$ is a field of characteristic zero.

Erick Wong
Shweta Aggrawal
  • 7
    For one, there's a negative power on $x$, which your own definition forbids. Do you want to prove that it is impossible to rewrite it as a polynomial? – Randall Feb 05 '19 at 03:08
  • @Randall Actually I was trying to show that $R[x]$ is not a field. I want to prove that it is impossible to rewrite it as a polynomial. – Shweta Aggrawal Feb 05 '19 at 03:10
  • 3
    If $R[x]$ is a field, then $x$ is invertible. Write out $xf(x)=1$ and get a contradiction by comparing coefficients. Done. – Randall Feb 05 '19 at 03:12
  • 2
    @Randall If I evaluate the above expression at $0$ I get $0=1$ a contradiction!! – Shweta Aggrawal Feb 05 '19 at 03:14
  • It looks like you are dealing with the problem of which functions are expressible as a polynomial in a SINGLE variable. But you have written that the $x_i$ are symbols, which seems to suggest you are dealing with many variables. Is this a typo, typing $x_i$ instead of $x^i$? Please clarify. – P Vanchinathan Feb 05 '19 at 03:15
  • Sorry for the typo. I have edited. I am dealing with $F[x]$ @PVanchinathan – Shweta Aggrawal Feb 05 '19 at 03:16
  • 1
    Hint: $\,\rm x\,f(x) = 1\,$ in $\,\rm R[x]\ \Rightarrow\ 0 = 1\,$ in $\,\rm R,\,$ by evaluating at $\rm\ x = 0$. For more see the thread Why can't the polynomial ring be a field? – Bill Dubuque Feb 05 '19 at 03:18

5 Answers

6

If you want a more formal "proof", you can suppose for contradiction that $1/x$ is in fact equal to some expression of the form $ \sum_{k=0}^n a_k x^k $:

$$ \frac{1}{x} = a_0 + a_1 x + \cdots + a_n x^n$$

multiplying through by $x$ gives:

$$ 1 = a_0 x + a_1 x^2 + \cdots + a_n x^{n+1}$$

Setting $x=0$ gives: $$ 1 = a_0 \cdot 0 + \cdots + a_n \cdot 0^{n+1} = 0$$

a contradiction, so we must have that $1/x$ is not a polynomial.
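
If it helps to see the coefficient comparison concretely, here is a small SymPy sketch (my own illustration, not part of the original argument; the degree $n=3$ is an arbitrary choice): multiplying any candidate polynomial by $x$ leaves a zero constant term, so the product can never equal the constant $1$.

```python
# Illustrative sketch only (assumes SymPy; n = 3 is an arbitrary choice).
# For symbolic coefficients a_0, ..., a_n, x*p(x) always has constant term 0,
# so it can never equal the constant polynomial 1.
from sympy import symbols, expand

x = symbols('x')
n = 3                                  # the argument is identical for any n
a = symbols(f'a0:{n + 1}')             # a0, a1, a2, a3
p = sum(a[k] * x**k for k in range(n + 1))

lhs = expand(x * p)                    # a0*x + a1*x**2 + ... + a3*x**4
print(lhs)
print(lhs.subs(x, 0))                  # prints 0 -- but the right-hand side is 1
```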

Szeto
ASKASK
  • 2
    I have a small question about the logic of this argument, since the first equation seems to prohibit evaluation at $x = 0$. Comments? Cheers! – Robert Lewis Feb 05 '19 at 03:43
  • @RobertLewis If two polynomials are equal everywhere except one point, then they must also be equal at that point. So technically, you could say: $1/x = a_0 + \ldots$ everywhere except $x=0$, so therefore $1 = a_0x + \ldots$ everywhere except $x=0$, but since the two sides of this equation are both polynomials, equality everywhere except a single point implies equality everywhere. – ASKASK Feb 05 '19 at 05:36
  • 1
    If you wanted to be even more direct about it, you could take the limit as x goes to zero of both sides of the equation $1/x = a_0 + \ldots$ and note that the right hand side tends to $a_0$ but the left hand side tends to infinity, a contradiction. (Of course this argument only works if $F=\mathbb R$ or $\mathbb C$) – ASKASK Feb 05 '19 at 05:40
4

The short answer, as pointed out by Randall in his comment, is that polynomials are by definition sums of terms of the form $ax^k$ where $k \ge 0$; since $x^{-1}$ is not of this type, it is not a polynomial. This actually covers the case of formally defined polynomials

$p(x) \in F[x], \tag 1$

since there is no term of the form $x^{-1} \in F[x]$ according to the conventional definition, which only addresses non-negative powers of $x$.

Perhaps a somewhat more subtle question is whether, as a function, $x^{-1}$ may be expressed as an element of $F[x]$; that is, can we ever have

$x^{-1} = p(x) = \displaystyle \sum_0^n p_i x^i \in F[x], \; p_i \in F, \; 0 \le i \le n? \tag 2$

The usual understanding of this equation, as an equivalence of functions, is that

$\forall 0 \ne a \in F, \; a^{-1} = p(a). \tag 3$

Under the hypothesis that

$\text{char}(F) = 0 \tag 4$

we may rule (3) out as follows: it is equivalent to

$\forall 0 \ne a \in F, \; ap(a) = 1, \tag 5$

which in fact asserts that every $0 \ne a \in F$ is a root of the polynomial

$xp(x) - 1 = \displaystyle \sum_0^n p_i x^{i + 1} - 1; \tag 6$

we have

$\deg(xp(x) - 1) = n + 1; \tag 7$

as such, $xp(x) - 1$ has at most $n + 1$ zeroes in $F$; but (4) implies that $F$ contains a copy of the rationals $\Bbb Q$ as an infinite subfield; every non-zero element of $\Bbb Q$ would thus be a zero of (6), giving it infinitely many zeroes, and hence a reduction to absurdity is attained. Therefore, (2) cannot be the case. $OE\Delta.$
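
To make the root-counting step tangible, here is a hedged SymPy sketch (my own illustration; the interpolation nodes $1, 2, 3, 4$ are an arbitrary choice, not from the answer): even the best possible candidate $p$, namely the interpolant of $a \mapsto a^{-1}$ at finitely many points, turns $xp(x) - 1$ into a nonzero polynomial of degree $n + 1$, which therefore vanishes at only finitely many rationals.

```python
# Illustrative sketch only (assumes SymPy; the nodes 1..4 are arbitrary).
# Interpolate 1/a at finitely many points; the resulting x*p(x) - 1 is a
# nonzero polynomial of degree n + 1, so it has at most n + 1 roots and
# cannot vanish at every nonzero rational.
from sympy import symbols, interpolate, expand, Rational, roots

x = symbols('x')
nodes = [1, 2, 3, 4]                                  # finitely many sample points
p = interpolate([(a, Rational(1, a)) for a in nodes], x)

q = expand(x * p - 1)                                 # the polynomial xp(x) - 1 from (6)
print(q)                                              # nonzero, of degree len(nodes)
print(roots(q))                                       # only the interpolation nodes are roots
print(q.subs(x, 5))                                   # nonzero at a fresh rational, so p(5) != 1/5
```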

Robert Lewis
1

Note that: $$\frac{1}{x}=x^{-1}$$

Going off of the Wikipedia definition of a polynomial:

In mathematics, a polynomial is an expression consisting of variables (also called indeterminates) and coefficients, that involves only the operations of addition, subtraction, multiplication, and non-negative integer exponents of variables.

It is easy to see that our expression fails to meet the criteria for being a polynomial, since its variable carries a negative exponent.

Gnumbertester
1

We can also show the result ignoring the usual construction of $F[X]$ and instead using (only) the universal property definition:

The polynomial ring $F[X]$ is a ring $P$, together with a ring homomorphism $i\colon F\to P$ and a special element $X\in P$, such that for all rings $A$, ring homomorphisms $f\colon F\to A$, and elements $a\in A$, there exists one and only one ring homomorphism $h\colon P\to A$ with $h\circ i=f$ and $h(X)=a$.

Now assume there exists $u\in P$ such that $uX=1$ (or, if we do not demand unital rings, just $uX=i(e)$ for some $e\ne0$). Consider $A=F$, $a=0$, $f=\operatorname{id}_F$. By the universal property, there exists $h\colon P\to F$ such that $h\circ i=\operatorname{id}_F$ and $h(X)=0$. Then $$e=h(i(e))=h(uX)=h(u)h(X)=h(u)0=0,$$ a contradiction.
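
As a concrete illustration (my own addition, not part of the answer): over $F = \Bbb Q$ the homomorphism $h$ produced by the universal property for $a = 0$, $f = \operatorname{id}$ is just evaluation at $0$, and a quick SymPy check shows it sends $uX$ to $h(u)\cdot 0 = 0$ for any candidate $u$, so $uX = 1$ is impossible.

```python
# Illustrative sketch only (assumes SymPy; the candidate u below is arbitrary).
# "Evaluate at 0" plays the role of the homomorphism h with h(X) = 0 and
# h restricted to QQ the identity; it sends u*X to h(u)*h(X) = 0, never to 1.
from sympy import symbols, Poly, QQ

x = symbols('x')
h = lambda poly: poly.eval(0)        # evaluation at 0 is a ring homomorphism QQ[x] -> QQ

u = Poly([3, -2, 7], x, domain=QQ)   # an arbitrary candidate u = 3x^2 - 2x + 7
X = Poly(x, x, domain=QQ)

print(h(u * X), h(u) * h(X))         # 0 0 -- multiplicativity in action
print(h(u * X) == 1)                 # False for this (and any) u, so u*X = 1 cannot hold
```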

J. W. Tanner
  • This is essentially the same argument I gave in an answer in the thread I linked yesterday in a comment on the question. The comments (and surprisingly large number of upvotes) there seem to indicate that this universal viewpoint comes as a surprise to many, which likely means it gets less exposure than it deserves at the undergraduate level. – Bill Dubuque Feb 05 '19 at 16:34
-1

You can first use intuition from polynomials whose coefficients are real numbers. Being everywhere continuous, a polynomial maps the compact set $[-1,1]$ to a bounded set. But the image of the same set (minus the origin) under $f(x)= \frac1x$ is unbounded. So we can see that over the reals, $1/x$ is not a polynomial.

Now take any field of characteristic $0$. One can see that if $\alpha$ is any root of $g(x)$, then it must also be a root of the product polynomial $f(x)g(x)$. Now assume $f(x)=\frac1x$ is a polynomial and multiply it by the second polynomial $g(x)=x(x-1)$. Their product is the polynomial $x-1$, which has just one root, namely $1$, whereas $g(x)$ has two roots, $0$ and $1$. This contradiction settles the issue.
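
For a quick sanity check of this argument, here is a short SymPy sketch (my own addition, purely illustrative): cancelling $1/x$ against $g(x) = x(x-1)$ produces $x - 1$, which has lost the root at $0$ that any product of $g$ with a genuine polynomial would have to keep.

```python
# Illustrative sketch only (assumes SymPy).
# Multiplying g(x) = x(x-1) by 1/x yields x - 1, which no longer has the
# root 0 -- yet every root of g must survive in f*g when f is a polynomial.
from sympy import symbols, cancel, roots

x = symbols('x')
g = x * (x - 1)
product = cancel(g * (1 / x))   # simplifies to x - 1

print(product)                  # x - 1
print(roots(g))                 # {0: 1, 1: 1}
print(roots(product))           # {1: 1} -- the root 0 has disappeared
```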

  • To the person who downvoted: any reason for downvoting? If the answer is outright wrong then there is no need to give an explanation. I have given valid points. In case I made some subtle error, some explanation will help both the OP and me if you point it out. – P Vanchinathan Feb 05 '19 at 05:33