Let $f(X)\in R[X]$. Show that $r$ is a root of $f(X)$ if and only if $f(X)\in (X − r)$.

Assume $r\in R$ is a root of $f(X)$, i.e. $f(r)=0$. How do we prove that $f(X)=g(X)(X − r)$ for some $g(X)\in R[X]$, i.e. that $f(X)\in (X − r)$?

Assume $f(X)\in (X − r)$: then $f(X) = g(X)(X − r)$ for some $g(X)\in R[X]$. How do we prove $f(r)=0$?


1 Answer

Suppose $f(x)$ is in the ideal generated by $x-r$; this just means $f(x)=g(x)(x-r)$ for some polynomial $g(x)\in R[x]$. Then direct substitution shows that $f(r)=g(r)(r-r)=g(r)\cdot 0=0$.
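For a quick sanity check of this direction, take $R=\mathbb{Z}$, $r=2$, and $f(x)=(x^2+1)(x-2)$, which lies in $(x-2)$ by construction; substituting gives $$f(2)=(2^2+1)(2-2)=5\cdot 0=0,$$ exactly as the argument predicts.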

On the other hand, if $f(r)=0$, divide $f(x)$ by $x-r$ (which you can always do, over any $R$, because $x-r$ is monic) to get a quotient $q(x)$ and a remainder $m(x)$ such that $$f(x)=q(x)(x-r)+m(x).$$ Since $\deg(m(x))<\deg(x-r)=1$, the remainder $m(x)$ must be a constant $m$ (possibly $0$). Hence $$f(x)=q(x)(x-r)+m.$$ Now substitute $r$ into this identity to get $f(r)=q(r)(r-r)+m=q(r)\cdot 0+m = m$; but $f(r)=0$ by hypothesis, hence $m=0$ and therefore $$f(x)=q(x)(x-r),$$ i.e., $f(x)$ is in the ideal generated by $x-r$.
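To see the division step concretely, take $R=\mathbb{Z}$ and $r=1$. Dividing $f(x)=x^3-1$ (which has $f(1)=0$) by $x-1$ gives $$x^3-1=(x^2+x+1)(x-1)+0,$$ so the remainder vanishes and $f(x)\in(x-1)$. By contrast, dividing $x^3+1$ by $x-1$ gives $$x^3+1=(x^2+x+1)(x-1)+2,$$ with remainder $2=f(1)\neq 0$, matching the observation above that the constant remainder is always $f(r)$.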
