
I found that my two textbooks and other answers state this theorem for commutative rings:

  1. From the textbook Algebra by Saunders MacLane and Garrett Birkhoff.

[image: statement of the theorem from the textbook]

  2. From the textbook Analysis 1 by Herbert Amann and Joachim Escher.

[image: statement of the theorem from the textbook]

  3. In this answer, @Bill Dubuque also states it for a commutative ring:

For polynomials over any commutative coefficient ring, the high-school polynomial long division algorithm works to divide with remainder by any monic polynomial...

  4. In this answer, @Bill Dubuque again states it for a commutative ring:

Yes, your intuition is correct: the Polynomial Factor Theorem works over any commutative ring since we can always divide (with remainder) by a polynomial that is monic, i.e. lead coef $=1$ (or any unit = invertible element). Ditto for the equivalent Polynomial Remainder Theorem - see below.

I have re-read the proofs in my two textbooks and cannot find where commutativity is used. So, is commutativity actually needed in the proof of the division algorithm?
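The point of the quoted answers, that dividing by a monic polynomial needs no inverses, can be made concrete. Below is a minimal sketch over $\mathbb Z$ (any commutative ring's addition, subtraction and multiplication would do); the function name `divmod_monic` and the list encoding are my own illustrative choices, not from the textbooks.

```python
def divmod_monic(f, g):
    """Divide f by monic g in R[x], R a commutative ring (here R = Z).
    Polynomials are coefficient lists, constant term first.
    Since g's leading coefficient is 1, every step uses only ring
    addition, subtraction and multiplication -- no inverses needed."""
    assert g and g[-1] == 1, "divisor must be monic"
    r = list(f)
    q = [0] * max(len(f) - len(g) + 1, 0)
    for shift in range(len(f) - len(g), -1, -1):
        c = r[shift + len(g) - 1]   # leading coefficient to cancel
        q[shift] = c
        for i, gi in enumerate(g):
            r[shift + i] -= c * gi
    return q, r[:len(g) - 1]        # deg(remainder) < deg(g)

# (x + 1)(x + 2) = x^2 + 3x + 2, so dividing by x + 1 gives quotient x + 2:
assert divmod_monic([2, 3, 1], [1, 1]) == ([2, 1], [0])
# Remainder Theorem: dividing x^2 + 1 by x - 3 leaves remainder f(3) = 10:
assert divmod_monic([1, 0, 1], [-3, 1]) == ([3, 1], [10])
```

Note that the loop body never commutes two ring elements past each other, which is exactly why the question of whether commutativity is needed arises.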

Akira
    It's a bit more complex for a non-commutative coefficient ring: you have a Euclidean division on the right and another on the left. In the same way you have left and right evaluations of a polynomial at an element of the ring. – Bernard Aug 30 '20 at 20:50
    If you're interested in noncommutative generalizations then a nice application is (Ore) skew polynomials, used for working with differential and difference operators (recurrences). For an introduction, see the paper cited here. – Bill Dubuque Aug 30 '20 at 21:35

1 Answer


If $K$ is noncommutative there's a bit of ambiguity in what you mean by $K[x]$; if you mean that $x$ is central, then I suppose the argument goes through fine, but I'm not aware of any applications of it.

Note that it does not follow that polynomials over $K$ factor uniquely. As an explicit example, if $K = \mathbb{H}$ is the quaternions, then the polynomial $x^2 + 1$ admits inequivalent factorizations

$$x^2 + 1 = (x + i)(x - i) = (x + j)(x - j) = (x + k)(x - k).$$

The problem is that because we need $x$ to be central, we can't substitute any non-central element for $x$! So the usual argument where we substitute a root no longer applies.
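Both phenomena can be checked by direct computation. The sketch below (my own illustrative encoding: quaternions as 4-tuples $(a,b,c,d) = a + bi + cj + dk$, polynomials as coefficient lists with $x$ treated as central) verifies the three factorizations, and then shows that "evaluation at $j$" fails to be multiplicative on the factorization through $i$, which is exactly the failure of the substitution argument.

```python
def qmul(p, q):
    """Hamilton product of two quaternions (a, b, c, d) = a + bi + cj + dk."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

def qadd(p, q):
    return tuple(x + y for x, y in zip(p, q))

def qneg(p):
    return tuple(-x for x in p)

ZERO, ONE = (0, 0, 0, 0), (1, 0, 0, 0)
I, J, K = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)

def polymul(f, g):
    """Multiply in H[x] with x central: convolve coefficients,
    keeping the left factor's coefficient on the left."""
    h = [ZERO] * (len(f) + len(g) - 1)
    for m, fm in enumerate(f):
        for n, gn in enumerate(g):
            h[m + n] = qadd(h[m + n], qmul(fm, gn))
    return h

def qeval(f, a):
    """Evaluation sum_n c_n a^n (coefficients on the left).
    Only a ring homomorphism K[x] -> K when a is central."""
    power, total = ONE, ZERO
    for c in f:
        total = qadd(total, qmul(c, power))
        power = qmul(power, a)
    return total

# Polynomials are coefficient lists, constant term first: x^2 + 1 = [1, 0, 1].
target = [ONE, ZERO, ONE]
for u in (I, J, K):
    assert polymul([u, ONE], [qneg(u), ONE]) == target   # (x + u)(x - u)

# x^2 + 1 vanishes at the non-central element j ...
assert qeval(target, J) == ZERO
# ... yet evaluating the factors (x + i), (x - i) at j and multiplying
# gives (j + i)(j - i) = 2k != 0: evaluation at j is not multiplicative.
assert qmul(qeval([I, ONE], J), qeval([qneg(I), ONE], J)) == (0, 0, 0, 2)
```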

Qiaochu Yuan
  • Let $\mathbb N = \{0,1,\ldots\}$. I thought that the element $x:\mathbb N \to K$ is usually defined by $x_n = 1$ if $n = 1$ and $x_n = 0$ otherwise. As such, $x$ is central. – Akira Aug 30 '20 at 20:49
  • My point is that if $K$ is noncommutative then you might want $K[x]$ to refer to the free $K$-algebra on one generator; if you want $K[x]$ to mean the familiar collection of sums of the form $\sum c_n x^n, c_n \in K$ then this is the free $K$-algebra on a central generator, which is different. The free $K$-algebra on one generator is considerably less nice, but on the other hand $K[x]$ (with $x$ central) doesn't admit an evaluation homomorphism except for central elements. – Qiaochu Yuan Aug 30 '20 at 20:50
  • Please check if I understand correctly. The fact that $c$ is central in $K$ is essential in showing that $E_c:K[x] \to K, \sum_n p_n x^n \mapsto \sum_n p_n c^n$ preserves the multiplicative structure. Let $\overline f \in K^K$ be the polynomial function induced by $f \in K[x]$. In proving $x-a \mid f \implies \overline f (a)=0$, we use the division algorithm to get $f=g(x-a)+r$. Then we get $\overline {g(x-a)}(a) = \overline{g} (a) \cdot \overline{x-a} (a)$. The last equality (from the homomorphism property) only holds if $a$ is central in $K$. – Akira Aug 30 '20 at 21:14
  • Yes, that's correct. – Qiaochu Yuan Aug 30 '20 at 21:14
  • Thank you so much for opening my mind. – Akira Aug 30 '20 at 21:15
  • You're very welcome! – Qiaochu Yuan Aug 30 '20 at 21:15