
Let $A$ be a matrix from $\mathbb{M}_{n \times n}(F)$ and $f(x) \in F[x]$.

How does one prove the following: $f(A)$ is invertible iff $\gcd(m_A,f)=1$, where $m_A$ is the minimal polynomial of $A$?

Thanks.

Jonas Meyer

1 Answer


If $\gcd(m_A,f)=p$, then there are polynomials $g$, $h$ such that $g\,m_A + h\,f = p$. Plugging $A$ into this equation is legitimate because scalar matrices and powers of $A$ all commute, so evaluation at $A$ is a ring homomorphism $F[x]\to F[A]$; since $m_A(A)=0$, we get $h(A)f(A)=p(A)$. If $p=1$, then $h(A)f(A)=I$, and thus $f(A)$ is invertible.
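For a concrete illustration (a toy example of mine, not from the question): take $A=\begin{pmatrix}0&1\\0&0\end{pmatrix}$, so $m_A(x)=x^2$, and let $f(x)=x+1$. The Bézout identity and its evaluation at $A$ read
$$1\cdot x^2+(1-x)(x+1)=1 \quad\leadsto\quad (I-A)(I+A)=I-A^2=I,$$
so $f(A)=I+A$ is invertible with inverse $h(A)=I-A$.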

If $p\neq 1$, then (as $p$ divides $f$) $f=pq$ for some polynomial $q$, i.e. $f(A)=p(A)q(A)$. If $f(A)$ is invertible, then so are $p(A)$ and $q(A)$. Write $m_A=pr$; then $0=m_A(A)=p(A)r(A)$, and as $p(A)$ is invertible, $r(A)=0$. But $\deg r<\deg m_A$ (since $\deg p\geq 1$), contradicting the minimality of $m_A$.
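If you want to check both directions computationally, here is a small SymPy sketch (my own illustration; the matrix `A`, the polynomials, and the helper `eval_poly_at_matrix` are choices made for this example, not anything from the question):

```python
from sympy import Matrix, Poly, eye, gcdex, symbols, zeros

x = symbols('x')

def eval_poly_at_matrix(expr, A):
    """Evaluate a polynomial in x at the square matrix A by Horner's rule.
    Scalar coefficients commute with powers of A, so this is well defined."""
    n = A.shape[0]
    result = zeros(n, n)
    for c in Poly(expr, x).all_coeffs():  # coefficients, highest degree first
        result = result * A + c * eye(n)
    return result

A = Matrix([[0, 1], [0, 0]])  # nilpotent, minimal polynomial m_A(x) = x^2
m_A = x**2

# Direction 1: gcd(m_A, f) = 1  =>  f(A) invertible, with inverse h(A).
f = x + 1
g, h, p = gcdex(m_A, f, x)    # g*m_A + h*f = p = gcd(m_A, f)
assert p == 1
assert eval_poly_at_matrix(h, A) * eval_poly_at_matrix(f, A) == eye(2)

# Direction 2: a common factor (here gcd = x) forces f(A) to be singular.
f2 = x * (x + 1)
assert gcdex(m_A, f2, x)[2] != 1
assert eval_poly_at_matrix(f2, A).det() == 0
```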

user8268
  • In general, you cannot just "plug in" matrices into polynomials; the evaluation map is not a ring homomorphism for noncommutative rings (that is, if $f(t)=g(t)h(t)$, it is not generally true that $f(a)=g(a)h(a)$ when $a$ is an element of a noncommutative ring). You need to be very careful when you talk about "plugging in" matrices. See http://math.stackexchange.com/questions/4437/is-the-proof-of-this-lemma-really-necessary/4443#4443 You need to explicitly note that $A$ commutes with the coefficients and with itself, so that you can work in $F[A]$, which is commutative. – Arturo Magidin Mar 26 '11 at 20:28
  • @user8268: thank you very much. It was very helpful. –  Mar 26 '11 at 21:54
  • @AM: I would rather say that what you write is necessary e.g. for the definition of minimal polynomial. Without it the statement of the problem cannot be understood (it even contains $f(A)$, i.e. "$A$ plugged into $f$"). That's why I don't think it should be a part of the solution. – user8268 Mar 26 '11 at 22:00
  • @user8268: You misunderstand. It is perfectly fine to "plug in" matrices into polynomials. That's not a problem. The problem arises when you start with a polynomial identity like $f(x) = p(x)q(x)+r(x)$, and then you claim that from this identity it follows that $f(A) = p(A)q(A)+r(A)$. Both sides make sense, but it is not a given that they evaluate to the same thing when things are not all commutative. For example, over the quaternions, you have that $x^2+1 = (x-i)(x+i)$. And you can certainly evaluate both $x^2+1$ and $(x-i)(x+i)$ at $x=j$. But the results you get are not equal (see the computation written out after this thread). – Arturo Magidin Mar 26 '11 at 23:03
  • @user8268: (cont). So to go from "$g\,m_A + h\,f = p$" to "$g(A)m_A(A) + h(A)f(A) = p(A)$" you need to justify it somehow; the equality of polynomials does not guarantee the equality of evaluations a priori. You can compute both sides, but you cannot simply jump to the two sides being equal when dealing with non-commutative things (like matrices). – Arturo Magidin Mar 26 '11 at 23:05
  • @user8268: (cont) Basically, the argument you use can be used to give a notoriously fallacious "proof" of the Cayley-Hamilton Theorem: "since the characteristic polynomial of $A$ is $\det(A-tI)$, then plugging in $A$ for $t$ gives $\det(A-A) = \det(0)=0$." But that proof is invalid. – Arturo Magidin Mar 26 '11 at 23:15
  • @Arturo: I was perhaps not clear enough: you need to know the same thing if you want to prove that the minimal polynomial exists. – user8268 Mar 27 '11 at 06:26
  • @user8268: Depends on exactly how you prove it exists, but if you want to use the obvious route, then yes, you do; still, it is worth mentioning here that, because all the matrices that occur (scalar matrices and powers of $A$) commute with one another, there is no problem in this instance, in my humble opinion. (There's a way of defining the minimal polynomial that does not require it, by making $V$ into an $F[x]$-module directly). – Arturo Magidin Mar 27 '11 at 18:23
  • @Arturo: you are certainly right. It's just that I prefer concise solutions (which is not always a good thing). Anyway, sorry for this long discussion. – user8268 Mar 27 '11 at 20:44
  • @user8268: No problem. – Arturo Magidin Mar 27 '11 at 21:25
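To spell out the quaternion computation referenced in the comments above (using the standard relations $i^2=j^2=-1$, $ij=k$, $ji=-k$): evaluating the two sides of $x^2+1=(x-i)(x+i)$ at $x=j$ gives
$$j^2+1 = 0, \qquad (j-i)(j+i) = j^2 + ji - ij - i^2 = -k - k = -2k \neq 0,$$
so the factored form and the original polynomial genuinely disagree at $j$, because $j$ does not commute with the coefficient $i$.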