33

It's a simple exercise to show that two similar matrices have the same eigenvalues and eigenvectors (my favorite way is noting that they represent the same linear transformation in different bases).

However, to show that two matrices have the same characteristic polynomial, it does not suffice to show that they have the same eigenvalues and eigenvectors - one needs to say something smart about the algebraic multiplicities of the eigenvalues. Moreover, we might be working over a field which is not algebraically closed, and hence some of the eigenvalues are simply "not there". This can be overcome, of course, by passing to the algebraic closure of the field, but it complicates the explanation.

I'm looking for a proof that is as simple and self-contained as possible (the goal is writing an expository article about the subject, so clarity is the most important thing, not efficiency).

Gadi A
  • 19,265
  • 3
    How do you prove that two similar matrices have the same eigenvectors? – Manos Dec 02 '11 at 22:23
  • Take the matrix $A=\operatorname{diag}(2,1)$. Then this is similar to $B=\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \operatorname{diag}(2,1) \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}=\begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix}$. Now, $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ is an eigenvector of $A$ but not of $B$ (a quick check of this example appears after the comment thread). – Manos Dec 02 '11 at 22:29
  • 2
    My point is that similar matrices do not have in general identical eigenvectors. – Manos Dec 02 '11 at 22:37
  • 1
    Indeed, this is just plain wrong; what is correct is that two similar matrices can be viewed as representing the same linear transformation in different bases, and then their eigenvectors are "the same" in the sense that they are two representations (in the different bases) of the coordinates of the eigenvectors of the transformation. – Gadi A Dec 03 '11 at 12:52
  • In finite dimensions, similar matrices are isospectral. Their eigenvectors are not the same. A matrix and its eigenvectors mutually undergo a basis transformation simultaneously as you go to the similar matrix. – penovik Mar 06 '21 at 18:28
  • Is there maybe an elegant proof in the opposite direction: that equality of characteristic polynomials implies the existence of a similarity matrix? – Jarek Duda Oct 08 '22 at 12:58
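
A quick numeric check of the counterexample from the comments above, as a minimal sympy sketch (the matrices are exactly those from Manos's comment):

```python
from sympy import Matrix

A = Matrix([[2, 0], [0, 1]])  # A = diag(2, 1)
T = Matrix([[1, 1], [0, 1]])
B = T * A * T.inv()           # B = [[2, -1], [0, 1]], similar to A

e2 = Matrix([0, 1])
print(A * e2)  # Matrix([[0], [1]]): e2 is an eigenvector of A (eigenvalue 1)
print(B * e2)  # Matrix([[-1], [1]]): not a multiple of e2, so not an eigenvector of B
```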

2 Answers

73

If you define the characteristic polynomial of a matrix $A$ to be $\det(xI - A)$, then for $M$ invertible we have:

$$\begin{aligned}
\det(xI - M^{-1} A M) &= \det(M^{-1} (xI) M - M^{-1} A M) \\
&= \det(M^{-1} (xI - A) M) \\
&= \det(M^{-1}) \det(xI - A) \det(M) \\
&= \det(xI - A).
\end{aligned}$$

(The first step uses $xI = M^{-1} (xI) M$, which holds because the scalar matrix $xI$ commutes with every matrix.)
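
To see the identity in action, here is a minimal sympy sketch checking $\det(xI - M^{-1}AM) = \det(xI - A)$ on a concrete pair of similar matrices (the specific matrices are chosen only for illustration):

```python
from sympy import symbols, eye, Matrix

x = symbols('x')
A = Matrix([[2, 0], [0, 1]])
M = Matrix([[1, 1], [0, 1]])
B = M.inv() * A * M  # B is similar to A

p_A = (x * eye(2) - A).det().expand()
p_B = (x * eye(2) - B).det().expand()
print(p_A)         # x**2 - 3*x + 2
print(p_A == p_B)  # True: similar matrices share the characteristic polynomial
```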

lhf
  • 216,483
  • That was simple. Very simple. Thank you! – Gadi A Dec 02 '11 at 11:03
  • 7
    This proof should be standard in any text, in order to even define the characteristic polynomial of a vector space endomorphism (as opposed to that of a matrix). Certainly you don't want to have to refer to eigenvalues and algebraic multiplicities in order to define the characteristic polynomial of an endomorphism. – Marc van Leeuwen Dec 02 '11 at 11:07
  • @Marc, how would you define determinants of general vector space endomorphisms? – hmakholm left over Monica Dec 02 '11 at 12:24
  • 3
    @Henning Makholm: Maybe my comment was not so clear. I would define the characteristic polynomial of a matrix in the usual way, then prove that it is invariant under similitude, which allows defining the characteristic polynomial of a vector space endomorphism as that of its matrix in any basis. One can define the determinant of general vector space endomorphisms without using bases, but I don't think that is very useful for characteristic polynomials, since there one needs a determinant over $K[X]$, not over a field. – Marc van Leeuwen Dec 02 '11 at 12:37
  • @Marc, then your remarks are implicitly limited to finite-dimensional vector spaces, or am I missing something? – hmakholm left over Monica Dec 02 '11 at 15:27
  • @Henning Makholm: Yes I was assuming finite dimensional, I thought that was clear in the context of characteristic polynomials. The point of (my use of) the term endomorphism is just that it doesn't require a basis. It's the same as linear transformation the OP mentions, except that it also indicates from a space to itself. Sorry if my terminology confused you. – Marc van Leeuwen Dec 02 '11 at 16:36
  • 3
    I don't understand where the $M^{-1}xIM$ comes from in the second step of the proof. – Robert S. Barnes Jun 24 '12 at 06:07
  • 3
    @RobertS.Barnes, the scalar matrix $xI$ commutes with $M$, so $M^{-1}(xI)M = xI$. – lhf Jun 24 '12 at 12:11
  • 1
    So cool @lhf -- thanks :-) – User001 Nov 20 '15 at 05:29
  • @MarcvanLeeuwen Won't the simplest construction work? K naturally embeds in K[X], and any K-module can be enlarged to a K[X]-module (by taking the tensor product, I'm not sure), and an endomorphism A induces an endomorphism of this K[X]-module, and then the determinant of $Ix-A$ will be the characteristic polynomial. – lisyarus Oct 21 '16 at 17:30
  • Just a subtle point that I think might go unnoticed: We are talking about the equality of polynomials in $K[x]$ and not the polynomial functions. – Atom Oct 27 '22 at 06:09
1

(Too long for a comment.)

In lhf's proof, $x$ might be mistaken for a "general element of $R$." (I am taking matrices over a commutative ring $R$.)

Some of the things that are easy to overlook in their proof:

  1. The characteristic polynomial is a formal polynomial, not a polynomial function, so the equality is really to be checked in $R[x]$. (Note that the same polynomial function can be induced by different polynomials; see the sketch after this list.)

  2. But the proof above remains the same: you can again "pull out" $M^{-1}$ and $M$ in $XI - M^{-1}AM$, this time working in $R^{n\times n}[X]$. (I use $X$, and not $x$, deliberately, for clarity.) But this needs care.

  3. Commutativity is required so that the determinant has the usual properties.
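
To illustrate point 1, here is a minimal sympy sketch (the specific example, $x^2 + x$ over the two-element field, is my own choice): equal polynomial functions need not come from equal polynomials.

```python
from sympy import Poly, symbols

x = symbols('x')
p = Poly(x**2 + x, x, modulus=2)  # the polynomial x**2 + x over the field F_2

# The function induced on F_2 = {0, 1} is identically zero...
print(all(p.eval(t) == 0 for t in (0, 1)))  # True
# ...but p is not the zero element of F_2[x]: its coefficients are nonzero.
print(p.is_zero)                            # False
```

So the characteristic polynomial in $R[x]$ carries more information than the function $r \mapsto \det(rI - A)$.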

Atom
  • 3,905