
The usual proof I see of this statement goes like this:

Starting from $Ax=\lambda x$, we have

$ABx=BAx=B\lambda x=\lambda Bx$

Thus $x$ and $Bx$ are both eigenvectors of $A$ sharing the same $\lambda$ (or else $Bx=0$). If we assume for convenience that the eigenvalues of $A$ are distinct, so that the eigenspaces are one-dimensional, then $Bx$ must be a multiple of $x$. In other words, $x$ is an eigenvector of $B$ as well as of $A$.

But I don't follow this, because the eigenvalue for $Bx$ above is still $\lambda$. If we have $Bx = \frac{\lambda_2}{\lambda} x$, that just means that $ABx = \lambda_2 x$, which isn't even an eigenvalue equation for $B$. So is the above statement even true?

Striker

1 Answer


The vectors $x$ and $Bx$ both belong to the eigenspace of $A$ relative to $\lambda$. Since this eigenspace is one-dimensional, $x$ is a basis for it, so certainly $$ Bx=\mu x $$ for some scalar $\mu$ (if $Bx=0$, this holds with $\mu=0$). Therefore $x$ is an eigenvector for $B$, with eigenvalue $\mu$.
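To spell out the step that worried you: $Bx=\mu x$ *is* the eigenvalue equation for $B$, just with eigenvalue $\mu$ rather than $\lambda$. In your notation, $\mu = \lambda_2/\lambda$, and then $$ ABx = A(\mu x) = \mu\lambda x = \lambda_2 x, $$ so $\lambda_2$ is an eigenvalue of the product $AB$, not of $B$ itself.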

The statement doesn't say that $A$ and $B$ have the same eigenvalues. It says that the eigenvectors of $A$ are also eigenvectors for $B$.

The eigenvalues of $A$ and $B$ can actually be different. Consider the trivial case where $B$ is the identity matrix and $$ A=\begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} $$ Then $AB=BA$ and (obviously) every eigenvector of $A$ is an eigenvector of $B$; the converse is not true, in this case, because every nonzero vector in $\mathbb{R}^2$ is an eigenvector for the identity matrix, which is not the case for $A$.
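If you want to check this example concretely, here is a minimal NumPy sketch (not part of the original argument, just an illustration):

```python
import numpy as np

# Commuting pair from the example above: B is the identity, A is diagonal.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.eye(2)

assert np.allclose(A @ B, B @ A)  # AB = BA

# A has distinct eigenvalues (2 and 3), so its eigenspaces are one-dimensional.
eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    Bx = B @ x
    mu = x @ Bx  # x is a unit vector, so this recovers mu from Bx = mu * x
    assert np.allclose(Bx, mu * x)  # x is also an eigenvector of B
    print(f"A-eigenvalue {lam:g}, corresponding B-eigenvalue {mu:g}")
```

Running it prints $B$-eigenvalue $1$ for both eigenvectors of $A$, confirming that the shared eigenvectors carry different eigenvalues under $A$ and $B$.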

egreg
  • Thanks! I hadn't thought of this in terms of the multiplicity of the eigenvalues (and so the dimensionality of the eigenspace). That makes a lot of sense. – Striker Oct 06 '18 at 22:59