
Suppose $\mathbf A$ and $\mathbf B$ are real, symmetric matrices that commute, and suppose $\mathbf B$ has an eigenvalue $\lambda$ corresponding to eigenvector $\mathbf v$. Then $$ \mathbf {BAv} = \mathbf {ABv} = \mathbf {A} \lambda \mathbf v = \lambda \mathbf {Av},$$ so $\mathbf {Av}$ (if it is nonzero) is also an eigenvector of $\mathbf B$, with the same eigenvalue $\lambda$.

But this isn't good enough. For example, maybe (the transformation associated with) $\mathbf A$ maps $\mathbf v$ to a vector orthogonal to $\mathbf v$. In that case $\mathbf {Av}$ is still an eigenvector of $\mathbf B$, but $\mathbf A$ does not simply scale $\mathbf v$, so $\mathbf v$ is therefore not an eigenvector of $\mathbf A$.

What is missing from my proof to show that $\mathbf v$ is an eigenvector of $\mathbf A$ ?
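As a quick numerical sanity check of the displayed identity, one can verify $\mathbf B(\mathbf{Av}) = \lambda(\mathbf{Av})$ on a concrete commuting symmetric pair (the matrices below are an illustrative choice, not from the question):

```python
import numpy as np

# Two symmetric matrices of the form aI + bJ (J = the flip matrix);
# such matrices are polynomials in J and therefore commute.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[4.0, 1.0], [1.0, 4.0]])

assert np.allclose(A @ B, B @ A)        # A and B commute

lam = 5.0
v = np.array([1.0, 1.0]) / np.sqrt(2)   # eigenvector of B: Bv = 5v
assert np.allclose(B @ v, lam * v)

Av = A @ v
assert np.allclose(B @ Av, lam * Av)    # Av is again a λ-eigenvector of B
```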

Thomas Andrews
EricVonB

1 Answer


You cannot show it because it is not true. Consider

$$ A = \operatorname{diag}(1,2,3), B = \operatorname{diag}(4,4,5). $$

Then $e_1 + e_2$ is an eigenvector of $B$ associated to the eigenvalue $4$ but $e_1 + e_2$ is not an eigenvector of $A$.

The point is that if $A,B$ are simultaneously diagonalizable (as happens in the above example), it means one can find some basis $(v_1,\dots,v_n)$ consisting of vectors that are eigenvectors of both $A$ and $B$. It does not mean that every basis of eigenvectors of $B$ is a basis of eigenvectors of $A$.
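The counterexample is easy to check numerically (diagonal matrices always commute):

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0])
B = np.diag([4.0, 4.0, 5.0])
v = np.array([1.0, 1.0, 0.0])      # e1 + e2

assert np.allclose(A @ B, B @ A)   # A and B commute
assert np.allclose(B @ v, 4 * v)   # v is an eigenvector of B (λ = 4)

Av = A @ v                         # Av = (1, 2, 0) is not parallel to
cross = np.cross(Av, v)            # v = (1, 1, 0), so v is not an
assert not np.allclose(cross, 0)   # eigenvector of A
```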

levap
  • Thanks for the answer, it is enlightening. Sorry to shift the goalposts now, but I wish to restrict the question to the case where the $n \times n$ matrices $\mathbf A$ and $\mathbf B$ each have $n$ distinct eigenvalues. With this new constraint, can I conclude that the commuting matrices $\mathbf A$ and $\mathbf B$ share the same eigenvectors? Are $\mathbf A$ and $\mathbf B$ now in fact identical to each other? – EricVonB Dec 30 '16 at 01:29
  • Yes, now you can conclude that they share the same eigenvectors but not necessarily the same eigenvalues, so they don't have to be identical. You have shown that if $Bv = \lambda v$ (with $v \neq 0$) then $B(Av) = \lambda (Av)$. Now you have two options: If $Av = 0$ then $v$ is clearly an eigenvector of $A$ (associated to the eigenvalue $0$). If $Av \neq 0$ then both $v$ and $Av$ are eigenvectors of $B$ associated to the eigenvalue $\lambda$; since the $\lambda$-eigenspace of $B$ is one-dimensional (the eigenvalues of $B$ are distinct), they must be linearly dependent, and we can write $Av = \mu v$ for some $\mu$, which again implies that $v$ is an eigenvector of $A$. – levap Dec 30 '16 at 01:34
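The distinct-eigenvalue case can also be illustrated numerically. Below, two symmetric commuting matrices are built from the same (randomly chosen) orthogonal eigenbasis but with different distinct eigenvalues; this construction is my own illustration, not from the thread. Every eigenvector of $B$ is then an eigenvector of $A$, yet $A \neq B$:

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal basis

# Symmetric matrices sharing the eigenbasis Q, with distinct eigenvalues:
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([7.0, 5.0, 9.0]) @ Q.T

assert np.allclose(A @ B, B @ A)       # they commute
assert not np.allclose(A, B)           # but are not identical

# Every eigenvector of B is also an eigenvector of A:
_, V = np.linalg.eigh(B)
for i in range(3):
    v = V[:, i]
    Av = A @ v
    lam = v @ Av                       # Rayleigh quotient (v has unit norm)
    assert np.allclose(Av, lam * v)
```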