12

Show that if $A,B \in M_{n \times n}(K)$, where $K=\mathbb{R}$ or $\mathbb{C}$, then the matrices $AB$ and $BA$ have the same eigenvalues.

Here is my attempt:

Let $\lambda$ be an eigenvalue of $B$ with eigenvector $v\neq 0$. Then

$ABv=A\lambda v=\lambda Av=BAv$.

The third equality is valid because $Av$ is an eigenvector of $B$. Am I doing it right?

user26857
  • 52,094
Mateusz
  • 852
  • 2
    It looks correct and the right approach. – DonAntonio Jun 05 '14 at 16:43
  • 3
    @DonAntonio The proof isn't correct. –  Jun 05 '14 at 16:47
  • 1
    I can't see any proof, let alone whether it is correct or not, @user63181. Yet I can see now that I misread, and the OP was attempting something that doesn't seem to help him prove what he asked. – DonAntonio Jun 05 '14 at 16:49
  • I think the OP was attempting the argument given here http://math.stackexchange.com/a/124903/49610

    As for showing that $AB$ and $BA$ have the same characteristic polynomial, there's a beautiful, purely algebraic proof, in an answer I can't find for the life of me, using the fact that $$ \begin{pmatrix}AB-t & 0 \\ 0 & -t\end{pmatrix}, \quad \begin{pmatrix}BA-t & 0 \\ 0 & -t \end{pmatrix} $$ can be cleverly factored as products of the same two matrices.

    – Branimir Ćaćić Jun 05 '14 at 17:17
  • Also have a look at the following interesting blog post by Qiaochu Yuan: http://qchu.wordpress.com/2012/06/05/ab-ba-and-the-spectrum/#comment-4230 – user50948 Jun 05 '14 at 19:18
  • A closely related question (where it is assumed that $B$ is invertible - otherwise it would be a duplicate). – Jyrki Lahtonen Jun 05 '14 at 21:07

6 Answers

23

Here is a proof similar to what the OP has tried:

Let $\lambda$ be any eigenvalue of $AB$ with corresponding eigenvector $x$. Then

$$ABx = \lambda x \Rightarrow \\ BABx = B\lambda x \Rightarrow\\ BA(Bx) = \lambda (Bx) $$

which implies that $\lambda$ is an eigenvalue of $BA$ with a corresponding eigenvector $Bx$, provided $Bx$ is non-zero. If $Bx = 0$, then $ABx = 0$ implies that $\lambda = 0$.

Thus, $AB$ and $BA$ have the same non-zero eigenvalues.
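As a quick numerical sanity check of this argument (my addition, not part of the original proof; the matrices are arbitrary random samples):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# AB and BA have the same eigenvalues (all nonzero here, since
# random A and B are almost surely invertible).
print(np.allclose(np.sort_complex(np.linalg.eigvals(A @ B)),
                  np.sort_complex(np.linalg.eigvals(B @ A))))  # True

# If ABx = lam * x, then BA(Bx) = lam * (Bx).
lam, X = np.linalg.eig(A @ B)
x = X[:, 0]
print(np.allclose(B @ A @ (B @ x), lam[0] * (B @ x)))  # True
```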

M. Vinay
  • 9,004
  • Can something similar be said about singular-values? Update: https://math.stackexchange.com/questions/2448088/singular-values-of-ab-and-ba-matrices – LBogaardt Sep 27 '17 at 20:08
  • Your proof has no interest because you don't show the equality of multiplicities of the eigenvalues of $AB,BA$. Moreover I don't see why the eigenvalue $0$ is a problem: if $\det(AB)=0$, then $\det(BA)=0$ too. –  Apr 13 '18 at 09:40
  • @loupblanc My primary aim was to give a proof similar to the one OP attempted. About the zero eigenvalue problem, yes, if $A$ and $B$ are square matrices, that's true. But the proof I've shown is more general and works even if $A$ and $B$ are rectangular such that $AB$ and therefore $BA$ are square. – M. Vinay Apr 14 '18 at 11:57
  • Note that since the OP gave the green chevron to a complete solution, it's because he (she) understood he was on a wrong path. Note also that showing the complete version of the result is natural because, generically, $AB$ and $BA$ are similar. The purpose of my post was not to annoy you but rather to point out that in the problems where this result is used, one always needs to consider the full spectrum (with multiplicities). Then, I propose that those who upvoted your post should not be allowed to use the considered result in its complete form. –  Apr 14 '18 at 13:01
  • Let $A=\begin{bmatrix}0&1\\0&0\end{bmatrix}$, $B=\begin{bmatrix}0&0\\0&1\end{bmatrix}$. Then $AB = A \ne 0$, but $BA = 0$. So $AB$ and $BA$ are not always similar. Nevertheless, you are right that they will always have the same spectrum. I will attempt to update my answer with a proof of this (one that has not been given in any of the other answers). I think the best proof is Alternative proof #2 in https://math.stackexchange.com/a/822198/152030. – M. Vinay Apr 15 '18 at 04:07
  • Your $A,B$ are not generic and this counter-example is well-known.. "property $P$ is generically true " means (roughly speaking) that if the variables $A,B$ are randomly chosen, then the property $P(A,B)$ is true with probability $1$. –  Apr 15 '18 at 09:11
  • @loupblanc Ah, I did not know that terminology. Thank you. So it's like "for almost all $A$ and $B$", but probabilistically. – M. Vinay Apr 16 '18 at 07:12
  • More generally, one uses in algebraic geometry the notion of a Zariski closed set (a small set defined by algebraic relations). Here, if $A$ is invertible, then $AB,BA$ are similar; the non-invertible $A$ are in the "small set" defined by the algebraic relation $\det(Z)=0$. Then $P(A,B)$ is true except when the variables are in some Zariski closed set (a small set of measure $0$). –  Apr 16 '18 at 10:55
16

It suffices to show that $AB$ and $BA$ have the same characteristic polynomial. First assume that $A$ is invertible. Then

$$\chi_{AB}(x)=\det(AB-xI)=\det A\det(B-xA^{-1})\\=\det(B-xA^{-1})\det A=\det(BA-xI)=\chi_{BA}(x).$$ Now, since $\operatorname{GL}_n(K)$ is dense in $\operatorname{M}_n(K)$, there is a sequence of invertible matrices $(A_k)$ converging to $A$, and by the continuity of the $\det$ function we have $$\chi_{AB}(x)=\det(AB-xI)=\lim_{k\to\infty}\det(A_kB-xI)=\lim_{k\to\infty}\det(BA_k-xI)\\=\det(BA-xI)=\chi_{BA}(x).$$
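As a sanity check (my addition, not part of the original answer), SymPy confirms the equality of characteristic polynomials exactly, even for a singular $A$ where the invertible-case computation does not directly apply; the sample matrices are arbitrary choices:

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[0, 1, 0], [0, 0, 1], [0, 0, 0]])  # nilpotent, hence singular
B = sp.Matrix([[2, 0, 1], [1, 1, 0], [0, 5, 3]])
I = sp.eye(3)

# Characteristic polynomials of AB and BA, computed exactly.
chi_AB = sp.expand((A * B - x * I).det())
chi_BA = sp.expand((B * A - x * I).det())
print(chi_AB == chi_BA)  # True
```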

  • 11
    While this proof works, I think there is a little bit too much machinery at play here, as the original problem is probably intended for a student who has just started learning about eigenvalues. – Christopher A. Wong Jun 05 '14 at 21:13
12

Alternative proof #1:

If $n\times n$ matrices $X$ and $Y$ are such that $\mathrm{tr}(X^k)=\mathrm{tr}(Y^k)$ for $k=1,\ldots,n$, then $X$ and $Y$ have the same eigenvalues.

See, e.g., this question.

Using $\mathrm{tr}(UV)=\mathrm{tr}(VU)$, it is easy to see that $$ \mathrm{tr}[(AB)^k]=\mathrm{tr}(\underbrace{ABAB\cdots AB}_{\text{$k$-times}}) =\mathrm{tr}(\underbrace{BABA\cdots BA}_{\text{$k$-times}})=\mathrm{tr}[(BA)^k]. $$ Now use the above with $X=AB$ and $Y=BA$.
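Here is a quick numerical illustration of the trace identity (my addition; the random matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# tr[(AB)^k] = tr[(BA)^k] for k = 1, ..., n, by cyclicity of the trace.
for k in range(1, n + 1):
    t_ab = np.trace(np.linalg.matrix_power(A @ B, k))
    t_ba = np.trace(np.linalg.matrix_power(B @ A, k))
    print(k, np.isclose(t_ab, t_ba))  # all True
```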

Alternative proof #2:

$$ \begin{bmatrix} I & A \\ 0 & I \end{bmatrix}^{-1} \color{red}{\begin{bmatrix} AB & 0 \\ B & 0 \end{bmatrix}} \begin{bmatrix} I & A \\ 0 & I \end{bmatrix} = \color{blue}{\begin{bmatrix} 0 & 0 \\ B & BA \end{bmatrix}}. $$ Since the $\color{red}{\text{red matrix}}$ and the $\color{blue}{\text{blue matrix}}$ are similar, they have the same eigenvalues. Since both are block triangular, their eigenvalues are the eigenvalues of the diagonal blocks.
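The similarity in Alternative proof #2 can be checked numerically as well (my addition; arbitrary sample matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I, Z = np.eye(n), np.zeros((n, n))

S = np.block([[I, A], [Z, I]])
M = np.block([[A @ B, Z], [B, Z]])   # the red matrix
N = np.block([[Z, Z], [B, B @ A]])   # the blue matrix

# S^{-1} M S = N, so M and N are similar.
print(np.allclose(np.linalg.solve(S, M @ S), N))  # True
```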

6

If $A$ is invertible, user63181 already showed that

$$\det(AB-xI)=\det(BA-xI)$$

We now prove this equality in general.

Fix $x$.

Let $$P_x(y):=\det[(A-yI)B-xI]-\det[B(A-yI)-xI] \,.$$

Then $P_x(y)$ is a polynomial in $y$ of degree at most $n$. Whenever $y$ is not an eigenvalue of $A$, the matrix $A-yI$ is invertible, so by the first part $P_x(y)=0$. Hence $P_x$ has infinitely many roots, and therefore $P_x \equiv 0$.

This proves that $P_x(0)=0$, which is exactly what we needed to prove.
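As a concrete check (my addition), SymPy confirms that $P_x(y)$ vanishes identically, with $x$ and $y$ kept symbolic; the sample matrices are the non-similar pair from the comments above:

```python
import sympy as sp

x, y = sp.symbols('x y')
A = sp.Matrix([[0, 1], [0, 0]])
B = sp.Matrix([[0, 0], [0, 1]])
I = sp.eye(2)

# P_x(y) = det[(A - yI)B - xI] - det[B(A - yI) - xI]
P = ((A - y * I) * B - x * I).det() - (B * (A - y * I) - x * I).det()
print(sp.expand(P))  # 0
```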

N. S.
  • 132,525
5

Here is a more "algebraic" approach than those in the other answers by user63181 and N. S., which, as far as I can see, generalizes to other fields (where the continuity argument might fail), although I thought the argument by continuity was cool!

First we use the following characterization of non-zero eigenvalues: $\lambda \neq 0$ is an eigenvalue of $AB$ iff $I - \lambda^{-1}AB$ is not invertible.

I claim the following: $I - \lambda AB$ is invertible iff $I - \lambda BA$ is invertible.

Proof: Suppose $I - \lambda AB$ is invertible, and let $U := I + \lambda B (I - \lambda AB)^{-1}A$. Multiplying out and using distributivity shows that $U$ is an inverse of $I - \lambda BA$ (verified below). The converse direction follows by letting $V := I + \lambda A(I-\lambda BA)^{-1}B$.
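Spelling out the computation the answer leaves to the reader (my addition), the key identity is $(I-\lambda BA)B = B - \lambda BAB = B(I-\lambda AB)$, which gives

$$(I-\lambda BA)\,U = (I-\lambda BA) + \lambda\,\underbrace{(I-\lambda BA)B}_{=\,B(I-\lambda AB)}\,(I-\lambda AB)^{-1}A = I - \lambda BA + \lambda BA = I,$$

and the mirror identity $A(I-\lambda BA) = (I-\lambda AB)A$ gives $U(I-\lambda BA) = I$ in the same way.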

From this it follows that $AB$ and $BA$ have the same non-zero eigenvalues.

user50948
  • 1,439
  • Your proof has no interest because you don't show the equality of multiplicities of the eigenvalues of $AB,BA$. Moreover I don't see why the eigenvalue $0$ is a problem: if $\det(AB)=0$, then $\det(BA)=0$ too. –  Apr 13 '18 at 09:41
0

Observe that there is some $v\neq0$ such that $ABv=\lambda v$ iff there are $v,w$ that solve the linear system: $$\begin{cases}Aw=\lambda v, \\Bv=w.\end{cases}\tag1$$ On the other hand, from (1), if $\lambda\neq0$, one also immediately gets $BAw=\lambda w$.

It follows that $ABv=\lambda v$ for some $\lambda\neq0$ iff $BAw=\lambda w$, where $w=Bv$ (and thus also $v=\frac1\lambda Aw$).

Example

For example, consider $$A\equiv \begin{pmatrix}1&0\\0&0\end{pmatrix}, \qquad B\equiv \frac12\begin{pmatrix}1&1\\1&1\end{pmatrix}.$$ Then $$ AB = \frac12\begin{pmatrix}1&1\\0&0\end{pmatrix}, \qquad BA = \frac12\begin{pmatrix}1&0\\1&0\end{pmatrix}. $$ Thus $AB e_1=\frac12 e_1$ and $AB(e_1-e_2)=0$, while $BA(e_1+e_2)=\frac12(e_1+e_2)$ and $BA e_2=0$. And we can check that the eigenvectors for the nonzero eigenvalue are related as spelled out above: $Be_1= \frac12(e_1+e_2)$, and $A(e_1+e_2)=e_1$.

On the other hand, this same example shows how the result fails for zero eigenvalues. It remains true that both matrices have kernels of the same dimension, but the corresponding eigenvectors are not related as in the nonzero-eigenvalue case. Here the kernel of $AB$ is spanned by $e_1-e_2$, but $B(e_1-e_2)=0$, which clearly does not give the kernel of $BA$, which is spanned by $e_2$.

I'd also observe that the reason we don't get the stated relation between eigenvectors of $AB$ and $BA$ here is that we can have situations where $ABv=0$ because $Bv=0$ (which is what happens in the example). If however $ABv=0$ but $Bv\neq0$, then we recover the result (at least partially), because $(BA)(Bv)=0$.
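A quick numerical confirmation of this example (my addition):

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])

print(np.linalg.eigvals(A @ B))  # [0.5, 0.0] (possibly in another order)
print(np.linalg.eigvals(B @ A))  # [0.5, 0.0]

# The kernel vector e1 - e2 of AB is annihilated by B,
# so it is not mapped to the kernel eigenvector of BA.
v = np.array([1.0, -1.0])
print(A @ B @ v, B @ v)  # both [0. 0.]
```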

glS
  • 6,818