
The eigenvalues of $AB$ and of $BA$ are identical for all square $A$ and $B$. I have proved this in a simple way:

If $ABv = \lambda v$ with $v \neq 0$, then $BAw = \lambda w$, where $w = Bv$. Thus, as long as $w \neq 0$, it is an eigenvector of $BA$ with eigenvalue $\lambda$. The remaining case $w = Bv = 0$ is easily handled: then $\lambda v = ABv = 0$, so $\lambda = 0$, and $\det(BA) = \det(B)\det(A) = \det(AB) = 0$ shows that $0$ is an eigenvalue of $BA$ as well.
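As a quick numerical sanity check (assuming NumPy is available; this is an illustration, not a proof), one can confirm on random matrices that $AB$ and $BA$ have the same eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# AB and BA share eigenvalues; sort both spectra for elementwise comparison.
eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

assert np.allclose(eig_AB, eig_BA)
```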

But I have seen a proof of the same result in a book (J. H. Wilkinson, *The Algebraic Eigenvalue Problem*, Clarendon Press, Oxford, 1965) on page 54, done in a different way, and I am unable to understand the matrix construction at the start of the argument, that is, how they got the idea of the matrices $$\begin{bmatrix}I&O\\-B&\mu I\end{bmatrix}$$

and $$\begin{bmatrix}\mu I&A\\B&\mu I\end{bmatrix}$$

For ready reference I am attaching a transcription of the proof:

Eigenvalues of $AB$

  1. Notice, however, that the eigenvalues of $AB$ and of $BA$ are identical for all square $A$ and $B$. The proof is as follows. We have $$\begin{bmatrix}I&O\\-B&\mu I\end{bmatrix}\begin{bmatrix}\mu I&A\\B&\mu I\end{bmatrix}=\begin{bmatrix}\mu I&A\\O&\mu^2I-BA\end{bmatrix}\tag{51.1}$$ and $$\begin{bmatrix}\mu I&-A\\O&I\end{bmatrix}\begin{bmatrix}\mu I&A\\B&\mu I\end{bmatrix}=\begin{bmatrix}\mu^2I-AB&O\\B&\mu I\end{bmatrix}.\tag{51.2}$$

    Taking determinants of both sides of $(51.1)$ and $(51.2)$ and writing $$\begin{bmatrix}\mu I&A\\B&\mu I\end{bmatrix}=X,\tag{51.3}$$ we have $$\mu^n\det(X)=\mu^n\det(\mu^2I-BA)\tag{51.4}$$ and $$\mu^n\det(X)=\mu^n\det(\mu^2I-AB).\tag{51.5}$$

    Equations $(51.4)$ and $(51.5)$ are polynomial identities in $\mu$, so we may cancel the common factor $\mu^n$; both sides are then polynomials in $\mu^2$, and writing $\mu^2=\lambda$ we have $$\det(\lambda I-BA)=\det(\lambda I-AB),\tag{51.6}$$ showing that $AB$ and $BA$ have the same eigenvalues.
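Assuming NumPy is available, here is a small numerical check of the two block factorizations and of the resulting determinant identity. Note the sign in the second left factor: for the $(1,1)$ block of the product to come out as $\mu^2 I - AB$ (as the determinant step requires), that factor must be $\begin{bmatrix}\mu I&-A\\O&I\end{bmatrix}$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
mu = 1.3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)
O = np.zeros((n, n))

# X is the 2n x 2n matrix from (51.3).
X = np.block([[mu * I, A], [B, mu * I]])

# (51.1): block-row operation reducing X to block upper triangular form.
left1 = np.block([[I, O], [-B, mu * I]])
assert np.allclose(left1 @ X, np.block([[mu * I, A], [O, mu**2 * I - B @ A]]))

# (51.2): the -A block makes the (1,1) block of the product mu^2 I - AB.
left2 = np.block([[mu * I, -A], [O, I]])
assert np.allclose(left2 @ X, np.block([[mu**2 * I - A @ B, O], [B, mu * I]]))

# Taking determinants gives (51.4) and (51.5); equating them yields (51.6)
# with lam = mu**2.
lam = mu**2
assert np.isclose(np.linalg.det(lam * I - A @ B),
                  np.linalg.det(lam * I - B @ A))
```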

User8976

1 Answer


The key is to realize that you multiply partitioned matrices together in exactly the same manner as you do "regular" matrix-matrix multiplication. The only difference is that multiplication is no longer commutative, because you are dealing with submatrices rather than scalars. I shall demonstrate. Let $A$ and $B$ be partitioned conformally, i.e. \begin{equation} A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}, \quad B = \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix} \end{equation} where $A_{ij}$ and $B_{ij}$ are matrices in their own right. "Conformally" means that all the subsequent products are defined. Then \begin{equation} A B = \begin{pmatrix} A_{11} B_{11} + A_{12} B_{21} & A_{11} B_{12} + A_{12} B_{22} \\ A_{21} B_{11} + A_{22} B_{21} & A_{21} B_{12} + A_{22} B_{22} \end{pmatrix} \end{equation} This is how equations (51.1) and (51.2) were derived.
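A minimal sketch of this blockwise rule, assuming NumPy (`np.block` assembles the partitioned matrices), checking the $2\times 2$-block formula against ordinary matrix multiplication:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2
A11, A12, A21, A22 = (rng.standard_normal((n, n)) for _ in range(4))
B11, B12, B21, B22 = (rng.standard_normal((n, n)) for _ in range(4))

A = np.block([[A11, A12], [A21, A22]])
B = np.block([[B11, B12], [B21, B22]])

# Same formula as scalar 2x2 multiplication, with @ instead of *,
# and the order inside each product preserved (blocks don't commute).
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
assert np.allclose(A @ B, AB_blocks)
```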

Carl Christian