
How can I show that $\mathrm{adj} (AB) = \mathrm{adj}(B)\ \mathrm{adj}(A)$? It is obvious when the determinants are non-zero, but if either matrix is singular, I just don't see it.

Update: I've tried simply expanding the $(i,j)$-th entry on both sides, but it turns out to be (unnecessarily) complicated and I couldn't finish the proof.
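For a quick sanity check that the identity really does survive singularity (only an illustrative sketch; the singular matrices below are arbitrary examples, and `adjugate` is sympy's built-in adjugate):

```python
from sympy import Matrix

# Two singular 3x3 matrices: A has proportional rows, B has a zero column.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 1, 5]])
B = Matrix([[0, 1, 0],
            [0, 0, 1],
            [0, 1, 1]])
assert A.det() == 0 and B.det() == 0

# The identity adj(A*B) == adj(B)*adj(A) still holds entrywise.
print((A * B).adjugate() == B.adjugate() * A.adjugate())  # True
```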

Akiiino
    The singular case follows automatically, since it's a polynomial identity in the matrix entries. See for example Bill Dubuque's answer here for similar situations: http://math.stackexchange.com/a/162978/1242. – Hans Lundmark Apr 20 '16 at 09:29
  • @HansLundmark I've seen such an approach on some website, though I don't understand why it works. For example, I could use the same approach to "show" that $AB$ is similar to $BA$ for all $A, B$: $BA = A^{-1}(AB)A$. But look at the example $A = \begin{pmatrix}1&0\\0&0\end{pmatrix}, B = \begin{pmatrix}0&1\\0&0\end{pmatrix}$: then $AB = \begin{pmatrix}0&1\\0&0\end{pmatrix}, BA = \begin{pmatrix}0&0\\0&0\end{pmatrix}$, and those are definitely not similar! – Akiiino Apr 20 '16 at 09:45
  • Well, the statement "there exists an invertible matrix $X$ such that $BA=X^{-1}ABX$" is not a polynomial identity in the entries of $A$ and $B$... – Hans Lundmark Apr 20 '16 at 11:19
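To make the distinction in this comment thread concrete: each entry of $\operatorname{adj}(AB) - \operatorname{adj}(B)\operatorname{adj}(A)$ is a polynomial in the $2n^2$ entries of $A$ and $B$, and the identity asserts that these polynomials are identically zero; that is why proving it for invertible matrices suffices. By contrast, "$AB$ is similar to $BA$" asserts the existence of an invertible $X$, which is not an entrywise polynomial identity. A symbolic check for $n = 2$ (a sketch using sympy; larger $n$ works the same way, just slower):

```python
from sympy import Matrix, symbols, expand

n = 2
# Matrices whose entries are independent indeterminates.
A = Matrix(n, n, symbols(f'a0:{n*n}'))
B = Matrix(n, n, symbols(f'b0:{n*n}'))

# Every entry of the difference expands to the zero polynomial,
# so the identity holds over any commutative ring.
diff = (A * B).adjugate() - B.adjugate() * A.adjugate()
print(all(expand(entry) == 0 for entry in diff))  # True
```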

1 Answer


Tooting my own horn, but this is Exercise 6.33 in my Notes on the combinatorial fundamentals of algebra. (The numbering might change in the future; the above refers to the numbering used in the frozen version of 10 January 2019.) The solution I give makes no assumptions on the base ring and uses no "polynomial tricks".

That said, you should easily be able to find this proof with one hint: the Cauchy-Binet formula.
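To sketch how the hint unfolds (my paraphrase, not a quote from the notes): use the convention $\operatorname{adj}(M)_{i,j} = (-1)^{i+j}\det\left(M_{\sim j,\sim i}\right)$, where $M_{\sim j,\sim i}$ is $M$ with row $j$ and column $i$ removed. Deleting row $j$ and column $i$ from $AB$ leaves the product of $A$ without its $j$-th row and $B$ without its $i$-th column, so Cauchy-Binet (applied to this $(n-1)\times n$ times $n\times(n-1)$ product, whose $(n-1)$-element column subsets of $\{1,\dots,n\}$ are exactly the complements of the singletons $\{k\}$) gives
$$\operatorname{adj}(AB)_{i,j} = (-1)^{i+j}\sum_{k=1}^{n} \det\left(A_{\sim j,\sim k}\right)\det\left(B_{\sim k,\sim i}\right) = \sum_{k=1}^{n} \operatorname{adj}(B)_{i,k}\,\operatorname{adj}(A)_{k,j} = \left(\operatorname{adj}(B)\operatorname{adj}(A)\right)_{i,j},$$
where the last step splits $(-1)^{i+j} = (-1)^{i+k}(-1)^{k+j}$.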

Nevertheless, you should learn the "polynomial trick" in one of its many forms (or, better, in several). A good source to start is Keith Conrad's Universal Identities. For this specific problem, however, an even simpler version of the trick suffices: Let $R$ be the commutative ring with identity over which your matrices are defined. Consider the matrices $A + XI_n$ and $B + XI_n$ over the polynomial ring $R\left[X\right]$. These two matrices are not necessarily invertible, but they are "almost as good": they can be cancelled from equalities! Indeed, $\det\left(A + XI_n\right)$ is a monic polynomial (of degree $n$), and so it can be cancelled (e.g., if two matrices $U$ and $V$ over $R\left[X\right]$ satisfy $\det\left(A + XI_n\right) U = \det\left(A + XI_n\right) V$, then $U = V$). Therefore, the matrix $A + XI_n$ itself can also be cancelled (since its multiple $\left(A + XI_n\right) \operatorname{adj}\left(A + XI_n\right) = \det\left(A + XI_n\right) I_n$ can be cancelled). Similarly, the matrix $B + XI_n$ can be cancelled. Now, you can take your method that requires $\det A$ and $\det B$ to be cancellable (I bet this is what it requires; not invertibility), and apply it to $A + XI_n$ and $B + XI_n$. Thus, you get $\operatorname{adj}\left(\left(A + XI_n\right) \left(B + XI_n\right)\right) = \operatorname{adj}\left(B + XI_n\right) \operatorname{adj}\left(A + XI_n\right)$. Now, substitute $0$ for $X$, and watch all $X$'s disappear. In more details, this proof of $\operatorname{adj}\left(AB\right) = \operatorname{adj} B \operatorname{adj} A$ can be found in the proof of Theorem 5.10 in my The trace Cayley-Hamilton theorem.