
$A$ is an $m\times n$ matrix and $B$ is an $n\times m$ matrix. Show that $$ \det(I_m + AB) = \det(I_n + BA). $$ Solution: I found these factorizations: $$ \det\begin{pmatrix}I&-B\\A&I\end{pmatrix} \det\begin{pmatrix}I&B\\0&I\end{pmatrix} =\det\left[\begin{pmatrix}I&-B\\A&I\end{pmatrix}\begin{pmatrix}I&B\\0&I\end{pmatrix}\right] =\det\begin{pmatrix}I&0\\A&AB+I\end{pmatrix} =\det(I+AB) $$

and

$$ \det\begin{pmatrix}I&B\\0&I\end{pmatrix} \det\begin{pmatrix}I&-B\\A&I\end{pmatrix} =\det\left[\begin{pmatrix}I&B\\0&I\end{pmatrix}\begin{pmatrix}I&-B\\A&I\end{pmatrix}\right] =\det\begin{pmatrix}I+BA&0\\A&I\end{pmatrix} =\det(I+BA) $$

In each case the last step uses that the determinant of a block triangular matrix is the product of the determinants of its diagonal blocks. Since the two left-hand sides are the same two scalars multiplied in opposite order, they are equal, and hence $\det(I+AB)=\det(I+BA)$.

Source: Sylvester's determinant identity
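
As a quick sanity check (not part of the original post), the identity can be verified numerically for random rectangular matrices. A minimal numpy sketch; the sizes $m = 3$, $n = 5$ are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5                       # arbitrary rectangular sizes
A = rng.standard_normal((m, n))   # A is m x n
B = rng.standard_normal((n, m))   # B is n x m

lhs = np.linalg.det(np.eye(m) + A @ B)  # det(I_m + AB), an m x m determinant
rhs = np.linalg.det(np.eye(n) + B @ A)  # det(I_n + BA), an n x n determinant
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```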

  • @RobertIsrael: This doesn't look like a duplicate to me. – TonyK Jan 23 '20 at 20:23
  • Though it's certainly related, and might lead to a conjecture such as: if $m < n$, then the characteristic polynomial of $BA$ is $\lambda^{n - m}$ times the characteristic polynomial of $AB$. Then, you might even be able to prove such a conjecture by using the "well, it's true in a generic case where $AB$ has all eigenvalues of multiplicity 1, and the coefficients of both sides are polynomials in $a_{ij}, b_{ij}$, so..." type of argument. – Daniel Schepler Jan 23 '20 at 20:31
  • Quite a standard proof on wiki – A.Γ. Jan 23 '20 at 20:38
  • Since the $I$ on the left side and the $I$ on the right side are different matrices, it would be a good idea to write $I_m$ and $I_n$ or similar. – celtschk Jan 23 '20 at 20:39
  • Closely related: https://math.stackexchange.com/questions/311342/do-ab-and-ba-have-same-minimal-and-characteristic-polynomials – Arnaud D. Jan 23 '20 at 20:44
  • Why does $\det\begin{pmatrix}I&0\\A&AB+I\end{pmatrix} =\det(I+AB)$? – lmngn23 May 09 '20 at 19:01

1 Answer


If $B$ is (square and) invertible, then \begin{align*} \det (I + AB) = \det \left[B^{-1}(I + BA)B\right] = (\det B^{-1}) \det(I + BA) (\det B) = \det (I + BA). \end{align*} But $f(A, B) = \det (I + AB) - \det (I + BA)$ is a polynomial in the entries $A_{ij}$ and $B_{ij}$, and the set of invertible $B$ is dense (if $B$ is not invertible, then $B + \epsilon I$ is invertible for all sufficiently small $\epsilon > 0$). A polynomial vanishing on a dense set vanishes identically, so $f(A, B) = 0$ everywhere, as required. For the rectangular case, pad $A$ and $B$ with $0$s and reduce to the square case.

(That's assuming you're working over a ground field like $\mathbb{R}$ or $\mathbb{C}$. To make the argument work in complete generality, you'll need to either be more precise about the nature of the "dense set," embed $A$ and $B$ in a larger $GL_n(k)$ as in the wiki link above, explicitly work with something like a $QR$-decomposition of $A$ and $B$, and so on.)
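
Not from the answer itself, but a minimal numerical sketch of the padding step, assuming numpy and arbitrarily chosen sizes: zero-padding $A$ and $B$ to square $N \times N$ matrices, $N = \max(m, n)$, leaves both determinants unchanged, so the square case implies the rectangular one.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# Pad A and B with zeros to square N x N matrices.
N = max(m, n)
A_sq = np.zeros((N, N)); A_sq[:m, :n] = A
B_sq = np.zeros((N, N)); B_sq[:n, :m] = B

# A_sq @ B_sq agrees with AB on the top-left m x m block and is zero in the
# extra rows and columns, so det(I_N + A_sq B_sq) = det(I_m + AB);
# likewise det(I_N + B_sq A_sq) = det(I_n + BA).
assert np.isclose(np.linalg.det(np.eye(N) + A_sq @ B_sq),
                  np.linalg.det(np.eye(m) + A @ B))
assert np.isclose(np.linalg.det(np.eye(N) + B_sq @ A_sq),
                  np.linalg.det(np.eye(n) + B @ A))
```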

anomaly
  • When $m\ne n$, when is an $n\times m$ matrix invertible? – celtschk Jan 23 '20 at 20:42
  • @celtschk: Never, of course, but you can pad $A$ and $B$ out with $0$s and apply the same argument. (I could have sworn there was a similar or identical question posted a while back, but I can't seem to find it.) – anomaly Jan 23 '20 at 20:43
  • Another way to make the argument work in complete generality is to observe that both sides are defined for $A \in M_{m\times n}(R)$, $B \in M_{n\times m}(R)$ for any commutative ring $R$, and they're functorial. So, from truth for $R = \mathbb{C}$, you can conclude truth in $R = \mathbb{Z}[a_{ij}, b_{ij}]$, from which truth for any commutative ring $R$ follows. – Daniel Schepler Jan 23 '20 at 20:50 (a symbolic check of this idea for small sizes appears after this list)
  • Hmm, still can't seem to find it. This is the closest match I could find: https://math.stackexchange.com/questions/17831/sylvesters-determinant-identity . The idea in short is that it's easy to prove for invertible (and thus square) matrices; then for square matrices by continuity; then for arbitrary matrices by padding with $0$s and being more careful than I am here. – anomaly Jan 23 '20 at 20:51
  • @DanielSchepler: That's probably the cleanest way of doing it. – anomaly Jan 23 '20 at 20:53
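
Not part of the thread, but Daniel Schepler's observation can be checked mechanically for small $m, n$: with fully symbolic entries, $\det(I_m + AB)$ and $\det(I_n + BA)$ come out as literally the same polynomial in $\mathbb{Z}[a_{ij}, b_{ij}]$. A minimal sympy sketch (sizes arbitrary):

```python
import sympy as sp

m, n = 2, 3
# Matrices with fully symbolic entries a_ij and b_ij.
A = sp.Matrix(m, n, lambda i, j: sp.Symbol(f'a{i}{j}'))
B = sp.Matrix(n, m, lambda i, j: sp.Symbol(f'b{i}{j}'))

lhs = (sp.eye(m) + A * B).det()  # det(I_m + AB)
rhs = (sp.eye(n) + B * A).det()  # det(I_n + BA)
# The difference expands to the zero polynomial in Z[a_ij, b_ij].
assert sp.expand(lhs - rhs) == 0
```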