5

Let $\mathbf{A}$ and $\mathbf{B}$ be $k \times k$ matrices and let $\mathbf{M}$ be the block matrix $$\mathbf{M} = \begin{pmatrix}0 & \mathbf{B} \\ \mathbf{A} & 0\end{pmatrix}.$$ How can one prove that $\det(\mathbf{M}) = (-1)^k \det(\mathbf{A}) \det(\mathbf{B})$?

Julien
  • 44,791

5 Answers

6

Here is one way among others: $$ \left( \matrix{0&B\\A&0}\right)=\left( \matrix{0&I_k\\I_k&0}\right)\left( \matrix{A&0\\0&B}\right). $$ I assume you are allowed to use the block-diagonal case, which gives $\det A\cdot\det B$ for the matrix on the far right. Just in case, this follows, for instance, from the Leibniz formula.

Now it boils down to $$ \det\left( \matrix{0&I_k\\I_k&0}\right)=(-1)^k. $$ This is the matrix of a permutation that is a product of $k$ disjoint transpositions. So $k$ elementary row swaps turn it into the identity matrix, each contributing a factor of $-1$, and the determinant is $(-1)^k$.
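This factorization is easy to check numerically. Below is a small pure-Python sanity check; the Leibniz-formula determinant, the helper names (`det`, `matmul`, `block`), and the sample $A$ and $B$ are all my own choices, not part of the answer:

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(m)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(x, y):
    return [[sum(x[i][t] * y[t][j] for t in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def block(tl, tr, bl, br):
    """Assemble the 2k x 2k matrix [[tl, tr], [bl, br]] from k x k blocks."""
    return [tl[i] + tr[i] for i in range(len(tl))] + \
           [bl[i] + br[i] for i in range(len(bl))]

k = 2
A = [[1, 2], [3, 4]]          # det A = -2
B = [[0, 5], [6, 7]]          # det B = -30
Z = [[0] * k for _ in range(k)]
I = [[int(i == j) for j in range(k)] for i in range(k)]

M = block(Z, B, A, Z)
J = block(Z, I, I, Z)          # the swap matrix from the answer
D = block(A, Z, Z, B)          # the block-diagonal matrix

assert matmul(J, D) == M                      # the factorization above
assert det(J) == (-1) ** k                    # here +1, since k = 2
assert det(M) == (-1) ** k * det(A) * det(B)  # = 60
```

The assertions pass for any $k \times k$ integer blocks you substitute, as long as $k$ stays small enough for the $(2k)!$-term Leibniz sum.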

Julien
  • 44,791
2

Note that by performing $k$ column swaps (swapping column $j$ with column $j+k$ for $j=1,\ldots,k$) you can obtain the matrix $$\mathbf{M}^{'} = \begin{pmatrix}\mathbf{B} & 0 \\ 0 & \mathbf{A}\end{pmatrix}.$$ Obviously $\det(\mathbf{M}^{'})=(-1)^k\det(\mathbf{M})$.

Now, performing elementary operations on the rows of $\mathbf{M}^{'}$ (i.e. switching two rows, multiplying a selected row by a non-zero scalar, or adding to a selected row another row multiplied by a scalar), reduce it to the form
$$\mathbf{M}^{''} = \begin{pmatrix}\mathbf{B}^{'} & 0 \\ 0 & \mathbf{A}^{'}\end{pmatrix},$$ where $\mathbf{B}^{'}$ and $\mathbf{A}^{'}$ are upper-triangular matrices. Note that $\mathbf{M}^{''}$ is upper-triangular itself with a diagonal $[b_1,\ldots,b_k,a_1\ldots,a_k]$.

Obviously $\det(\mathbf{M}^{'})=C_1C_2b_1\ldots b_k a_1\ldots a_k$, where $C_1$ is a number generated by the operations on the first $k$ rows and $C_2$ is a number generated by the operations on rows $k+1,\ldots,2k$. It is clear that $\det(\mathbf{B})=C_1b_1\ldots b_k$ and $\det(\mathbf{A})=C_2a_1\ldots a_k$.

Thus $\det(\mathbf{M})=(-1)^kC_1C_2b_1\ldots b_k a_1\ldots a_k=(-1)^k\det(\mathbf{A})\det(\mathbf{B})$.
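The column-swap step can be verified directly. This sketch performs the $k$ swaps on a concrete example and checks the two determinant identities; the `det` helper (a Leibniz-formula determinant) and the sample $A$, $B$ are my own illustrative choices:

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(m)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

k = 2
A = [[2, 1], [0, 3]]          # det A = 6
B = [[1, 4], [2, 9]]          # det B = 1
Z = [[0] * k for _ in range(k)]

# M = [[0, B], [A, 0]] assembled row by row
M = [Z[i] + B[i] for i in range(k)] + [A[i] + Z[i] for i in range(k)]

# swap column j with column j + k: one transposition per j, k in total
Mp = [row[:] for row in M]
for j in range(k):
    for row in Mp:
        row[j], row[j + k] = row[j + k], row[j]

assert Mp == [B[i] + Z[i] for i in range(k)] + [Z[i] + A[i] for i in range(k)]
assert det(Mp) == (-1) ** k * det(M)      # each swap flips the sign once
assert det(Mp) == det(A) * det(B)         # block-diagonal case
```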

Godot
  • 2,082
2

I think julien has answered your question in a beautiful way. Here I will put your question in a slightly more general context. Note that for a partitioned matrix $$ M=\pmatrix{P&Q\\ R&S}, $$ where the four submatrices are square and have the same size, if $R$ commutes with $S$ (i.e. $RS=SR$), we have the block determinant formula $$\det M=\det(PS-QR)\tag{1}$$ that is analogous to the determinant formula for $2\times2$ matrices. So, if we put $P=S=0,\ Q=A$ and $R=B$, we get $\det M=\det(-AB)=(-1)^k\det(AB)=(-1)^k\det(A)\det(B)$.

Certainly you shouldn't do your exercise using this formula, because an exercise like yours is often intended to be solved in a more elementary way, and $(1)$ is more difficult to prove than your problem statement. Nonetheless it's good to know some nice formulae. Yet formula $(1)$ should be applied with care: in general, we have $$ \det M= \begin{cases} \det(PS-QR) & \text{ if } RS=SR,\\ \det(SP-RQ) & \text{ if } PQ=QP,\\ \det(SP-QR) & \text{ if } QS=SQ,\\ \det(PS-RQ) & \text{ if } PR=RP. \end{cases}\tag{2} $$ Since the four determinants $\det(PS-QR),\,\det(SP-RQ),\,\det(SP-QR)$ and $\det(PS-RQ)$ are in general different, the pair of submatrices that commute as well as the order of the four blocks do matter.
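Formula $(1)$ can be spot-checked numerically. In this sketch I pick hypothetical sample blocks with $S$ a polynomial in $R$ (so that $RS = SR$ holds by construction); the `det`, `matmul`, and `matsub` helpers are my own, using a Leibniz-formula determinant:

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(m)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(x, y):
    return [[sum(x[i][t] * y[t][j] for t in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

def matsub(x, y):
    return [[a - b for a, b in zip(rx, ry)] for rx, ry in zip(x, y)]

# Sample blocks; S = R^2 + 2I is a polynomial in R, hence RS = SR.
P = [[1, 2], [3, 4]]
Q = [[0, 1], [1, 0]]
R = [[1, 1], [0, 2]]
S = [[3, 3], [0, 6]]
assert matmul(R, S) == matmul(S, R)       # the hypothesis of formula (1)

M = [P[i] + Q[i] for i in range(2)] + [R[i] + S[i] for i in range(2)]
assert det(M) == det(matsub(matmul(P, S), matmul(Q, R)))   # formula (1)
```

With $P$ or $Q$ replaced by non-commuting choices in the wrong slot of $(2)$, the corresponding assertion generally fails, which illustrates the warning above.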

user1551
  • 139,064
0

I would use the method from Hoffman and Kunze, Chapter 5. Namely, define a function $D(A, B) = \det(\mathbf{M})$ (letting $A$ and $B$ range over all $k \times k$ matrices). Show that $D$ is a determinant function in $A$ and in $B$ (alternating and $k$-linear in the rows). Then iteratively use the fact that a determinant function satisfies $D(A) = \det(A)\,D(I)$.
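The determinant-function properties this strategy relies on can be observed numerically. Here is a sketch that fixes $B$ and checks that $D(\cdot, B)$ is linear in a row of $A$ and vanishes on repeated rows; the `det` helper (Leibniz formula) and the sample rows `u`, `v`, `w` are hypothetical choices of mine:

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(m)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

k = 2
Z = [[0] * k for _ in range(k)]
B = [[1, 2], [3, 5]]          # an arbitrary fixed B

def D(A):
    """D(A, B) = det of the block matrix [[0, B], [A, 0]], B held fixed."""
    return det([Z[i] + B[i] for i in range(k)] + [A[i] + Z[i] for i in range(k)])

u, v, w = [1, 4], [2, 7], [5, 6]
# linearity in the first row of A (second row w held fixed):
assert D([[u[0] + v[0], u[1] + v[1]], w]) == D([u, w]) + D([v, w])
assert D([[3 * u[0], 3 * u[1]], w]) == 3 * D([u, w])
# alternating: a repeated row makes D vanish
assert D([u, u]) == 0
```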

AlexM
  • 931
0

It'd probably help if you expanded it out a little. Treating the blocks as if they were scalar entries, the $2\times2$ determinant formula suggests $$\det(\mathbf{M})=\det\bigl((0)(0)-\mathbf{A}\mathbf{B}\bigr)=\det(-\mathbf{A}\mathbf{B}),$$ and this heuristic happens to be valid here because the zero blocks commute with everything. So you just have to work out what $\det(-\mathbf{A}\mathbf{B})$ is; pulling a factor of $-1$ out of each of the $k$ rows contributes $(-1)^k$. $$\begin{bmatrix}a_{11} & \ldots & a_{1k}\\ \vdots & & \vdots \\ a_{k1} & \ldots & a_{kk}\end{bmatrix}\begin{bmatrix}b_{11} & \ldots & b_{1k}\\ \vdots & & \vdots\\b_{k1}& \ldots& b_{kk}\end{bmatrix}$$ Start multiplying it out and see if a pattern emerges. Remember, you can work in both directions of the proof: start from the $(-1)^{k}\det(\mathbf{A})\det(\mathbf{B})$ side or from the $\det(\mathbf{M})$ side. If the two meet up at some point, you are done (I usually rewrite the argument to flow better afterwards, though).
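The identity $\det(-\mathbf{AB}) = (-1)^k\det(\mathbf{A})\det(\mathbf{B})$ that this approach hinges on can be checked on a small example. The `det` and `matmul` helpers (a Leibniz-formula determinant) and the sample $A$, $B$ below are my own illustrative choices:

```python
from itertools import permutations
from math import prod

def det(m):
    """Determinant via the Leibniz formula (fine for tiny matrices)."""
    n = len(m)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(x, y):
    return [[sum(x[i][t] * y[t][j] for t in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

k = 2
A = [[1, 3], [2, 4]]          # det A = -2
B = [[5, 0], [1, 2]]          # det B = 10
Z = [[0] * k for _ in range(k)]

# M = [[0, B], [A, 0]] assembled row by row
M = [Z[i] + B[i] for i in range(k)] + [A[i] + Z[i] for i in range(k)]
neg_AB = [[-x for x in row] for row in matmul(A, B)]

# det(-AB) picks up one factor of -1 per row, i.e. (-1)^k overall
assert det(neg_AB) == (-1) ** k * det(A) * det(B)
assert det(M) == det(neg_AB)
```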