
I'm seeking a proof of the following:

Let $A$ be an invertible matrix. Then the determinant of $A^{-1}$ equals: $$\left|A^{-1}\right|=|A|^{-1} $$

I don't know where to begin the proof. Any suggestions?

Tolaso

3 Answers


Hint: You know that $\det(AB)=\det(A)\det(B)$ for all $n\times n$ matrices $A,B$. So make a judicious choice for $B$...
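For completeness, here is a sketch of where the hint leads, choosing $B=A^{-1}$ (spelled out here, since the answer deliberately leaves this step to the reader): $$\det(A)\det\left(A^{-1}\right)=\det\left(AA^{-1}\right)=\det(I)=1\quad\Longrightarrow\quad\det\left(A^{-1}\right)=\frac{1}{\det(A)},$$ where the final division is legitimate because $\det(A)\neq 0$ for an invertible $A$.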

Casteels

Exploiting the multiplicativity of the determinant, $\det(AB)=\det(A)\det(B)$, is probably the easiest way. That said, here is another approach.

The determinant of a square matrix is equal to the product of its eigenvalues.

Now note that for an invertible matrix $\mathbf A$, $\lambda\in\mathbb R$ is an eigenvalue of $\mathbf A$ if and only if $1/\lambda$ is an eigenvalue of $\mathbf A^{-1}$. To see this, let $\lambda\in\mathbb R$ be an eigenvalue of $\mathbf A$ and $\mathbf x$ a corresponding eigenvector; note that $\lambda\neq 0$, for otherwise $\mathbf A$ would be singular. Then, \begin{align*} \mathbf A\mathbf x=&\,\lambda\mathbf x\\ \Longrightarrow\qquad{\phantom{\lambda^{-1}}}\mathbf x=&\,\lambda\mathbf A^{-1}\mathbf x\\ \Longrightarrow\qquad\lambda^{-1}\mathbf x=&\,\mathbf A^{-1}\mathbf x. \end{align*} That is, $\lambda^{-1}$ is an eigenvalue of $\mathbf A^{-1}$ corresponding to the same eigenvector $\mathbf x$. The other direction is analogous.

Hence, the determinant of $\mathbf A^{-1}$ is equal to the product of the eigenvalues of $\mathbf A^{-1}$, which is the product of the reciprocals of the eigenvalues of $\mathbf A$, which is just the reciprocal of the determinant of $\mathbf A$.
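In symbols (a restatement of the argument above, writing $\lambda_1,\dots,\lambda_n$ for the eigenvalues of $\mathbf A$ listed with multiplicity): $$\det\left(\mathbf A^{-1}\right)=\prod_{i=1}^{n}\lambda_i^{-1}=\left(\prod_{i=1}^{n}\lambda_i\right)^{-1}=\left(\det\mathbf A\right)^{-1}.$$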

triple_sec
  • But this method only proves that the geometric multiplicities are the same, whereas we need the algebraic multiplicities to be the same, right? – wz0919 Oct 18 '20 at 06:43
  • @wz0919 “The determinant of a square matrix is equal to the product of its eigenvalues.” For this statement to be literally true, one needs to “split” each eigenvalue with an algebraic multiplicity $>1$ into repeated distinct eigenvalues. – triple_sec Oct 18 '20 at 18:36
  • Thanks for answering, but I'm still confused: if $\lambda_1 = \lambda_2$, first we use $\lambda_1$ to get that $\lambda_1^{-1}$ is an eigenvalue of $A^{-1}$, which is right, but if we then use $\lambda_2$, we already know that $\lambda_2^{-1} = \lambda_1^{-1}$ is an eigenvalue of $A^{-1}$, and nothing new is proved. I can't understand why '$\lambda_1^{-1}$ is an eigenvalue of $A^{-1}$' is different from '$\lambda_2^{-1}$ is an eigenvalue of $A^{-1}$'. – wz0919 Oct 19 '20 at 01:23
  • @wz0919 In this case, both $\lambda_1^{-1}$ and $\lambda_2^{-1}$ are repeated eigenvalues of $\mathbf A^{-1}$. – triple_sec Oct 19 '20 at 06:28
  • I think it's clearer using the characteristic polynomial. We have: for any eigenvalue $ \lambda $ of $ A $, $ \lambda^{-1} $ is an eigenvalue of $ A^{-1} $. This is equivalent to: for any root $ \lambda $ of $ p_A(\nu) $ (the characteristic polynomial of $ A $), $ \lambda^{-1} $ is a root of $ p_{A^{-1}}(\nu) $ (the characteristic polynomial of $ A^{-1} $). We need to prove $ p_{A^{-1}}(0) = 1/p_A(0) $, so besides the matching of roots, we still need the multiplicities of the roots to be the same. – wz0919 Oct 19 '20 at 06:50

By interpreting the determinant as the (signed) ratio between the hypervolume of $f_A(\Gamma)$ and the hypervolume of $\Gamma$, where $\Gamma$ is a simplex associated with the canonical basis and $f_A$ is the linear map associated with $A$, the claim is trivial, since: $$ \left(f_A\right)^{-1} = f_{A^{-1}}. $$ This is just the classic "measure-theoretic" proof of Binet's theorem.
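To spell out the last step of this sketch: composing the two maps and multiplying the corresponding (signed) volume ratios gives $$\det\left(A^{-1}\right)\det(A)=\frac{\operatorname{vol}\left(f_{A^{-1}}(f_A(\Gamma))\right)}{\operatorname{vol}\left(f_A(\Gamma)\right)}\cdot\frac{\operatorname{vol}\left(f_A(\Gamma)\right)}{\operatorname{vol}(\Gamma)}=\frac{\operatorname{vol}(\Gamma)}{\operatorname{vol}(\Gamma)}=1.$$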

Jack D'Aurizio