4

I know that $A\cdot\mathrm{adj}(A) = \det(A) \cdot I$, but why $\mathrm{adj}(A)\cdot A = A\cdot\mathrm{adj}(A)$?

Stabilo
  • 1,414
  • 2
    also use that identity for the transpose $A^t$ of $A$ (i.e. $A^t\cdot\mathrm{adj}\left(A^t\right)=\det\left(A^t\right)\cdot I$) together with $\left(\det\left(A\right)\cdot I\right)^{t}=\det\left(A\right)\cdot I$ and $\det\left(A^{t}\right)=\det\left(A\right)$ – Max Oct 02 '15 at 09:19
  • @Max I think this should be written up into a full answer. Many of the other answers use density to make the reduction, but this just uses the more intuitive idea that adj commutes with transposition. – Erick Wong Oct 09 '15 at 14:43
  • will do that later, then. – Max Oct 09 '15 at 16:20
  • @Max please do, thanks – Stabilo Oct 10 '15 at 10:44
  • @user3697301 done. – Max Oct 10 '15 at 11:26

5 Answers

4

I was asked to turn my comment above into an answer, so here we go:

First note that you already know \begin{equation} \det\left(A^T\right)\cdot I=A^T\cdot \mathrm{adj}\left(A^T\right), \end{equation} too.

Now use \begin{equation} \det\left(A\right)=\det\left(A^T\right) \end{equation} (this follows from the permutation formula for $\det$) and \begin{equation} \mathrm{adj}\left(A^T\right)=\mathrm{adj}\left(A\right)^T \end{equation} (this follows by applying the first identity to the minor determinants in the definition of the entries $\mathrm{adj}\left(A\right)_{i,j}$ of the adjugate).

You end up with:

\begin{align} A\cdot\mathrm{adj}\left(A\right)&=\det\left(A\right)\cdot I=\left(\det\left(A\right)\cdot I\right)^{T}\\&=\left(\det\left(A^T\right)\cdot I\right)^{T}=\left(A^T\cdot \mathrm{adj}\left(A^T\right)\right)^T\\&=\mathrm{adj}\left(A^T\right)^T\cdot A=\mathrm{adj}\left(A\right)\cdot A. \end{align}
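The two facts used in this derivation, and the conclusion, can be spot-checked numerically. A minimal sketch with sympy (the example matrix is an arbitrary choice; `Matrix.adjugate()` computes $\mathrm{adj}(A)$):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [0, 4, 5],
               [1, 0, 6]])

# The two identities used in the derivation:
assert (A.T).det() == A.det()              # det(A^T) = det(A)
assert (A.T).adjugate() == A.adjugate().T  # adj(A^T) = adj(A)^T

# The conclusion: both products equal det(A) * I.
I = sp.eye(3)
assert A * A.adjugate() == A.det() * I
assert A.adjugate() * A == A.det() * I
```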

Max
  • 1,307
3

You know it is true if $\det(A) \ne 0$. But the invertible matrices are dense in the space of all square matrices, so the conclusion follows, since the adjugate and the determinant are continuous functions.

This works over the fields of real and complex numbers. Do you want other fields as well? Then argue as follows. You can write your identity as $n^2$ equations in the entries of the matrix. Thus you have $n^2$ polynomials in $n^2$ variables that you wish to show are identically zero. But these polynomials vanish whenever you substitute any real numbers, hence they must be identically zero.
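The polynomial-identity viewpoint predicts that the identity holds even for singular matrices, and over the integers (a commutative ring). A quick spot check with sympy, using an arbitrary rank-one example:

```python
import sympy as sp

# A rank-1 integer matrix, so det(A) = 0 and A is not invertible.
A = sp.Matrix([[1, 2],
               [2, 4]])
assert A.det() == 0

# Both products still equal det(A) * I, which here is the zero matrix.
assert A * A.adjugate() == sp.zeros(2, 2)
assert A.adjugate() * A == sp.zeros(2, 2)
```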

Stephen Montgomery-Smith
  • 26,430
  • 2
  • 35
  • 64
2

Let $A$ be an $n \times n$ matrix, $A_{i,j}$ the $(i,j)$-minor of $A$ and $C_{i,j}$ the $(i,j)$-cofactor of $A$, defined as: $$ C_{i,j} = (-1)^{i+j}A_{i,j}. $$ By definition the adjugate of $A$ is: $$ \operatorname{adj} A = [C_{j,i}]. $$

The cofactor expansion along rows gives for all $i,j=1,\dots,n$: $$ \sum_{k=1}^{n} a_{i,k} C_{j,k} = \delta_{i,j}\det A, $$ and along columns gives for all $i,j=1,\dots,n$: $$ \sum_{k=1}^n a_{k,i}C_{k,j} = \delta_{i,j}\det A, $$ where $\delta_{i,j}$ is the Kronecker delta.

You can express these equations using the definition of the adjugate matrix as follows: $$ A \cdot \operatorname{adj} A = \det A \cdot I_n, $$ and $$ \operatorname{adj} A \cdot A = \det A \cdot I_n, $$ where $I_n = [\delta_{i,j}]$ is the identity matrix of size $n \times n$. From here we have that $$ A \cdot \operatorname{adj} A = \operatorname{adj} A \cdot A = \det A \cdot I_n. $$
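Both cofactor-expansion identities can be checked directly by building the cofactors from minors by hand. A sketch with sympy (the $3 \times 3$ matrix is an arbitrary example):

```python
import sympy as sp

n = 3
A = sp.Matrix([[2, 0, 1],
               [1, 3, 0],
               [0, 1, 4]])
d = A.det()

def cofactor(i, j):
    # (i, j)-minor: delete row i and column j of A, take the determinant,
    # then attach the sign (-1)^(i+j).
    M = A.copy()
    M.row_del(i)
    M.col_del(j)
    return (-1) ** (i + j) * M.det()

C = sp.Matrix(n, n, cofactor)  # cofactor matrix [C_{i,j}]

for i in range(n):
    for j in range(n):
        delta = d if i == j else 0
        # expansion along rows:
        assert sum(A[i, k] * C[j, k] for k in range(n)) == delta
        # expansion along columns:
        assert sum(A[k, i] * C[k, j] for k in range(n)) == delta

# adj(A) is the transpose of the cofactor matrix.
assert C.T == A.adjugate()
```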

user153012
  • 12,240
1

Let $A\in M_n(K)$. If $K=\mathbb{C}$, then use Travis's argument. If $K$ is a commutative ring with unity, then use the reference (Bill's answer)

Sylvester's determinant identity

given above by Bigbear.

EDIT. @user3697301

  1. For $K=\mathbb{C}$, Travis, in his comment below, gave the key to a complete proof of (*): $\mathrm{adj}(A)\cdot A=A\cdot \mathrm{adj}(A)=\det(A)I_n$. If you do not work, then you cannot do mathematics.

  2. What is interesting is that (*) also holds when $K$ is a commutative ring with unity, and the key to the proof is in the MSE reference above.

  • Travis argument? ring? unity????? I thought this was a simple question, and yet there doesn't seem to be a simple answer. :\ – Stabilo Oct 05 '15 at 09:48
0

Since $A^{-1}=\frac{1}{\det(A)}\mathrm{adj}(A)$ and $A$ represents a finite-dimensional linear operator, a left inverse is also a right inverse, i.e. $AA^{-1}=A^{-1}A$. From this it follows easily that $\mathrm{adj}(A)\cdot A = A\cdot\mathrm{adj}(A)$, by cancelling the scalar $\frac{1}{\det(A)}$ from both sides.
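For invertible $A$, the relation $A^{-1}=\frac{1}{\det(A)}\mathrm{adj}(A)$ used in this answer can be checked directly. A sketch with sympy, using an arbitrary matrix of determinant $1$ so the division stays exact:

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [5, 3]])  # det(A) = 1, so A is invertible
d = A.det()

# A^{-1} = adj(A) / det(A):
assert A.inv() == A.adjugate() / d

# A left inverse is also a right inverse, hence adj(A) commutes with A:
assert A * A.adjugate() == A.adjugate() * A
```

Note that, as the comments below point out, this cancellation argument only covers $\det(A) \ne 0$; the singular case needs the density or polynomial-identity arguments from the other answers.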

BigbearZzz
  • 15,084
  • 4
    You cannot simply divide by the determinant if you don't know it to be non-zero! But you can still produce a right answer by just slightly modifying your answer – b00n heT Oct 02 '15 at 09:29
  • One can in particular repair this answer by noting that the identity holds on the (dense) subset of invertible matrices in the set of matrices of a given size and appealing to the continuity of both sides of the identity. – Travis Willse Oct 02 '15 at 09:37
  • 2
    Can I appeal to a universal argument instead, like in this http://math.stackexchange.com/questions/17831/sylvesters-determinant-identity/17837#17837 ? – BigbearZzz Oct 02 '15 at 09:51
  • what if $A$ is singular? – Stabilo Oct 02 '15 at 15:28