
I was told that the determinant of a square matrix can be expanded along any row or column, and I was shown a proof (by expanding in all possible ways) only for square matrices of order 2 and 3.

  • Is a general proof for any order even possible?
  • If so, how is it done?
  • On a similar note, how can we prove the various properties of determinants for square matrices of any order, such as the following:

    • Swap two rows/columns and all we get is a minus sign as a result.
    • $R_1 \to R_1+ aR_2$ does not change the determinant.
    • Determinant of the transpose is the same as the determinant of the original matrix.
Truth-seek
  • What definition of the determinant do you know for square matrices of order greater than 3? – Elias Costa Apr 27 '17 at 12:15
    There is some wonderful multilinear algebra called "exterior powers" that makes the properties of determinants clear. Unfortunately it's not really appropriate for a first course in linear algebra, and it might even be fine to just take this for granted for now. You might also be interested in this answer – Joppy Apr 27 '17 at 12:16
  • @MathOverview He's probably seen it defined directly via the Laplace expansion, paired with the "miracle theorem" that the result is independent from the order of expansion. In Italy it is a common high school approach. –  Apr 27 '17 at 12:16
  • @MathOverview Yep, the Laplace expansion indeed. At least that's the way it's presented in my book. – Truth-seek Apr 27 '17 at 12:20
  • @G.Sassatelli Btw, What is the "miracle theorem"? Not able to find it anywhere. – Truth-seek Apr 27 '17 at 12:23
  • @MathEnthusiast What are the various determinant properties you want to prove? List them. Your question is unclear if you do not specify which properties you want to prove. – Elias Costa Apr 27 '17 at 12:24
  • @MathEnthusiast Proofs with matrices have a notational drawback: to keep the proofs from getting too long and unwieldy, an "economical" notation is needed, and that notation can be hard for a beginner to swallow. – Elias Costa Apr 27 '17 at 12:36
  • @MathOverview Invent your own notation for the answer, I guess [maybe it will make the answer more interesting]. – Truth-seek Apr 27 '17 at 12:45
  • @G.Sassatelli Is the "miracle theorem" a real theorem or was it a sarcastic remark? – Truth-seek Apr 27 '17 at 12:49
  • No, it's not a sarcastic remark. It's just what I used to call the fact that the result of the Laplace expansion is independent of the order of expansion, back in the day. Now I know a proof. In hindsight, it is not difficult, but it is certainly lengthy and requires a couple of facts and notations that might go over a layman's head. –  Apr 27 '17 at 12:54

2 Answers


Here is one possible path. We define the determinant recursively:

  • if $A$ is $1\times 1$, let $\det A=A_{11}$;

  • If $A$ is $(n+1)\times (n+1)$, let $$ \det A=\sum_{k=1}^{n+1} (-1)^{k+1}A_{1k}\,M_{1k}^A, $$ where $M_{st}^A$ is the determinant of the $n\times n$ matrix obtained by removing the $s^{\rm th}$ row and the $t^{\rm th}$ column of $A$.

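As a sanity check, this recursive definition can be transcribed directly into code. Here is a minimal Python sketch (the helper names `minor` and `det` are my own, and the indices are shifted to be 0-based):

```python
def minor(A, s, t):
    # Delete row s and column t of A (0-indexed), giving the
    # (n-1) x (n-1) matrix whose determinant is M^A_{st}.
    return [[A[i][j] for j in range(len(A)) if j != t]
            for i in range(len(A)) if i != s]

def det(A):
    # Laplace expansion along the first row, as in the definition:
    # det A = sum_k (-1)^(k+1) A_{1k} M^A_{1k}  (here k is 0-based,
    # so the sign becomes (-1)^k).
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** k * A[0][k] * det(minor(A, 0, k))
               for k in range(len(A)))

print(det([[1, 2], [3, 4]]))                    # 1*4 - 2*3 = -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))   # 24
```

Exact integer arithmetic makes the checks below easy to state as equalities, with no floating-point tolerance needed.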
Now,

  1. Show that if $B$ is obtained from $A$ by multiplying a row by $\alpha$, then $$\det B=\alpha\,\det A.$$ This is done by induction very easily.

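A quick numerical check of this scaling property, with an example matrix of my own (`det` is the recursive first-row Laplace expansion from the definition above):

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
alpha = 5
B = [row[:] for row in A]
B[1] = [alpha * x for x in B[1]]   # multiply the second row by alpha
assert det(B) == alpha * det(A)    # det B = alpha * det A
```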
  2. Show that if we have $A,B,C$ with $A_{rj}=B_{rj}+C_{rj}$ for all $j$, and $A_{kj}=B_{kj}=C_{kj}$ when $k\ne r$ and for all $j$, then $$\det A=\det B+\det C.$$ Again this is done by induction. When $r=1$ the equality follows trivially from the definition of determinant (as the minors of $A,B,C$ will be all equal) and when $r\ne 1$ we use induction.

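Checking additivity in a single row on small example matrices of my own (here the split row is the first one):

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

# B and C agree outside the first row; A's first row is the sum of theirs.
B = [[1, 2], [3, 4]]
C = [[5, 6], [3, 4]]
A = [[b + c for b, c in zip(B[0], C[0])], [3, 4]]
assert det(A) == det(B) + det(C)
```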
  3. Show that if $B$ is obtained from $A$ by swapping two rows, then $$\det B=-\det A.$$ Here one first handles the swap of row $1$ with row $r$; then any swap of two rows $r$ and $s$ can be achieved by three such swaps (swap $1$ and $r$, then $1$ and $s$, then $1$ and $r$ again). This can be used to show that one can calculate the determinant along any row (swap it with row 1, calculate, then undo the swap).

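Both claims in this step can be checked numerically: a row swap flips the sign, and the expansion along any row gives the same value. The helper `det_along` and the example matrix are my own:

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

def det_along(A, i):
    # Laplace expansion along row i (0-indexed); (-1)^(i+k) is the
    # 0-based version of the sign (-1)^(i+j) for 1-based indices.
    return sum((-1) ** (i + k) * A[i][k]
               * det([r[:k] + r[k + 1:]
                      for j, r in enumerate(A) if j != i])
               for k in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
S = [A[1], A[0], A[2]]           # swap the first two rows
assert det(S) == -det(A)         # a swap flips the sign
assert all(det_along(A, i) == det(A) for i in range(3))
```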
  4. It now follows that if $A$ has two equal rows, then $\det A=0$ (because $\det A=-\det A$).

  5. If $B_{rj}=A_{rj}+\alpha A_{sj}$, and $B_{kj}=A_{kj}$ when $k\ne r$, then by 1. and 2., $$\det B=\det A+\alpha\det C,$$ where $C$ is the matrix equal to $A$ but with row $s$ in place of row $r$; by 4., $\det C=0$, so $\det B=\det A$.

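This is the property $R_1 \to R_1 + aR_2$ from the question; a numerical check on an example matrix of my own:

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
a = 7
B = [row[:] for row in A]
B[0] = [x + a * y for x, y in zip(A[0], A[1])]   # R1 -> R1 + a*R2
assert det(B) == det(A)   # the determinant is unchanged
```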
  6. Now one considers the elementary matrices, and checks directly (using the above properties) that for any elementary matrix $E$, $$\det EA=\det E\,\det A.$$

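The three kinds of elementary matrices can be checked directly on a $2\times 2$ example of my own (`matmul` is a small helper of mine):

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

def matmul(X, Y):
    # Plain matrix product of two square matrices (lists of lists).
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 5]]
elementary = [
    [[0, 1], [1, 0]],   # swap the two rows        (det = -1)
    [[4, 0], [0, 1]],   # scale the first row by 4 (det = 4)
    [[1, 0], [7, 1]],   # add 7 * (row 1) to row 2 (det = 1)
]
for E in elementary:
    assert det(matmul(E, A)) == det(E) * det(A)
```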
  7. If $B$ is invertible, then $B$ can be written as a product of elementary matrices, $B=E_1E_2\cdots E_m$, and so \begin{align} \det BA&=\det E_1E_2\cdots E_m A=\det E_1\det E_2\cdots\det E_m\det A\\ &=\det (E_1\cdots E_m)\det A=\det B\det A. \end{align} Similarly, $\det AB=\det A\det B$.

  8. If $B$ is not invertible, then $AB$ is not invertible either. For a non-invertible matrix, the Reduced Row Echelon form has a row of zeroes, and so its determinant is zero; since row operations change the determinant only by a sign or a nonzero factor, it follows that $\det B=0$; similarly, $\det AB=0$. So $$\det AB=\det A\det B$$ also holds when $B$ is not invertible.

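Multiplicativity can be checked on both the invertible and the non-invertible case, using example matrices of my own:

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

def matmul(X, Y):
    # Plain matrix product of two square matrices (lists of lists).
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 5]]   # invertible: det = -1
B = [[2, 4], [1, 2]]   # not invertible: det = 0
for X, Y in [(A, A), (A, B), (B, A), (B, B)]:
    assert det(matmul(X, Y)) == det(X) * det(Y)
```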
  9. Knowing that det is multiplicative, we immediately get that, when $A$ is invertible, $$\det A^{-1}=\frac1{\det A}.$$

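For a concrete check of $\det A^{-1}=1/\det A$, exact rational arithmetic with `fractions.Fraction` avoids rounding issues; the explicit $2\times 2$ inverse formula is used on an example matrix of my own:

```python
from fractions import Fraction

def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]
d = det(A)   # -2
# Explicit 2x2 inverse: (1/det) * [[a22, -a12], [-a21, a11]].
Ainv = [[A[1][1] / d, -A[0][1] / d],
        [-A[1][0] / d, A[0][0] / d]]
assert det(Ainv) == 1 / d
```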
  10. Over $\mathbb{C}$, an arbitrary matrix $A$ is similar to its Jordan form: $A=PJP^{-1}$. Then $$ \det A=\det (PJP^{-1})=\det P\,\det J\,\frac1{\det P}=\det J. $$ As $J$ is triangular with the eigenvalues of $A$ (counting multiplicities) on its diagonal, we get that $$ \det A=\lambda_1\cdots\lambda_n, $$ where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $A$, counting multiplicities.

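This step uses the fact that the determinant of a triangular matrix is the product of its diagonal entries, which the recursive definition makes immediate; a check on a Jordan-like example block of my own:

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

J = [[2, 1, 0],
     [0, 2, 1],
     [0, 0, 3]]   # upper triangular, diagonal entries 2, 2, 3
diag_product = J[0][0] * J[1][1] * J[2][2]
assert det(J) == diag_product   # 2 * 2 * 3 = 12
```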
  11. Since the eigenvalues of $A^T$ are the same as those of $A$, we get $$ \det A^T=\det A. $$

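A numerical check of $\det A^T=\det A$ on an example matrix of my own:

```python
def det(A):
    # Laplace expansion along the first row (recursive definition above).
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
At = [list(col) for col in zip(*A)]   # transpose
assert det(At) == det(A)
```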
  12. Now, everything we did for rows, we can do for columns by working on the transpose. In particular, we can calculate the determinant along any column.

Martin Argerami

Just use a suitable notation to prove the various properties of the determinant.

For $I=\{1,\ldots,i,\ldots,n\}$ and $J=\{1,\ldots,j,\ldots,n\}$ fix the index-set notation $I_i=I\setminus\{i\}$ for $i=1,2,\ldots,n$ and $J_j=J\setminus\{j\}$ for $j=1,2,\ldots,n$. Now fix the index notation for matrices: $$ M= \left\lgroup M_{ij} \right\rgroup_{\substack{ i\in I\\j\in J}}= \begin{pmatrix} M_{1,1} & \ldots & M_{1,j-1} & M_{1,j} & M_{1,j+1} & \ldots & M_{1,n}\\ \vdots & & \vdots & \vdots & \vdots & & \vdots \\ M_{i-1,1} & \ldots & M_{i-1,j-1} & M_{i-1,j} & M_{i-1,j+1} & \ldots & M_{i-1,n}\\ M_{i,1} & \ldots & M_{i,j-1} & M_{i,j} & M_{i,j+1} & \ldots & M_{i,n}\\ M_{i+1,1} & \ldots & M_{i+1,j-1} & M_{i+1,j} & M_{i+1,j+1} & \ldots & M_{i+1,n}\\ \vdots & & \vdots & \vdots & \vdots & & \vdots \\ M_{n,1} & \ldots & M_{n,j-1} & M_{n,j} & M_{n,j+1} & \ldots & M_{n,n} \end{pmatrix}_{n\times n} $$

$$ \left\lgroup M_{uv} \right\rgroup_{\substack{ u\in I_i\\v\in J_j}} = \begin{pmatrix} M_{1,1} & \ldots & M_{1,j-1} & M_{1,j+1} & \ldots & M_{1,n}\\ \vdots & & \vdots & \vdots & & \vdots \\ M_{i-1,1} & \ldots & M_{i-1,j-1} & M_{i-1,j+1} & \ldots & M_{i-1,n}\\ M_{i+1,1} & \ldots & M_{i+1,j-1} & M_{i+1,j+1} & \ldots & M_{i+1,n}\\ \vdots & & \vdots & \vdots & & \vdots \\ M_{n,1} & \ldots & M_{n,j-1} & M_{n,j+1} & \ldots & M_{n,n} \end{pmatrix}_{(n-1)\times (n-1)} $$ After relabeling the remaining rows and columns by $1,\ldots,n-1$, this same matrix reads $$ \left\lgroup M_{uv} \right\rgroup_{\substack{ u\in I_i\\v\in J_j}} = \begin{pmatrix} M_{1,1} & \ldots & M_{1,v} & \ldots & M_{1,n-1}\\ \vdots & & \vdots & & \vdots \\ M_{u,1} & \ldots & M_{u,v} & \ldots & M_{u,n-1}\\ \vdots & & \vdots & & \vdots \\ M_{n-1,1} & \ldots & M_{n-1,v} & \ldots & M_{n-1,n-1} \end{pmatrix}_{(n-1)\times (n-1)} $$ Then the expansion along row $i$ is $$ \det \left\lgroup M_{ij} \right\rgroup_{\substack{i\in I\\ j\in J}} = \sum_{j=1}^{n} M_{ij}(-1)^{i+j}\det \left\lgroup M_{uv} \right\rgroup_{\substack{u\in I_i\\ v\in J_j}} $$ and the expansion along column $j$ is $$ \det \left\lgroup M_{ij} \right\rgroup_{\substack{i\in I\\ j\in J}} = \sum_{i=1}^{n} M_{ij}(-1)^{i+j}\det \left\lgroup M_{uv} \right\rgroup_{\substack{u\in I_i\\ v\in J_j}} $$
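The two expansion formulas can be checked against each other numerically. A Python sketch (helper names and the example matrix are my own, indices 0-based):

```python
def det(A):
    # Laplace expansion along the first row.
    return A[0][0] if len(A) == 1 else sum(
        (-1) ** k * A[0][k] * det([r[:k] + r[k + 1:] for r in A[1:]])
        for k in range(len(A)))

def expand_row(M, i):
    # Expansion along row i, as in the first formula above.
    return sum((-1) ** (i + j) * M[i][j]
               * det([r[:j] + r[j + 1:]
                      for u, r in enumerate(M) if u != i])
               for j in range(len(M)))

def expand_col(M, j):
    # Expansion along column j, as in the second formula above.
    return sum((-1) ** (i + j) * M[i][j]
               * det([r[:j] + r[j + 1:]
                      for u, r in enumerate(M) if u != i])
               for i in range(len(M)))

M = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
assert all(expand_row(M, i) == det(M) for i in range(3))
assert all(expand_col(M, j) == det(M) for j in range(3))
```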

Elias Costa