
Suppose $A$ is a matrix. The $(i, j)$-entry of $A$ is denoted as $[A]_{i,j}$.

Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are $m \times 1$ matrices. Then $[a_1, a_2, \dots, a_n]$ is the $m \times n$ matrix whose $(i, j)$-entry is equal to $[a_j]_{i,1}$. This piece of notation allows one to display a matrix using its columns.
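
For example, with $m = 2$ and $n = 3$ (a small illustration of the notation): if $a_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$, $a_2 = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$, $a_3 = \begin{bmatrix} 5 \\ 6 \end{bmatrix}$, then $$ [a_1, a_2, a_3] = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \\ \end{bmatrix}. $$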

Determinants are defined here.

How does one prove the alternating property?

Suppose that $q$ is a positive integer less than or equal to $n$. Suppose that $p$ is a positive integer less than $q$. Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are $n \times 1$ matrices. Suppose that $a_p = a_q$. Then $$ \det {[a_1, a_2, \dots, a_n]} = 0. $$
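
For instance, with $n = 3$, $p = 1$ and $q = 3$ (so that the first and third columns coincide), a direct computation by expansion about the first row gives $$ \det {\begin{bmatrix} 1 & 2 & 1 \\ 3 & 4 & 3 \\ 5 & 6 & 5 \\ \end{bmatrix}} = 1 \, (4 \cdot 5 - 3 \cdot 6) - 2 \, (3 \cdot 5 - 3 \cdot 5) + 1 \, (3 \cdot 6 - 4 \cdot 5) = 2 - 0 - 2 = 0. $$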

Let the proposition $P(n)$ be as follows:

For any $n \times 1$ matrices $a_1$, $a_2$, $\dots$, $a_n$, any positive integer $q$ less than or equal to $n$, and any positive integer $p$ less than $q$, the equality $a_p = a_q$ implies that $$ \det {[a_1, a_2, \dots, a_n]} = 0. $$

It is easy to check that $P(2)$ is true: $$ \det {\begin{bmatrix} a & a \\ b & b \\ \end{bmatrix}} = ab - ba = 0. $$

Suppose that $P(n-1)$ is true (where $n \geq 3$). I need to show that $P(n)$ is true.

I know that it suffices to prove that the determinant of a square matrix with two adjacent equal columns is zero. Suppose that one has proved it. Using the multilinear property, one can prove that swapping two adjacent columns changes the sign of the determinant: $$ \begin{align*} 0 = {} & \det {[\dots, \overset{\text{column}\,j}{b_j + b_{j+1}}, b_j + b_{j+1}, \dots]} \\ = {} & \det {[\dots, b_j, b_j + b_{j+1}, \dots]} + \det {[\dots, b_{j+1}, b_j + b_{j+1}, \dots]} \\ = {} & \hphantom{{} + {}} ( \det {[\dots, b_j, b_j, \dots]} + \det {[\dots, b_j, b_{j+1}, \dots]} ) \\ & + ( \det {[\dots, b_{j+1}, b_j, \dots]} + \det {[\dots, b_{j+1}, b_{j+1}, \dots]} ) \\ = {} & \det {[\dots, b_j, b_{j+1}, \dots]} + \det {[\dots, b_{j+1}, b_j, \dots]}, \end{align*} $$ in which $b_1$, $b_2$, $\dots$, $b_n$ are $n \times 1$ matrices.

Suppose that $p < q$. Suppose that column $p$ of the $n \times n$ matrix $A = [a_1, a_2, \dots, a_n]$ is equal to column $q$ of $A$. Let $$ B = [\dots, \overset{\text{column}\,p-1}{a_{p-1}}, \overset{\text{column}\,p}{a_{p+1}}, \dots, a_{q-1}, \overset{\text{column}\,q-1}{a_p}, \overset{\text{column}\,q}{a_q}, a_{q+1}, \dots], $$ which can be obtained from $A$ by swapping two adjacent columns $q-p\color{red}{{}-1}$ times. Hence $$ 0 = \det {(B)} = (-1)^{q-p\color{red}{{}-1}} \det {(A)}, $$ which means that $\det {(A)} = 0$.

However, I do not know how to prove the special case.

Juliamisto
  • Count your number of column swaps carefully: we have to get there and back, which results in an odd number of swaps. For example, if you want to swap 1 and 3, then we have $abc \rightarrow BAc \rightarrow bCA \rightarrow CBa$. It is $3 - 1 + 1 = 3$ times, not $2$ times. – Calvin Lin Jun 19 '23 at 04:48
  • @CalvinLin One does not have to get back here. (Because I did not want to swap the two equal columns.) Suppose that it has been proved that the determinant of a square matrix with two equal adjacent columns is zero. Suppose, for example, that columns $2$ and $5$ of $A$ are equal. One has $(1,2,3,4,5) \to (1,3,2,4,5) \to (1,3,4,2,5)$. The last matrix has two equal adjacent columns, so its determinant is zero. Hence $(-1)^{5-2\color{red}{{}-1}} \det {(A)} = 0$. (It is not $5 - 2 + 1$.) If one wants to prove the antisymmetric property, then it is necessary to get there and back. – Juliamisto Jun 19 '23 at 04:55

3 Answers


This proof is direct; it is based on the fact that the determinant of a square matrix can be computed by expansion about any column.

I will prove that $P(n)$ (in the description of the question) is true for $n = 2$, $3$, $\dots$ by mathematical induction.

It is easy to check that $P(2)$ is true.

Suppose that $P(n-1)$ is true (where $n \geq 3$). I will prove that $P(n)$ is true under the hypothesis.

Choose a positive integer $q$ less than or equal to $n$. Choose a positive integer $p$ less than $q$. Suppose that column $p$ of the $n \times n$ matrix $A$ is equal to column $q$ of $A$.

Choose a positive integer $u$ less than or equal to $n$ such that $u \neq p$ and $u \neq q$; this is possible because $n \geq 3$. Note that $A(i|u)$, the submatrix obtained from $A$ by deleting row $i$ and column $u$, has two equal columns (for $i = 1$, $2$, $\dots$, $n$). By the hypothesis, $\det {(A(i|u))} = 0$. Hence $$ \begin{aligned} \det {(A)} = {} & \sum_{i = 1}^{n} {(-1)^{i+u} [A]_{i,u} \det {(A(i|u))}} \\ = {} & \sum_{i = 1}^{n} {(-1)^{i+u} [A]_{i,u} \,0} \\ = {} & 0. \end{aligned} $$
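
As a small illustration of the induction step, take $n = 3$ with columns $1$ and $3$ equal and $u = 2$; expansion about column $2$ gives $$ \det {\begin{bmatrix} a & d & a \\ b & e & b \\ c & f & c \\ \end{bmatrix}} = -d \det {\begin{bmatrix} b & b \\ c & c \\ \end{bmatrix}} + e \det {\begin{bmatrix} a & a \\ c & c \\ \end{bmatrix}} - f \det {\begin{bmatrix} a & a \\ b & b \\ \end{bmatrix}} = 0, $$ because every minor $A(i|2)$ has two equal columns, so its determinant is zero by $P(2)$.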

Hence by mathematical induction, $P(n)$ is true for $n = 2$, $3$, $\dots$.

Juliamisto

The proof uses the fact that the determinant of a square matrix can be computed by expansion about the first two columns: $$ \begin{aligned} \det {(A)} = \sum_{1 \leq i < k \leq n} {\det { \begin{bmatrix} [A]_{i,1} & [A]_{i,2} \\ [A]_{k,1} & [A]_{k,2} \\ \end{bmatrix} } (-1)^{i+k+1+2} \det {(A(i,k|1,2))}}. \end{aligned} $$ Here is a proof of this fact, which uses just the definition.
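
To see what this expansion says in the smallest case, take $n = 3$; then $A(i,k|1,2)$ is the $1 \times 1$ matrix that remains after deleting rows $i$, $k$ and columns $1$, $2$, and the formula becomes $$ \det {(A)} = \det {\begin{bmatrix} [A]_{1,1} & [A]_{1,2} \\ [A]_{2,1} & [A]_{2,2} \\ \end{bmatrix}} [A]_{3,3} - \det {\begin{bmatrix} [A]_{1,1} & [A]_{1,2} \\ [A]_{3,1} & [A]_{3,2} \\ \end{bmatrix}} [A]_{2,3} + \det {\begin{bmatrix} [A]_{2,1} & [A]_{2,2} \\ [A]_{3,1} & [A]_{3,2} \\ \end{bmatrix}} [A]_{1,3}, $$ which is exactly the expansion about the third column.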

I will prove that $P(n)$ (in the description of the question) is true for $n = 2$, $3$, $\dots$ by mathematical induction.

It is easy to check that $P(2)$ is true.

Suppose that $P(n-1)$ is true (where $n \geq 3$). I will prove that $P(n)$ is true under the hypothesis.

Let $A = [a_1, a_2, \dots, a_n]$ be an $n \times n$ matrix with two equal columns.

As is mentioned in the description of the question, it suffices to prove that the determinant of a square matrix with two adjacent equal columns is zero.

Let $p$ be a positive integer less than $n$. Put $q = p + 1$. Suppose that $[A]_{i,p} = [A]_{i,q}$ for $i = 1$, $2$, $\dots$, $n$.

Suppose first that $p > 1$. Then columns $p-1$, $p$ of $A(i|1)$ are equal. By the hypothesis, $\det {(A(i|1))} = 0$. Hence $$ \begin{aligned} \det {(A)} = {} & \sum_{i = 1}^{n} {(-1)^{i+1} [A]_{i,1} \det {(A(i|1))}} \\ = {} & \sum_{i = 1}^{n} {(-1)^{i+1} [A]_{i,1} \,0} \\ = {} & 0. \end{aligned} $$
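
For instance, when $n = 3$, $p = 2$ and $q = 3$ (columns $2$ and $3$ equal), expansion about column $1$ gives $$ \det {\begin{bmatrix} x & u & u \\ y & v & v \\ z & w & w \\ \end{bmatrix}} = x \det {\begin{bmatrix} v & v \\ w & w \\ \end{bmatrix}} - y \det {\begin{bmatrix} u & u \\ w & w \\ \end{bmatrix}} + z \det {\begin{bmatrix} u & u \\ v & v \\ \end{bmatrix}} = 0, $$ each minor $A(i|1)$ having two equal adjacent columns.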

Suppose then that $p = 1$. Then $$ \begin{aligned} \det {(A)} = {} & \sum_{1 \leq i < k \leq n} {\det { \begin{bmatrix} [A]_{i,1} & [A]_{i,2} \\ [A]_{k,1} & [A]_{k,2} \\ \end{bmatrix} } (-1)^{i+k+1+2} \det {(A(i,k|1,2))}} \\ = {} & \sum_{1 \leq i < k \leq n} {\det { \begin{bmatrix} [A]_{i,1} & [A]_{i,1} \\ [A]_{k,1} & [A]_{k,1} \\ \end{bmatrix} } (-1)^{i+k+1+2} \det {(A(i,k|1,2))}} \\ = {} & \sum_{1 \leq i < k \leq n} {0\, (-1)^{i+k+1+2} \det {(A(i,k|1,2))}} \\ = {} & 0. \end{aligned} $$

Hence by mathematical induction, $P(n)$ is true for $n = 2$, $3$, $\dots$.

Juliamisto

The proof is based on the fact that the determinant of a square matrix can be computed by expansion about the first row.

I will prove that $P(n)$ (in the description of the question) is true for $n = 2$, $3$, $\dots$ by mathematical induction.

It is easy to check that $P(2)$ is true.

Suppose that $P(n-1)$ is true (where $n \geq 3$). I will prove that $P(n)$ is true under the hypothesis.

Let $A = [a_1, a_2, \dots, a_n]$ be an $n \times n$ matrix with two equal columns.

As is mentioned in the description of the question, it suffices to prove that the determinant of a square matrix with two adjacent equal columns is zero.

Let $p$ be a positive integer less than $n$. Put $q = p + 1$. Suppose that $[A]_{i,p} = [A]_{i,q}$ for $i = 1$, $2$, $\dots$, $n$.

Note that $[A]_{1,p} = [A]_{1,p+1}$. Note that $A(1|p) = A(1|p+1)$. Note that for $\ell \neq p, p+1$, $A(1|\ell)$ has two equal columns. By the hypothesis, $\det {(A(1|\ell))} = 0$. Hence $$ \begin{aligned} & \det {(A)} \\ = {} & \sum_{k = 1}^{n} {(-1)^{1 + k} [A]_{1,k} \det {(A(1|k))}} \\ = {} & \hphantom{{} + {}} (-1)^{1 + p} [A]_{1,p} \det {(A(1|p))} + (-1)^{1 + p+1} [A]_{1,p+1} \det {(A(1|p+1))} \\ & + \sum_{\substack{1 \leq k \leq n \\k \neq p, p+1}} {(-1)^{1 + k} [A]_{1,k} \det {(A(1|k))}} \\ = {} & \hphantom{{} + {}} (-1)^{1 + p} [A]_{1,p} \det {(A(1|p))} + (-1)^{1 + p+1} [A]_{1,p} \det {(A(1|p))} \\ & + \sum_{\substack{1 \leq k \leq n \\k \neq p, p+1}} {(-1)^{1 + k} [A]_{1,k}\,0 } \\ = {} & (-1)^{1 + p} [A]_{1,p} \det {(A(1|p))} - (-1)^{1 + p} [A]_{1,p} \det {(A(1|p))} \\ = {} & 0. \end{aligned} $$
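
For instance, when $n = 3$ and $p = 2$ (columns $2$ and $3$ equal), the expansion about the first row gives $$ \det {\begin{bmatrix} a & d & d \\ b & e & e \\ c & f & f \\ \end{bmatrix}} = a \det {\begin{bmatrix} e & e \\ f & f \\ \end{bmatrix}} - d \det {\begin{bmatrix} b & e \\ c & f \\ \end{bmatrix}} + d \det {\begin{bmatrix} b & e \\ c & f \\ \end{bmatrix}} = a \cdot 0 - d \, (bf - ce) + d \, (bf - ce) = 0; $$ the term with $k = 1$ vanishes because $A(1|1)$ has two equal columns, and the terms with $k = 2$ and $k = 3$ cancel each other.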

Hence by mathematical induction, $P(n)$ is true for $n = 2$, $3$, $\dots$.

Juliamisto