
Interchanging two columns of a square matrix changes the sign of the determinant. I know that this is true, and I do understand how it works. But is there any proof for this statement?

4 Answers

  • The determinant of the matrix $$J = \begin{bmatrix}0&1\\1&0\end{bmatrix}$$ is $-1$.

  • The determinant of a block-diagonal matrix is the product of the block determinants: $\det\big(\operatorname{diag}(A,B)\big) = (\det A)\times(\det B)$.

  • The determinant of a product of two matrices is multiplicative: $\det(AB) = (\det A)\times(\det B)$.

Now a row swap is equivalent to left multiplication by a permutation matrix $P$:

$$ \begin{bmatrix} I_k & & & \\ &0&1& \\ &1&0& \\ & & &I_l \end{bmatrix}A = PA $$

By the second property, $\det(P) = \det(I_k)\,\det(J)\,\det(I_l)$. Then, using the first property, $\det(P) = 1\times(-1)\times 1 = -1$.

Now, using the third property, $\det(PA) = \det(P)\det(A) = -\det(A)$.

Multiplying on the right of $A$ instead, i.e. forming $AP$, corresponds to a column swap, and the same argument gives $\det(AP) = \det(A)\det(P) = -\det(A)$.
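
If it helps, here is a quick NumPy sanity check of the argument above (my own illustration, not part of the original answer; the block sizes $k=2$, $l=1$ and the random matrix $A$ are arbitrary choices):

```python
# Build P = diag(I_k, J, I_l) and check det(P) = -1, det(PA) = -det(A),
# and that multiplying on the right (AP) swaps columns instead of rows.
import numpy as np

k, l = 2, 1
J = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Block-diagonal permutation matrix P = diag(I_k, J, I_l)
P = np.block([
    [np.eye(k),        np.zeros((k, 2)), np.zeros((k, l))],
    [np.zeros((2, k)), J,                np.zeros((2, l))],
    [np.zeros((l, k)), np.zeros((l, 2)), np.eye(l)],
])

A = np.random.default_rng(0).standard_normal((k + 2 + l, k + 2 + l))

print(np.linalg.det(P))                                        # -1 (up to rounding)
print(np.isclose(np.linalg.det(P @ A), -np.linalg.det(A)))     # True: row swap flips the sign
print(np.isclose(np.linalg.det(A @ P), -np.linalg.det(A)))     # True: column swap flips it too
```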

percusse

Any proof of this result depends on a definition of determinant. Let us define it using permutations: $\det(A) = \sum_{\tau \in S_n}\operatorname{sgn}(\tau)\,a_{1,\tau(1)}a_{2,\tau(2)} \ldots a_{n,\tau(n)},\;$ where the sum is over all $n!$ permutations of the columns by elements in the symmetric group $S_n.\;$ See the question about a determinant definition.

Let $A^\sigma$ be the result of rearranging the columns of $A$ using a permutation $\sigma.\;$ This replaces all the $\tau$ in the summation by $\sigma\tau$, the product of two permutations. Now $\;\operatorname{sgn}(\sigma\tau)=\operatorname{sgn}(\sigma)\operatorname{sgn}(\tau)\;$ and, by distributivity, the common $\operatorname{sgn}(\sigma)$ comes out of the summation. Thus, $\;\det(A^\sigma)=\operatorname{sgn}(\sigma)\det(A).$

In our case, interchanging any two columns is a transposition, and every transposition has signature $-1$, so the determinant gets multiplied by $-1$, i.e. its sign changes. QED.
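
To make the permutation argument concrete, here is a small Python sketch (my own addition; `leibniz_det` and `sgn` are names I chose for illustration) that computes the determinant directly from the formula above and checks that swapping two columns flips the sign:

```python
# Determinant from the permutation (Leibniz) definition, plus a column-swap check.
from itertools import permutations
import numpy as np

def sgn(tau):
    """Signature of a permutation (given as a tuple), computed by counting inversions."""
    inversions = sum(1 for i in range(len(tau))
                       for j in range(i + 1, len(tau)) if tau[i] > tau[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """det(A) = sum over tau in S_n of sgn(tau) * a_{1,tau(1)} * ... * a_{n,tau(n)}."""
    n = A.shape[0]
    return sum(sgn(tau) * np.prod([A[i, tau[i]] for i in range(n)])
               for tau in permutations(range(n)))

A = np.random.default_rng(1).standard_normal((4, 4))
A_swapped = A[:, [1, 0, 2, 3]]   # interchange the first two columns (a transposition)

print(np.isclose(leibniz_det(A), np.linalg.det(A)))         # True: matches NumPy
print(np.isclose(leibniz_det(A_swapped), -leibniz_det(A)))  # True: the sign flips
```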

Somos

Yes, there are several proofs, though they tend to be somewhat messy and long. Here is one from a UCSD website: http://www.math.ucsd.edu/~ebender/Supplements/det.pdf

Another is on ProofWiki: https://proofwiki.org/wiki/Effect_of_Elementary_Row_Operations_on_Determinant

BR Pahari

Heuristically: if you take the determinant to be the signed volume of the image of the unit cube in $\mathbb R^n$ under a linear transformation, then switching columns reverses the orientation of that image, so you should expect the determinant to change sign.

My favorite reason is that the determinant is a group homomorphism on invertible matrices, and in fact is multiplicative in general: $\det(AB)=\det(A)\det(B)$.

In particular, switching columns $i$ and $j$ corresponds to right multiplication by an elementary matrix $T_{ij}$, and $\det(T_{ij})=-1$, so we have $$\det(A T_{ij})=\det(A)\det(T_{ij})=-\det(A).$$

The result can be shown straight from the Leibniz formula, but I think these proofs are kind of hard to read and a little ugly.
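
For what it's worth, here is a short numerical check of the elementary-matrix argument (my own sketch; here $T_{ij}$ is taken to be the identity matrix with columns $i$ and $j$ interchanged, and the $5\times 5$ size is arbitrary):

```python
# Check that right multiplication by T_ij swaps columns i and j and flips the determinant's sign.
import numpy as np

def T(n, i, j):
    """Elementary matrix T_ij: the n x n identity with columns i and j interchanged."""
    E = np.eye(n)
    E[:, [i, j]] = E[:, [j, i]]
    return E

A = np.random.default_rng(2).standard_normal((5, 5))
Tij = T(5, 1, 3)

print(np.linalg.det(Tij))                                      # -1 (up to rounding)
print(np.allclose(A @ Tij, A[:, [0, 3, 2, 1, 4]]))             # True: A @ T_ij swaps columns 1 and 3
print(np.isclose(np.linalg.det(A @ Tij), -np.linalg.det(A)))   # True: sign flips
```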

Andres Mejia