How can I prove that a swap matrix (a matrix which, when it multiplies another matrix, swaps a pair of that matrix's rows or columns) has determinant $-1$?
-
Welcome to stackexchange. You are more likely to get answers rather than votes to close or downvotes if you edit the question to show what you tried and where you are stuck. Hint: write down the swap matrices in the $2 \times 2$ case, and a swap matrix for $3 \times 3$. – Ethan Bolker Mar 17 '18 at 14:34
-
Can you clarify the definition of swap matrix? If you take the null matrix and multiply it by an appropriate elementary matrix you have a counterexample. – Git Gud Mar 17 '18 at 14:43
-
I believe that these are more commonly referred to as “permutation matrices.” See https://math.stackexchange.com/questions/473240/determinant-of-permutation-matrix-elementary-matrix-of-type-2 – user328442 Mar 17 '18 at 15:02
1 Answer
There are a few ways to define the determinant, and the proof changes accordingly.
Eigenvalues
The determinant is defined to be the product of the (complex) eigenvalues, each raised to the power of its algebraic multiplicity (the dimension of the corresponding generalised eigenspace).
Let $e_i$ be the $i$th standard basis vector. In this case, if the $n \times n$ swap matrix swaps the $i$th and $j$th column ($i < j$), then the vectors $$e_1, \ldots, e_{i-1}, e_{i+1}, \ldots, e_{j-1}, e_{j+1}, \ldots, e_n$$ are all linearly independent eigenvectors corresponding to eigenvalue $1$, as is the vector $e_i + e_j$. Meanwhile, the vector $e_i - e_j$ is an eigenvector corresponding to eigenvalue $-1$.
Hence, we have eigenvalue $1$ with multiplicity at least $n - 1$, and eigenvalue $-1$ with multiplicity at least $1$. Since we have found at least $n$ linearly independent eigenvectors, this accounts for all the eigenvalues and their multiplicities. Hence, the determinant is $1^{n - 1} \cdot (-1)^1 = -1$.
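For instance, with $n = 3$ and the swap of the second and third coordinates, the eigenvectors above are easy to check directly: $$S = \begin{pmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{pmatrix}, \qquad S e_1 = e_1, \qquad S(e_2 + e_3) = e_2 + e_3, \qquad S(e_2 - e_3) = -(e_2 - e_3),$$ giving $\det S = 1 \cdot 1 \cdot (-1) = -1$.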
Cofactor Expansion
We can use the fact that cofactor expansions can be made along any row or column, with an appropriate change of sign. Note that the diagonal entries are always counted positively. Note also that, expanding along a row that isn't involved in the swap, the diagonal $1$ is the only non-zero entry, and the corresponding minor is the determinant of a swap matrix one dimension smaller. For example, expanding along the 3rd row: $$\begin{vmatrix}1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0\end{vmatrix} = \begin{vmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{vmatrix}.$$ Reducing inductively, you get down to the $2 \times 2$ case, which can be computed directly.
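For completeness, the $2 \times 2$ base case that the induction bottoms out at is the single swap matrix $$\begin{vmatrix}0 & 1 \\ 1 & 0\end{vmatrix} = 0 \cdot 0 - 1 \cdot 1 = -1.$$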
Leibniz Formula
The determinant of an $n \times n$ matrix $(a_{i,j})_{i,j = 1}^n$ can be defined as follows: $$\sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)},$$ where $\operatorname{sgn}(\sigma)$ returns $1$ when $\sigma$ is even, and $-1$ when $\sigma$ is odd. Note that the swap matrix is the permutation matrix of a transposition $\tau$: $$a_{i,j} = \delta_{\tau(i), j},$$ where $\delta_{i, j}$ returns $1$ when $i = j$ and $0$ otherwise (i.e. the entries of the identity matrix). Reindexing the product by $i \mapsto \tau(i)$ (and using $\tau^{-1} = \tau$) gives $$\prod_{i=1}^n a_{i, \sigma(i)} = \prod_{i=1}^n \delta_{\tau(i), \sigma(i)} = \prod_{i=1}^n \delta_{i, \sigma \circ \tau(i)}.$$ Moreover, composing a permutation with the transposition $\tau$ changes its parity, so $\operatorname{sgn}(\sigma) = -\operatorname{sgn}(\sigma \circ \tau)$. Hence, \begin{align*}\sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)} &= \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n \delta_{i, \sigma \circ \tau (i)} \\ &= - \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma \circ \tau) \prod_{i=1}^n \delta_{i, \sigma \circ \tau (i)} \\ &= - \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n \delta_{i, \sigma(i)} \\ &= - \operatorname{det}(I) = -1, \end{align*} where the second-to-last equality comes from relabelling $\sigma \circ \tau$ as $\sigma$, since $\sigma \circ \tau$ runs over all of $S_n$ as $\sigma$ does.
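As a concrete illustration, take the $3 \times 3$ swap matrix above (so $\tau$ swaps $2$ and $3$). The only permutation $\sigma$ for which every factor $a_{i, \sigma(i)}$ is non-zero is $\sigma = \tau$ itself, so the Leibniz sum collapses to a single term: $$\begin{vmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{vmatrix} = \operatorname{sgn}(\tau)\, a_{1,1} a_{2,3} a_{3,2} = (-1) \cdot 1 \cdot 1 \cdot 1 = -1.$$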

-
For the record, I don't really like either of the answers to the linked question, as I see the result as somewhat circular. The typical way to work with the elementary row operations is to treat them as multiplication on the left by elementary matrices. It's from this that we typically derive the results about how elementary row operations affect determinants. We know that a row swap negates a determinant precisely because the row swap matrices have determinant $-1$. – Theo Bendit Mar 17 '18 at 15:22