
I have tried to prove, using multilinear forms, the well-known fact that the determinant of a matrix $[A]$ changes sign if any two columns of $[A]$ are interchanged. I am not confident that my reasoning is correct, however, and would appreciate a nod or a refutation. Thanks.

(Disclaimer: I am confident that if my reasoning is correct, then it is unlikely that I am presenting anything novel here. However, I have not found the argument described below, which uses multilinear forms, on math.stackexchange.com in response to similar queries, and therefore thought of seeking a review.)


Here goes the reasoning:

The definition of the determinant ($\det$) of a linear operator on a finite-dimensional vector space ($\S 53$ of Finite-Dimensional Vector Spaces, $2^{\text{nd}}$ Ed., by Paul R. Halmos) implies the following.

If $A$ is any linear operator on any $n$-dimensional vector space $\mathcal V$, if $\{x_1, \cdots, x_n\}$ is any basis in $\mathcal V$, and if $w$ is any alternating $n$-linear form on $\mathcal V$, then $$\tag{1}\det \ A = \frac{w(Ax_1, \cdots, Ax_n)}{w(x_1, \cdots, x_n)}.$$
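For concreteness, here is what (1) gives in the smallest nontrivial case; the coefficient names $\alpha_{ij}$ are introduced only for this illustration, with column $j$ of $[A]$ recording the coordinates of $Ax_j$. Take $n = 2$ and write $Ax_1 = \alpha_{11}x_1 + \alpha_{21}x_2$ and $Ax_2 = \alpha_{12}x_1 + \alpha_{22}x_2$. Bilinearity of $w$, together with $w(x_1, x_1) = w(x_2, x_2) = 0$ and $w(x_2, x_1) = -w(x_1, x_2)$, gives $$w(Ax_1, Ax_2) = (\alpha_{11}\alpha_{22} - \alpha_{12}\alpha_{21})\, w(x_1, x_2),$$ so (1) recovers the familiar formula for a $2 \times 2$ determinant.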

Under the hypothesis of (1), suppose some matrix $[A]$ together with $\{x_1, \cdots, x_n\}$ defines $A$. Suppose $[A_1]$ is the matrix obtained from $[A]$ by interchanging two of its columns, with indices $h$ and $k$ ($h \neq k$, $1 \leq h \leq n$, $1 \leq k \leq n$). Let $A_1$ be the operator defined by $[A_1]$ together with $\{x_1, \cdots, x_n\}$.

It follows from the definition of the matrix of a linear transformation ($\S 37$ of the same book) that $A_1 x_h = A x_k$, $A_1 x_k = A x_h$, and $A_1 x_i = A x_i$ for every $i$ other than $h$ and $k$. Together with the skew-symmetry of $w$ ($\S 30$ of the same book), this implies that $$w(A_1 x_1, \cdots, A_1 x_n)= -w(A x_1, \cdots, A x_n).$$
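Explicitly, the tuple $(A_1x_1, \cdots, A_1x_n)$ is the tuple $(Ax_1, \cdots, Ax_n)$ with the entries in positions $h$ and $k$ exchanged, $$(A_1x_1, \cdots, A_1x_n) = (Ax_1, \cdots, \overbrace{Ax_k}^{\text{position } h}, \cdots, \overbrace{Ax_h}^{\text{position } k}, \cdots, Ax_n),$$ and interchanging two arguments of a skew-symmetric form changes its sign. (The overbraces are only position markers added here for readability.)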

Accordingly, we have $$ \det \ A_1 = \frac{w(A_1x_1, \cdots, A_1x_n)}{w(x_1, \cdots, x_n)} = \frac{-w(Ax_1, \cdots, Ax_n)}{w(x_1, \cdots, x_n)} = -\det \ A.$$

Since the determinant of a matrix is the determinant of the linear operator that it defines (together with a basis), it follows that $\det\ [A_1] = -\det\ [A]$.
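As a purely numerical sanity check (the entries are chosen arbitrarily for illustration), swapping the two columns of a $2 \times 2$ matrix does flip the sign: $$\det \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = 1 \cdot 4 - 2 \cdot 3 = -2, \qquad \det \begin{pmatrix} 2 & 1 \\ 4 & 3 \end{pmatrix} = 2 \cdot 3 - 1 \cdot 4 = 2.$$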

  • What does "skew-symmetric nature of $w$" mean? (Usually skew-symmetry is defined for matrices, not for forms.) Anyway, if you know that permuting two columns of $[A]$ changes the sign of $w$, then your proof looks good. – mihaild Dec 24 '21 at 14:41
  • In my experience, $\det$ is defined as the unique element of $\Lambda^n((\mathbb{R}^n)^*)$ with $\det(e_1, \dots, e_n) = 1$. In other words, $\det$ is the unique alternating multilinear map from $M(n, \mathbb{R})$ to $\mathbb{R}$ that satisfies $\det(I) = 1$. So the alternating property is part of the definition of $\det$. – Mason Dec 31 '21 at 01:33
  • @mihaild For reference, $\S 30.$ Alternating forms of Finite-Dimensional Vector Spaces (Second Edition) by Paul R. Halmos defines skew-symmetry for $n$-linear forms. – AMathStudent Dec 31 '21 at 05:37

1 Answer


Yay, that is my favourite definition of the determinant haha.

Your reasoning is all right, but you could be more precise. Suppose, for example, that you swapped columns $1$ and $2$. Then you can show that $A_1x_1=Ax_2$, that $A_1x_2=Ax_1$, and that $A_1x_i=Ax_i$ for each $i\neq 1,2$. The same goes for any two adjacent columns $j, j+1$.

Jackozee Hakkiuz