
Let $k \leq n$ denote a pair of fixed but arbitrary natural numbers.

Definition 0. Write $\varphi$ for the unique $\mathbb{R}$-linear function $$\Lambda^k\mathbb{R}^n \rightarrow \mathbb{R}$$ such that if $e$ is an element of the standard basis for $\Lambda^k\mathbb{R}^n,$ then $\varphi(e) = 1.$

Definition 1. Given a sequence $x$ of $k$ vectors in $\mathbb{R}^n$, define the determinant of $x$ by $$\mathrm{det}(x) = \varphi(x_0 \wedge \ldots \wedge x_{k-1}).$$
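To make Definition 1 concrete: expanding the wedge product in the standard basis of $\Lambda^k\mathbb{R}^n$ shows that $\mathrm{det}(x)$ is the sum of all $k \times k$ minors of the $n \times k$ matrix whose columns are the terms of $x$. Here is a quick sketch of that in plain Python (my own code, just for illustration):

```python
from itertools import combinations

def det(M):
    # Determinant of a small square matrix (list of rows), Laplace expansion.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def gen_det(cols):
    """phi(x_0 ^ ... ^ x_{k-1}) for k column vectors in R^n.

    Expanding the wedge in the standard basis of Lambda^k R^n shows this
    equals the sum of all k x k minors of the n x k matrix with columns x_i.
    """
    n, k = len(cols[0]), len(cols)
    assert k <= n
    return sum(det([[cols[j][i] for j in range(k)] for i in rows])
               for rows in combinations(range(n), k))

print(gen_det([[1, 3], [2, 4]]))  # -2: columns (1,3), (2,4); the usual det
print(gen_det([[1, 2, 3]]))       # 6: k = 1 gives the sum of the entries
```

Note that for $k = n$ there is only one $k \times k$ minor, so this recovers the ordinary determinant.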

I imagine that this has the following geometric interpretation: $\mathrm{det}(x)$ is (probably) the signed $k$-area of the $k$-dimensional parallelepiped generated by the terms of $x$. If this actually works as expected, then it gives us a way to find the determinant of a non-square matrix, by regarding the columns of that matrix as vectors. Note, however, that this generalized determinant cannot possibly satisfy the requirement that for all matrices $A$ and $B$ such that the composite $AB$ is well-defined, we have $\mathrm{det}(AB) = \mathrm{det}(A)\mathrm{det}(B).$ See here for a counterexample.
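Since the linked counterexample may rot, here is one concrete counterexample of the kind I have in mind, worked numerically (the matrices are my own choice): for a square matrix the generalized determinant is the usual one, and for a single column ($k = 1$) it is the sum of the entries.

```python
# A is square, B is a single column; both generalized determinants are
# easy to evaluate by hand from Definitions 0 and 1:
#   * for a square matrix, det(x) is the usual determinant;
#   * for a single column (k = 1), det(x) is the sum of the entries.
A = [[2, 0], [0, 1]]                     # gen-det(A) = 2 (usual determinant)
B = [1, 1]                               # gen-det(B) = 1 + 1 = 2
det_A, det_B = 2, 2

AB = [A[0][0] * B[0] + A[0][1] * B[1],
      A[1][0] * B[0] + A[1][1] * B[1]]   # the column (2, 1)
det_AB = AB[0] + AB[1]                   # = 3

print(det_AB, det_A * det_B)             # 3 vs. 4: not multiplicative
```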

Anyway:

Question. Is this a thing? If so, where can I learn more?

goblin GONE
    What makes sense, as far as I know, is the minor determinants you can extract from an $n \times p$ matrix. They can serve as coordinates in Grassmannians $G(n,k)$ (see the definition on Wikipedia), the most ancient use of them being Plücker coordinates. Look also at what is called "Schubert calculus", etc. By the way, I just found a good lecture on stuff around your question: http://brickisland.net/cs177/?p=200 – Jean Marie Feb 07 '16 at 23:08
  • You should also have a look at what is called "geometric algebra" which is a re-interpretation of many usual geometric entities. Another keyword is:"Linear Multiport" with applied orientations ; I enjoyed for example a conference paper entitled "Differentiable manifolds and Engineering Black-box models for Linear Physical systems" by Rainer PAULI (MTNS 2000, Perpignan, France) that can be found at www.math.ucsd.edu/~helton/.../CDROM/.../B263.pdf – Jean Marie Feb 08 '16 at 08:20

1 Answer


Note that if $k = 1$ then your linear functional is $\varphi \colon \mathbb{R}^n \rightarrow \mathbb{R}$ given by $\varphi(x^1, \ldots, x^n) = x^1 + \ldots + x^n$, and it doesn't give the one-dimensional signed area (length, in this case) of $(x^1, \ldots, x^n)$. In general, you can't expect to describe the "signed length" of a vector $x \in \mathbb{R}^n$ by a linear functional $\varphi \colon \mathbb{R}^n \rightarrow \mathbb{R}$, since any such functional has a kernel of dimension $\geq n - 1$.

However, there is a construction that generalizes the (absolute value of the) determinant in some sense and results in the non-signed $k$-area of the $k$-dimensional parallelepiped generated by the vectors $v_1, \ldots, v_k$. Let $V$ be a finite dimensional vector space and endow $V$ with an inner product $\left< \cdot, \cdot \right>$ so that you can talk about lengths of vectors in $V$. The inner product $\left< \cdot, \cdot \right>$ extends naturally to an inner product on $\Lambda^k(V)$ defined on simple $k$-vectors by

$$ \left< v_1 \wedge \dots \wedge v_k, w_1 \wedge \dots \wedge w_k \right> := \det \left( \left< v_i, w_j \right> \right)_{i,j=1}^k $$

and extended bilinearly. The matrix $G(v_1, \dots, v_k) = ( \left< v_i, v_j \right>)_{i,j=1}^k$ is called the Gram matrix of $(v_1, \dots, v_k)$, and the norm $||v_1 \wedge \dots \wedge v_k|| = \sqrt{\det G(v_1, \dots, v_k)}$ gives the unsigned $k$-area of the $k$-dimensional parallelepiped generated by the vectors $v_1, \dots, v_k$; it is zero if and only if the vectors $v_1, \dots, v_k$ are linearly dependent (in which case $v_1 \wedge \dots \wedge v_k = 0$).
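For example, in $\mathbb{R}^3$ with $k = 2$, $\sqrt{\det G(v_1, v_2)}$ is the area of the parallelogram spanned by $v_1$ and $v_2$ (the same number you would get from $||v_1 \times v_2||$). A small numeric sketch in plain Python (my own helper names):

```python
def dot(u, v):
    # Standard inner product on R^n.
    return sum(a * b for a, b in zip(u, v))

def parallelogram_area(v1, v2):
    """sqrt(det G(v1, v2)) for the 2 x 2 Gram matrix G."""
    g11, g12, g22 = dot(v1, v1), dot(v1, v2), dot(v2, v2)
    return (g11 * g22 - g12 * g12) ** 0.5

v1, v2 = [1, 0, 0], [1, 2, 0]
print(parallelogram_area(v1, v2))  # 2.0: base 1, height 2 in the xy-plane
```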

If $V = \mathbb{R}^n$ with the standard inner product and we regard the vectors $v_1, \ldots, v_k$ as the columns of a matrix $A \in M_{n \times k}(\mathbb{R})$, then $G(v_1, \ldots, v_k) = A^T A$ and $||v_1 \wedge \dots \wedge v_k||^2 = \det(A^T A)$. In particular, if $k = n$ then $||v_1 \wedge \dots \wedge v_n||^2 = \det(A^T A) = \det(A)^2$, so $||v_1 \wedge \dots \wedge v_n|| = |\det(A)|$.
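As a quick sanity check of this identity (and of the Cauchy–Binet formula, which rewrites $\det(A^T A)$ as the sum of the squared $k \times k$ minors of $A$), here is a small computation for a $3 \times 2$ example (my own code):

```python
from itertools import combinations

def det2(M):
    # Determinant of a 2 x 2 matrix given as a list of rows.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# A is 3 x 2 with columns v1 = (1, 2, 0) and v2 = (0, 1, 3).
A = [[1, 0], [2, 1], [0, 3]]
AtA = [[sum(A[i][r] * A[i][c] for i in range(3)) for c in range(2)]
       for r in range(2)]
gram_det = det2(AtA)

# Cauchy-Binet: det(A^T A) equals the sum of the squared 2 x 2 minors of A.
minor_sq_sum = sum(det2([A[i], A[j]]) ** 2 for i, j in combinations(range(3), 2))
print(gram_det, minor_sq_sum)   # 46 46; sqrt(46) is ||v1 ^ v2||
```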

A construction of a different nature, which generalizes the determinant including signs, is the $k$-th exterior power of a linear map $T \colon V \rightarrow W$, resulting in a linear map $\Lambda^k(T) \colon \Lambda^k(V) \rightarrow \Lambda^k(W)$. If $V = W$ and $k = \dim V = n$, then $\Lambda^n(V)$ is one-dimensional, so $\Lambda^n(T) \colon \Lambda^n(V) \rightarrow \Lambda^n(V)$ is multiplication by a single scalar, and that scalar is precisely the determinant of $T$. If you apply this to a non-square matrix (interpreted as a linear map) $A \in M_{l \times n}$, then the components of $\Lambda^k(A)$ with respect to the bases induced on $\Lambda^k(\mathbb{R}^n)$ and $\Lambda^k(\mathbb{R}^l)$ by the standard bases are the $k \times k$ minors of $A$. You can learn more about this from the lecture notes of Paul Garrett here.
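For a small illustration, here is a sketch (my own code) that builds the matrix of $\Lambda^2(A)$ in the induced bases directly from the $2 \times 2$ minors; for a square $2 \times 2$ matrix it is $1 \times 1$ and its single entry is $\det(A)$:

```python
from itertools import combinations

def det2(M):
    # Determinant of a 2 x 2 matrix given as a list of rows.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def wedge2(A):
    """Matrix of Lambda^2(A) in the induced bases: the 2 x 2 minors of A,
    with rows indexed by pairs of rows of A and columns by pairs of
    columns of A, both in lexicographic order."""
    m, n = len(A), len(A[0])
    return [[det2([[A[i][p], A[i][q]], [A[j][p], A[j][q]]])
             for p, q in combinations(range(n), 2)]
            for i, j in combinations(range(m), 2)]

print(wedge2([[1, 2], [3, 4]]))          # [[-2]]: the usual determinant
print(wedge2([[1, 0], [2, 1], [0, 3]]))  # [[1], [3], [6]]: the 2 x 2 minors
```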

levap
  • The norm on $\Lambda^k \mathbb{R}^n$ is computed in the obvious way, right? For example, if $k=2$ and $n=3$, does the following hold? $$|a(e_1 \wedge e_2)+b(e_1\wedge e_3)+c(e_2 \wedge e_3)| = \sqrt{a^2+b^2+c^2}$$ – goblin GONE Feb 14 '16 at 13:21
  • Yes. If you start with an orthonormal basis $(v_1, \dots, v_n)$ for $V$, then the $v_{i_1} \wedge \dots \wedge v_{i_k}$ for $i_1 < \dots < i_k$ form an orthonormal basis for $\Lambda^k(V)$, so given a general vector $v \in \Lambda^k(V)$ whose coefficients with respect to the $v_{i_1} \wedge \dots \wedge v_{i_k}$ you know, you compute the norm of $v$ using the Pythagorean theorem. In your case, if you work with the standard inner product on $\mathbb{R}^n$ and the $e_i$ are the standard basis vectors, you indeed obtain your formula. – levap Feb 15 '16 at 15:49
  • Sweet. Thanks! – goblin GONE Feb 16 '16 at 11:54
  • This does not give the sign of this generalized determinant; it is always nonnegative. Can the sign be defined? – shuhalo Sep 15 '22 at 08:07