
A standard, well-established way to construct the determinant is to first construct the exterior powers $\Lambda^p(V)$ and then observe that an endomorphism $A: V \rightarrow V$ induces the map $v_1 \wedge \ldots \wedge v_n \mapsto A(v_1) \wedge \ldots \wedge A(v_n)$ on $\Lambda^n(V)$; since $\Lambda^n(V)$ is one-dimensional, this induced map is multiplication by a scalar $D$, which is by definition the determinant of $A$.
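(As a quick numerical sanity check of the scalar-factor claim, here is a short numpy sketch; the variable names are mine. The single coordinate of $v_1 \wedge \ldots \wedge v_n$ in $\Lambda^n(\mathbb{R}^n)$ is $\det[v_1 \cdots v_n]$.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # an endomorphism of R^n
V = rng.standard_normal((n, n))   # columns are v_1, ..., v_n

# Coordinate of A(v_1) ∧ ... ∧ A(v_n) in the 1-dimensional space Λ^n(R^n):
lhs = np.linalg.det(A @ V)
# The scalar D (= det A) times the coordinate of v_1 ∧ ... ∧ v_n:
rhs = np.linalg.det(A) * np.linalg.det(V)
assert np.isclose(lhs, rhs)
```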

This is all fine, but in the case of a general vector space $V$, what rationale do we have for constructing $\Lambda^p(V)$ in the first place?

For $\mathbb{R}^n$ with an inner product and the induced norm and metric, all linear isometries are orthogonal transformations, and one can show, using e.g. algebraic topology, that $O(n)$ has exactly two connected components, as discussed in this excellent answer. We naturally interpret these as the two classes of orientations.

However, even in a metric space $(\mathbb{R}^n,d)$ with $d$ induced by an arbitrary norm, the isometry group is in general not $O(n)$ and may have a different number of connected components. At this point, the notion of 'orientation' becomes ambiguous and ceases to 'naturally lead us' to constructing $\Lambda^p(V)$.

Are there alternative, deeper, more revealing ways to think about, motivate, and construct $\Lambda^p(V)$ for general $V$ as a first step in constructing the determinant, other than the mechanical and rather uninformative effort to construct a 'volume with generalized orientation'?

user

2 Answers


I think that there are a number of reasons to define $\bigwedge^p V$ for arbitrary vector spaces $V$, over arbitrary fields (and even for modules over commutative rings).

(1) First of all, even though these may not be the deepest motivations, defining wedge powers gives one of the ways to develop determinants; and for vector spaces over $\mathbb{R}$, wedge powers give one of the ways to talk about orientations. Obviously those are important topics, for you and for everyone else.

(2) Second, experience shows that tensors play an important role in many parts of mathematics, physics, engineering... It is then natural to identify "special" kinds of tensors. One obvious class is symmetric tensors, those tensors which remain invariant after a permutation of indices. For example $v \otimes w$ is not symmetric (if $v$ and $w$ are linearly independent), since switching indices changes it to $w \otimes v$; but $v \otimes w + w \otimes v$ is symmetric, since switching indices changes it to $w \otimes v + v \otimes w$, which is the same thing. Symmetric tensors are a natural generalization of symmetric matrices. It turns out that symmetric tensors can be identified with (homogeneous) polynomials. (Symmetric matrices correspond to polynomials of degree $2$, i.e., quadratic forms.) This is evidence for the idea that "special" kinds of tensors can actually be natural and important objects (even if most "special" kinds of tensors don't quite rise to the same level of stardom as polynomials).

After symmetric tensors, a reasonable next candidate for a natural and important class of "special" tensors is the alternating tensors, those tensors which change sign upon interchanging any two indices. For example, $v \otimes w - w \otimes v$ is alternating, since switching indices changes it to $w \otimes v - v \otimes w$, which is the opposite of what we started with. These alternating tensors are a natural generalization of skew-symmetric matrices. And... they are essentially the same as the wedge products. This $v \otimes w - w \otimes v$ could be identified with $v \wedge w$.
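(For order-$2$ tensors, the symmetric and alternating classes are just symmetric and skew-symmetric matrices, which a few lines of numpy make concrete. This is only an illustrative sketch; note that the independent entries of the alternating tensor are the $2 \times 2$ minors of the matrix $[v \; w]$.)

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 1.0, -1.0])

sym = np.outer(v, w) + np.outer(w, v)  # v⊗w + w⊗v: symmetric tensor
alt = np.outer(v, w) - np.outer(w, v)  # v⊗w - w⊗v: alternating tensor, i.e. v ∧ w

assert np.allclose(sym, sym.T)         # unchanged when the two indices swap
assert np.allclose(alt, -alt.T)        # changes sign when the two indices swap
# The entries of alt above the diagonal are the 2x2 minors of the 3x2 matrix [v w]:
assert np.isclose(alt[0, 1], v[0] * w[1] - v[1] * w[0])
```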

The preceding paragraphs have been an attempt to justify the claim that the definition of the wedge product is a very reasonable and natural idea, just a small variation on symmetric tensors. There are all kinds of generalizations, especially in representation theory, where the symmetric and alternating tensors provide two representations of the symmetric groups, and similar ideas can be used to construct more representations. So perhaps this provides (3) a third motivation for wedge powers: in order to study representations of permutation groups.

(4) Let $V$ be any vector space over a field $k$. There is a natural correspondence between $p$-dimensional subspaces of $V$ and certain elements of $\bigwedge^p V$, as follows. A $p$-dimensional subspace $W \subseteq V$ with basis $\{w_1,\dotsc,w_p\}$ corresponds to the wedge product $w_1 \wedge \dotsb \wedge w_p$. The definition of wedge product is cooked up just right so that a change of basis in $W$ leaves the element $w_1 \wedge \dotsb \wedge w_p$ unchanged except for a scalar factor. Now $\bigwedge^p V$ is a natural vector space that, up to projectivizing, contains a subset that parametrizes the $p$-dimensional subspaces of $V$. It's easy to see that it's not any old subset; it's an algebraic variety, i.e., defined by polynomial equations. Over $\mathbb{R}$ or $\mathbb{C}$ that means it's closed in the Euclidean topology and compact; over any field, we can use all the methods of algebraic geometry to study and exploit this variety. The variety is called the Grassmannian. (And by the way, the defining equations can be described in terms of determinants.)
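(A small numerical illustration of the "unchanged except for a scalar factor" claim: representing $w_1 \wedge \dotsb \wedge w_p$ by its coordinates, namely the $p \times p$ minors of the matrix $[w_1 \cdots w_p]$, a change of basis multiplies every coordinate by the determinant of the change-of-basis matrix. The helper name `pluecker` is mine.)

```python
import numpy as np
from itertools import combinations

def pluecker(M):
    """Coordinates of w_1 ∧ ... ∧ w_p for the columns of the n x p matrix M:
    all p x p minors, one for each choice of p rows."""
    n, p = M.shape
    return np.array([np.linalg.det(M[list(rows), :])
                     for rows in combinations(range(n), p)])

rng = np.random.default_rng(1)
n, p = 5, 2
W = rng.standard_normal((n, p))   # columns span a p-dim subspace of R^n
C = rng.standard_normal((p, p))   # invertible change of basis within that subspace

# W @ C spans the same subspace; its wedge is det(C) times the original:
assert np.allclose(pluecker(W @ C), np.linalg.det(C) * pluecker(W))
```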

Elements of $\bigwedge^p V$ that don't correspond to subspaces, such as $v_1 \wedge w_1 + v_2 \wedge w_2$, can be interpreted as a "formal sum" of subspaces. (Perhaps that is a little naive. But my point is just that they are not relegated to meaninglessness; with a bit more care they can play a meaningful role for some questions.)

I'm sure there are many other motivations (...differential forms!...), but those four will be my list for now. Note that part of the point is to be able to deal with arbitrary fields... or even modules over commutative rings! If the only thing you will ever care about is real numbers, the matrix groups $O(n)$ and $SO(n)$, and orientations of real vector spaces, then I concede you might be less interested in this. But even then, I wonder if it can be entirely avoided? For example, starting from real numbers, it seems hard to entirely avoid complex numbers. And "orientations" of complex vector spaces are a little weird. The map $\mathbb{C}^2 \to \mathbb{C}^2$, $(v,w) \mapsto (w,v)$, is orientation preserving (as a map on $\mathbb{R}^4$). Perhaps there's some nice way to make it all work just like the real case, but I doubt it.

All of this story is told in many places. Any graduate algebra book explains wedge products, alternating tensors, and determinants. There are plenty of books about representation theory and symmetric groups. Keith Conrad's expository papers include one on exterior powers.

I get the impression that your primary interests are in Lie algebras and especially classical groups like $O(n)$ and $SO(n)$. Even if that's not exactly right, let's take a hypothetical person with that background and consider why they might be interested in determinants, wedge products, and Grassmannians.

For one thing, starting from classical cases like $O(n)$ and $SO(n)$ over $\mathbb{R}$, $U(n)$ over $\mathbb{C}$, $SL(n)$ over $\mathbb{R}$ or $\mathbb{C}$, etc., one might like to generalize to other fields, or positive characteristic. There's no decent notion of "orientation" then, so we have to recreate everything from scratch. Instead of signed volumes, we can use wedge products to do this.

Or one might be interested in representations of these groups. One idea, called "geometric representation theory", is to identify a variety (or manifold or whatever) that a group acts on, and then extract various representations from it. One idea is to let a matrix group $G$ act on a vector space $V$. Then $G$ acts on the set of $p$-dimensional subspaces of $V$. This set can naturally be identified with the set of points in a manifold called the Grassmannian. Since $G$ acts on the Grassmannian, it acts on all the cohomology groups of the Grassmannian. And it also acts on related things like the tangent bundle; secant varieties; secant varieties of the tangential variety; etc., etc.... and all the cohomology groups of all those things. That is a decent source of representations which people have studied and found to be interesting.

... Okay, I re-read your comments. You seem to be asking how one might be led to the construction of the exterior algebra. I don't know what to say. I think I'm going to refer to (2) above: there are various representations of symmetric groups, the trivial representation is clearly an important one (symmetric polynomials, symmetric functions) and the alternating representation is also clearly important; and that's the exterior algebra, right there.

I once asked one of my professors what was the motivation for trying to prove that resolutions of singularities exist; what was an application. He just looked puzzled and said that some things don't need applications or big elaborate motivations, they are just so natural that they justify themselves. I think basic, fundamental algebraic constructions like exterior algebra are like that.

Zach Teitler
  • @user Okay, sorry, I took out the criticism part of my answer. I added a few paragraphs at the end. I hope it helps. If you want to hear more of this story, I think that lots of algebra textbooks give the development more thoroughly; Keith Conrad's notes are always a gold standard; and also perhaps books on differential forms might have some more appealing applications of exterior algebra. Good luck. – Zach Teitler Nov 16 '17 at 06:50
  • As you noted, yes, I'm precisely asking how one is led to the construction of the exterior algebra, in scope of constructing the determinant. Consider how we got to topological spaces from metric spaces, as we realized that in scope of closeness all a metric does is to generate open sets. Or how we got to measure theory from volume, as we realized that all we're doing is assigning scalars to subsets. I find instances like the exchange in your anecdote unfortunate. I'm upvoting to thank you for your rich answer, but I'm still looking for a deeper story for how to get to the determinant. – user Nov 16 '17 at 11:44
  • Okay, good luck. – Zach Teitler Nov 16 '17 at 15:50

I believe exterior powers first appeared in the works of Grassmann. The idea is that certain elements of $\Lambda^p(V)$ (the decomposable ones, up to a scalar factor) correspond to $p$-dimensional subspaces of $V$, thus allowing one to turn the set of $p$-dimensional subspaces into a variety.

  • Could you please elaborate on the idea of constructing a variety of $p$-dimensional subspaces? Starting with a vector space $V$, how do we get to $\Lambda^p(V)$ with this approach? How is the determinant to be interpreted in this context? – user Nov 15 '17 at 00:04
  • @user see Section 7 in https://kconrad.math.uconn.edu/blurbs/linmultialg/extmod.pdf, at least through Remark 7.7. The (nonobvious) point is that we can turn a geometric property like linear dependence into a vanishing condition for a multiplication between vectors (the wedge product). This is analogous to the way the geometric property of orthogonality can be turned into a vanishing condition for the vector multiplication operation called the dot product (or inner product). Grassmann wanted to create an "algebra of subspaces" that allowed geometric properties to be calculated by algebra. – KCd Apr 23 '22 at 20:23
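(The vanishing condition for linear dependence mentioned in this comment is easy to check in coordinates, using the $v \otimes w - w \otimes v$ model of $v \wedge w$ from the other answer; a minimal numpy sketch:)

```python
import numpy as np

def wedge(a, b):
    # v ∧ w modeled as the alternating tensor v⊗w - w⊗v
    return np.outer(a, b) - np.outer(b, a)

v = np.array([1.0, 2.0, 3.0])
assert np.allclose(wedge(v, 2.5 * v), 0)    # dependent vectors: wedge vanishes
assert not np.allclose(wedge(v, np.array([0.0, 1.0, 0.0])), 0)  # independent: nonzero
```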