
I'm attempting to prove the following result in order to define the cross product in arbitrary dimensions.


Theorem: Given $v_1, \ldots , v_{n-1}\in \mathbb{R}^n$, there is a unique $v\in \mathbb{R}^n$ such that $$\det (u \ v_1 \ldots v_{n-1})=u\cdot v$$ for any $u\in \mathbb{R}^n$.
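
(For $n=3$ this is the familiar triple-product identity $\det(u \ v_1 \ v_2)=u\cdot (v_1\times v_2)$, so the theorem recovers $v=v_1\times v_2$.)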


My attempt:

Case 1: the vectors $v_1, \ldots ,v_{n-1}$ are linearly dependent. Then $\det (u \ v_1 \ldots v_{n-1})=0$ for every $u$, so we need $u\cdot v=0$ for every $u\in \mathbb{R}^n$, and the only possible choice for $v$ is the zero vector (which clearly works).

Case 2: the vectors $v_1, \ldots ,v_{n-1}$ are linearly independent. Uniqueness can be shown by noting that

$$\det(v_i \ v_1 \ldots v_{n-1})=0=v_i\cdot v$$

for any $1\leq i \leq n-1$, so that $v$ is orthogonal to every $v_i$. In particular, $v$ lies along the unique direction perpendicular to the $(n-1)$-dimensional hyperplane spanned by the vectors $v_1, \ldots ,v_{n-1}$. Taking a unit vector $\hat u$ in that direction, we have

$$v=\mu \hat u$$

for some $\mu \in \mathbb{R}$. Thus two vectors $v, v'$ with the hypothesized property give

$$v\cdot v = \det(v \ v_1 \ldots v_{n-1}) = v\cdot v' = \det(v' \ v_1 \ldots v_{n-1}) = v'\cdot v'$$

so that $| v | = | v' |$. Since both are scalar multiples of $\hat u$, this gives $v=\pm v'$. Yet if $v = -v'$ we would have

$$\det(w \ v_1 \ldots v_{n-1}) = w\cdot v = -w\cdot v' = -\det(w \ v_1 \ldots v_{n-1})$$

for any vector $w\in \mathbb{R}^n$, forcing $\det(w \ v_1 \ldots v_{n-1})=0$ for every $w$; that is impossible, since completing the linearly independent $v_1, \ldots ,v_{n-1}$ to a basis produces a $w$ with nonzero determinant. So $v$ must equal $v'$, as wanted.


How can I show existence for the case in which $v_1, \ldots ,v_{n-1}$ are linearly independent?

Just expand the determinant according to the row of $u$ to find a vector $v$ that works, namely the one consisting of the cofactors involving the $n-1$ known rows. If the $v_i$'s are linearly independent, at least one of the cofactors must be non-zero. – Jyrki Lahtonen Nov 17 '21 at 08:46
You are overthinking it. Let $f(u)=\det(u,v_1,\ldots,v_{n-1})$ and let $\{e_1,\ldots,e_n\}$ be the standard basis. Then $f(u)=f(\sum_iu_ie_i)=\sum_iu_if(e_i)=u\cdot(f(e_1),\ldots,f(e_n))^T$. Hence $f(u)=u\cdot v$ for some vector $v$. This proves the existence of $v$. The proof of uniqueness should be easy. – user1551 Nov 17 '21 at 09:16
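
Both comments describe the same explicit construction. Here is a minimal numerical sketch of it (assuming NumPy; the helper name generalized_cross is my own), computing the components $f(e_1),\ldots,f(e_n)$ directly:

    import numpy as np

    def generalized_cross(vs):
        # vs: array of shape (n-1, n) whose rows are v_1, ..., v_{n-1}.
        # Component i of the result is f(e_i) = det(e_i, v_1, ..., v_{n-1}),
        # with the vectors as rows of the matrix (det A = det A^T).
        vs = np.asarray(vs, dtype=float)
        n = vs.shape[1]
        v = np.empty(n)
        for i in range(n):
            e_i = np.zeros(n)
            e_i[i] = 1.0
            v[i] = np.linalg.det(np.vstack([e_i, vs]))
        return v

    # Sanity check in R^3: this should reproduce the usual cross product.
    a, b = np.array([1., 0., 0.]), np.array([0., 1., 0.])
    print(generalized_cross([a, b]))  # [0. 0. 1.]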

5 Answers


The map $\varphi : u \mapsto \det(u, v_1, \dots, v_{n-1})$ is a linear form, and it is not identically zero when $v_1, \dots ,v_{n-1}$ are linearly independent.

You then get the desired result from the representation theorem for linear forms: for any linear form $\psi$ there exists a unique $v$ such that $\psi(u) = v \cdot u$ for all $u \in \mathbb R^n$.
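
In $\mathbb R^n$ that theorem has a one-line proof, the same computation as in the comments under the question: writing $u=\sum_i u_ie_i$ in the standard basis, linearity gives $$\psi(u)=\sum_{i=1}^n u_i\,\psi(e_i)=u\cdot v \quad\text{with}\quad v:=(\psi(e_1),\ldots,\psi(e_n)),$$ and this $v$ is unique, since $u\cdot v=u\cdot v'$ for all $u$ forces $v=v'$ (take $u=v-v'$).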


Your proof of uniqueness also gives you an idea of what $v$ should be:

It should be perpendicular to the hyperplane spanned by $v_1,\dots, v_{n-1}$.

It is actually possible to construct a vector with precisely the property above.

  1. The set $v_1,\dots, v_{n-1}$ can be completed to a basis for $\mathbb R^n$ with some vector $v'$. That is, there exists some vector $v'$ such that $v_1,\dots, v_{n-1}, v'$ is a basis for $\mathbb R^n$.
  2. You can use the Gram-Schmidt procedure to construct $v$ such that $v_1,\dots, v_{n-1}, v$ is still a basis for $\mathbb R^n$ and $v$ is orthogonal to every other basis vector (a sketch of both steps follows this list).
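
A minimal sketch of both steps (assuming NumPy, with np.linalg.qr standing in for hand-rolled Gram-Schmidt; the helper name unit_normal is my own):

    import numpy as np

    def unit_normal(vs):
        # vs: array of shape (n-1, n) whose rows are the linearly
        # independent vectors v_1, ..., v_{n-1}.
        vs = np.asarray(vs, dtype=float)
        n = vs.shape[1]
        q, _ = np.linalg.qr(vs.T)          # columns of q span the hyperplane
        for e in np.eye(n):                # step 1: try each standard basis vector
            w = e - q @ (q.T @ e)          # step 2: remove its component in the span
            if np.linalg.norm(w) > 1e-12:  # this e completes the basis
                return w / np.linalg.norm(w)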

Alternatively, your entire exercise can be solved much more quickly if you already know the Riesz representation theorem.

5xum

You can also apply Laplace expansion along the first column of your matrix to get $v$ directly.
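
A minimal sketch of that expansion (assuming NumPy, and writing the vectors as rows rather than columns, which is harmless since $\det A=\det A^{\top}$; the helper name cross_by_cofactors is my own):

    import numpy as np

    def cross_by_cofactors(vs):
        # vs: array of shape (n-1, n) whose rows are v_1, ..., v_{n-1}.
        # Expanding det(u, v_1, ..., v_{n-1}) along the row of u, the
        # coefficient of u_i is (-1)^i det(vs with column i deleted).
        vs = np.asarray(vs, dtype=float)
        n = vs.shape[1]
        minors = [np.linalg.det(np.delete(vs, i, axis=1)) for i in range(n)]
        return np.array([(-1.0) ** i * m for i, m in enumerate(minors)])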

Dániel G.

The representation theorem is simple in a finite-dimensional vector space. Just notice that $\det(u,v_1,\cdots,v_{n-1})$ and $u\cdot v$ (for any fixed $v\in \mathbb{R}^n$) are both linear functions of $u\in \mathbb{R}^n$, so each is determined by its values on a basis.

Take $v$ to be a unit normal vector of $\operatorname{span}\{v_1,\cdots,v_{n-1}\}.$ Then $\{v_1,\cdots,v_{n-1},v\}$ is a basis of $\mathbb{R}^n.$ We claim that for a suitable $\lambda$ we have $$ \det(u,v_1,\cdots,v_{n-1})=u\cdot \lambda v=\lambda\, u\cdot v. $$ Both linear functions vanish on each $v_i,$ and evaluating at $u=v$ we need $$ \det(v,v_1,\cdots,v_{n-1})=\lambda\, v\cdot v=\lambda, $$ which determines $\lambda$ immediately.

Notice that $\det(u,\cdots)$ could be replaced by any linear function of $u$. That's the representation theorem in a finite-dimensional vector space equipped with an inner product.
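
A minimal sketch of the scaling step (assuming NumPy; normal is any unit normal of the span, for instance as constructed in 5xum's answer above, and the helper name cross_by_scaling is my own):

    import numpy as np

    def cross_by_scaling(vs, normal):
        # vs: rows v_1, ..., v_{n-1}; normal: a unit normal of their span.
        # lambda = det(normal, v_1, ..., v_{n-1}), and the answer's vector
        # is lambda * normal.
        vs = np.asarray(vs, dtype=float)
        lam = np.linalg.det(np.vstack([normal, vs]))
        return lam * normal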

DreamAR

Fix the standard basis $e_1,\dots,e_n$, and let $c_i$ denote the determinant's value when $u=e_i$ (the letter $v_i$ is already taken by the given vectors). By linearity, $u=\sum_{i=1}^nu_ie_i$ makes the determinant $\sum_iu_ic_i=u\cdot c$ with $c=(c_1,\dots,c_n)$, so $v=c$ works.

J.G.