For an orthogonal set of vectors $\{\vec{v}_1,\vec{v}_2,\vec{v}_3\}$ in $\mathbb{R}^{4}$, show that there is a vector $\vec{v}_4$ so that $\{\vec{v}_1,\vec{v}_2,\vec{v}_3,\vec{v}_4\}$ forms an orthogonal basis for $\mathbb{R}^{4}$.
Thanks!
It's not enough that $v_1,v_2,v_3$ be orthogonal; they must also be independent in order to be completed to an orthogonal basis. In other words, none of them may be zero: the zero vector is orthogonal to everything, but it is not independent and so is never part of a basis. (Conversely, a set of *nonzero* orthogonal vectors is automatically independent.)
Anyway, assume the vectors $v_1,v_2,v_3$ are orthogonal and independent. If you just want to prove that an orthogonal $v_4$ exists, you can invoke the general fact that every independent set of vectors can be extended to a basis; this gives a fourth independent vector $u_4$. Then apply Gram-Schmidt, or more simply, subtract from $u_4$ its orthogonal projection onto the span of $v_1,v_2,v_3$.
Explicitly, if we set $$v_4=u_4 - \frac{u_4\cdot v_1}{v_1\cdot v_1}v_1 - \frac{u_4\cdot v_2}{v_2\cdot v_2}v_2 - \frac{u_4\cdot v_3}{v_3\cdot v_3}v_3,$$ then we have a $v_4$ that is orthogonal to the span of $v_1,v_2,v_3.$ Here we are simply subtracting off the components of $u_4$ in the directions of the orthogonal vectors $v_1,v_2,v_3.$
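To make the projection step concrete, here is a small NumPy sketch; the vectors $v_1,v_2,v_3$ and $u_4$ below are hypothetical examples of my own choosing, not from the question:

```python
import numpy as np

# Hypothetical orthogonal, independent set in R^4 (my own example).
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0, 1.0])

# Any u4 independent of v1, v2, v3 will do; e4 happens to work here.
u4 = np.array([0.0, 0.0, 0.0, 1.0])

# Subtract from u4 its projection onto each v_i (the formula above).
v4 = u4 - sum((u4 @ v) / (v @ v) * v for v in (v1, v2, v3))

print(v4)   # here -0.5*e3 + 0.5*e4, orthogonal to all three
print(all(abs(v4 @ v) < 1e-12 for v in (v1, v2, v3)))   # True
```

Rescaling $v_4$ afterwards (say to $(0,0,-1,1)$) is harmless, since orthogonality is unaffected by scaling.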
That's great if you just need a proof that such a vector exists. But what if you want an algorithm to compute the vector? In $\mathbb{R}^3$ we would just use the vector cross product: $v_1\times v_2$ is guaranteed to be orthogonal to both $v_1$ and $v_2$, so it completes an orthogonal basis. There is an operation in linear algebra that generalizes the vector cross product to higher dimensions and solves exactly this kind of problem: we take the Hodge star of $v_1\wedge v_2\wedge v_3.$ See also this answer.
The Hodge star is somewhat analogous to the vector cross product in $\mathbb{R}^3,$ except that it is a unary operator on multivectors rather than a binary operator on vectors; it is the correct generalization of the cross product to higher dimensions. But let's not get bogged down in the theoretical aspects. Let's just recall how the Hodge star operates on basis multivectors: it picks out exactly those basis vectors needed to complete the volume form, with a sign:
$$ *(e_1\wedge e_2\wedge e_3) = e_4\\ *(e_1\wedge e_2\wedge e_4) = -e_3\\ *(e_1\wedge e_3\wedge e_4) = e_2\\ *(e_2\wedge e_3\wedge e_4) = -e_1. $$
Now let's compute the wedge product $v_1\wedge v_2\wedge v_3.$ A wedge product is the antisymmetrization of the components, analogous to the determinant. The rules are, the wedge of a vector with itself is 0, and wedges anticommute: $u\wedge v = -v\wedge u.$ So if $v_i=a_ie_1+b_ie_2+c_ie_3+d_ie_4,$ then we have
$$ v_1\wedge v_2\wedge v_3 =\\ (a_1e_1+b_1e_2+c_1e_3+d_1e_4)\wedge (a_2e_1+b_2e_2+c_2e_3+d_2e_4)\wedge (a_3e_1+b_3e_2+c_3e_3+d_3e_4)\\ =(a_1b_2c_3 + a_2b_3c_1 + a_3b_1c_2 - a_2b_1c_3 - a_1b_3c_2 - a_3b_2c_1)e_1\wedge e_2\wedge e_3\\ +(a_1b_2d_3 + a_2b_3d_1 + a_3b_1d_2 - a_2b_1d_3 - a_1b_3d_2 - a_3b_2d_1)e_1\wedge e_2\wedge e_4\\ +(a_1c_2d_3 + a_2c_3d_1 + a_3c_1d_2 - a_2c_1d_3 - a_1c_3d_2 - a_3c_2d_1)e_1\wedge e_3\wedge e_4\\ +(b_1c_2d_3 + b_2c_3d_1 + b_3c_1d_2 - b_2c_1d_3 - b_1c_3d_2 - b_3c_2d_1)e_2\wedge e_3\wedge e_4.\\ $$
This computation looks ridiculously long, but with actual numbers it is usually not so bad. In particular, if some of the components are zero, far fewer operations may be needed than the formula suggests.
Note that if you're more familiar with matrix algebra than exterior algebra, you can also understand the above expression as the $3\times 3$ minors of the $4\times 3$ matrix whose columns are $v_1,v_2,v_3.$
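That remark is easy to check numerically: the coefficient of $e_i\wedge e_j\wedge e_k$ is the $3\times 3$ minor on rows $i,j,k$ of that matrix. A sketch, again with hypothetical example vectors of my own choosing:

```python
import numpy as np
from itertools import combinations

# Hypothetical example vectors (my own choice).
v1 = np.array([1.0, 2.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0])
v3 = np.array([1.0, 0.0, 2.0, 1.0])

M = np.column_stack([v1, v2, v3])       # 4x3 matrix with columns v1, v2, v3

# Coefficient of e_i ∧ e_j ∧ e_k  =  minor on rows i, j, k (0-indexed here).
minors = {rows: np.linalg.det(M[list(rows), :])
          for rows in combinations(range(4), 3)}
for rows, m in sorted(minors.items()):
    print(rows, round(m, 6))
```

The four printed values are exactly the four coefficients in the wedge-product expansion above.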
Anyway, we may now give the Hodge dual of $v_1\wedge v_2\wedge v_3$ in terms of the components:
$$ *(v_1\wedge v_2\wedge v_3)\\ =(a_1b_2c_3 + a_2b_3c_1 + a_3b_1c_2 - a_2b_1c_3 - a_1b_3c_2 - a_3b_2c_1)*(e_1\wedge e_2\wedge e_3)\\ +(a_1b_2d_3 + a_2b_3d_1 + a_3b_1d_2 - a_2b_1d_3 - a_1b_3d_2 - a_3b_2d_1)*(e_1\wedge e_2\wedge e_4)\\ +(a_1c_2d_3 + a_2c_3d_1 + a_3c_1d_2 - a_2c_1d_3 - a_1c_3d_2 - a_3c_2d_1)*(e_1\wedge e_3\wedge e_4)\\ +(b_1c_2d_3 + b_2c_3d_1 + b_3c_1d_2 - b_2c_1d_3 - b_1c_3d_2 - b_3c_2d_1)*(e_2\wedge e_3\wedge e_4)\\ =(a_1b_2c_3 + a_2b_3c_1 + a_3b_1c_2 - a_2b_1c_3 - a_1b_3c_2 - a_3b_2c_1)e_4\\ -(a_1b_2d_3 + a_2b_3d_1 + a_3b_1d_2 - a_2b_1d_3 - a_1b_3d_2 - a_3b_2d_1)e_3\\ +(a_1c_2d_3 + a_2c_3d_1 + a_3c_1d_2 - a_2c_1d_3 - a_1c_3d_2 - a_3c_2d_1)e_2\\ -(b_1c_2d_3 + b_2c_3d_1 + b_3c_1d_2 - b_2c_1d_3 - b_1c_3d_2 - b_3c_2d_1)e_1.\\ $$
There is a nice mnemonic for remembering this formula. Just as the vector cross product in $\mathbb{R}^3$ can be written somewhat informally as $$ (a_1e_1+b_1e_2+c_1e_3)\times(a_2e_1+b_2e_2+c_2e_3)=\det \begin{pmatrix} e_1 & e_2 & e_3\\ a_1 & b_1 & c_1\\ a_2 & b_2 & c_2\\ \end{pmatrix}, $$ here we similarly have in $\mathbb{R}^4$,
$$ *(v_1\wedge v_2\wedge v_3)=\det \begin{pmatrix} e_1 & e_2 & e_3 & e_4\\ a_1 & b_1 & c_1 & d_1\\ a_2 & b_2 & c_2 & d_2\\ a_3 & b_3 & c_3 & d_3\\ \end{pmatrix}. $$
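In code, this mnemonic is just a cofactor expansion along the symbolic first row. A NumPy sketch (the function name and example vectors below are my own, hypothetical choices):

```python
import numpy as np

def hodge_complement(v1, v2, v3):
    """*(v1 ∧ v2 ∧ v3): cofactor expansion of the 4x4 determinant
    whose first row is the symbolic basis (e1, e2, e3, e4)."""
    A = np.vstack([v1, v2, v3])              # the three numeric rows
    return np.array([(-1) ** j * np.linalg.det(np.delete(A, j, axis=1))
                     for j in range(4)])

# Hypothetical orthogonal example vectors (my own choice).
v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 1.0])
v3 = np.array([1.0, 0.0, -1.0, 0.0])

v4 = hodge_complement(v1, v2, v3)
print(v4)   # orthogonal to v1, v2, v3
print(all(abs(v4 @ v) < 1e-9 for v in (v1, v2, v3)))   # True
```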
As a sanity check, let's compute $$ v_1\cdot*(v_1\wedge v_2\wedge v_3)=\\ (a_1e_1+b_1e_2+c_1e_3+d_1e_4)\cdot*(v_1\wedge v_2\wedge v_3)=\\ -a_1(b_1c_2d_3 + b_2c_3d_1 + b_3c_1d_2 - b_2c_1d_3 - b_1c_3d_2 - b_3c_2d_1)\\ +b_1(a_1c_2d_3 + a_2c_3d_1 + a_3c_1d_2 - a_2c_1d_3 - a_1c_3d_2 - a_3c_2d_1)\\ -c_1(a_1b_2d_3 + a_2b_3d_1 + a_3b_1d_2 - a_2b_1d_3 - a_1b_3d_2 - a_3b_2d_1)\\ +d_1(a_1b_2c_3 + a_2b_3c_1 + a_3b_1c_2 - a_2b_1c_3 - a_1b_3c_2 - a_3b_2c_1), $$
and hopefully a little staring will convince you that every term cancels, so $v_1$ and $*(v_1\wedge v_2\wedge v_3)$ are orthogonal. The same computation works for $v_2$ and $v_3$.
Better yet, by analogy with the triple product in $\mathbb{R}^3$, we have a formula like
$$(a_4e_1+b_4e_2+c_4e_3+d_4e_4)\cdot*(v_1\wedge v_2\wedge v_3)=\det \begin{pmatrix} a_4 & b_4 & c_4 & d_4\\ a_1 & b_1 & c_1 & d_1\\ a_2 & b_2 & c_2 & d_2\\ a_3 & b_3 & c_3 & d_3\\ \end{pmatrix}. $$
This makes it much more obvious that $v_1\cdot*(v_1\wedge v_2\wedge v_3)$ vanishes: substituting $v_1$ for the fourth vector produces a determinant with two equal rows.
If you just want to prove that such a vector exists, you can invoke the existence of a basis and the fact that any independent set can be extended to one. Extend the given vectors $v_1,v_2,v_3$ (assuming they are independent, not just orthogonal) to a basis, take the fourth vector $u_4$, and subtract off its orthogonal projection to get an orthogonal set:
$$v_4=u_4 - \frac{u_4\cdot v_1}{v_1\cdot v_1}v_1 - \frac{u_4\cdot v_2}{v_2\cdot v_2}v_2 - \frac{u_4\cdot v_3}{v_3\cdot v_3}v_3.$$
But this doesn't give a formula for actually finding that fourth vector. In the analogous problem in $\mathbb{R}^3,$ there is a well-known operation for finding a third orthogonal vector: given two vectors $v_1=a_1e_1+b_1e_2+c_1e_3$ and $v_2=a_2e_1+b_2e_2+c_2e_3$, we may find a third vector orthogonal to both using the vector cross product
$$ v_3=v_1\times v_2=(a_1e_1+b_1e_2+c_1e_3)\times(a_2e_1+b_2e_2+c_2e_3)=\det \begin{pmatrix} e_1 & e_2 & e_3\\ a_1 & b_1 & c_1\\ a_2 & b_2 & c_2\\ \end{pmatrix}\\ =(b_1c_2-b_2c_1)e_1+(a_2c_1-a_1c_2)e_2+(a_1b_2-a_2b_1)e_3. $$
If $v_1$ and $v_2$ are independent, then $v_3$ will be nonzero.
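For what it's worth, NumPy ships this operation as `np.cross`, so the $\mathbb{R}^3$ case is a one-liner (the example vectors are my own, hypothetical choice):

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])

# (b1*c2 - b2*c1, a2*c1 - a1*c2, a1*b2 - a2*b1)
v3 = np.cross(v1, v2)

print(v3 @ v1, v3 @ v2)   # 0.0 0.0
```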
Although there is no vector cross product as a binary product in $\mathbb{R}^n$ for higher $n$, this formula still does generalize. For example in $\mathbb{R}^4$ if we have $v_1=a_1e_1+b_1e_2+c_1e_3+d_1e_4, v_2=a_2e_1+b_2e_2+c_2e_3+d_2e_4, v_3=a_3e_1+b_3e_2+c_3e_3+d_3e_4,$ then we can find a fourth vector $v_4$ which is orthogonal to all three of $v_1,v_2,v_3,$ given by
$$ v_4=\det\begin{pmatrix} e_1 & e_2 & e_3 & e_4\\ a_1 & b_1 & c_1 & d_1\\ a_2 & b_2 & c_2 & d_2\\ a_3 & b_3 & c_3 & d_3\\ \end{pmatrix}\\ =\det\begin{pmatrix}b_1 & c_1 & d_1\\b_2 & c_2 & d_2\\b_3 & c_3 & d_3\end{pmatrix}e_1 -\det\begin{pmatrix}a_1 & c_1 & d_1\\a_2 & c_2 & d_2\\a_3 & c_3 & d_3\end{pmatrix}e_2 +\det\begin{pmatrix}a_1 & b_1 & d_1\\a_2 & b_2 & d_2\\a_3 & b_3 & d_3\end{pmatrix}e_3 -\det\begin{pmatrix}a_1 & b_1 & c_1\\a_2 & b_2 & c_2\\a_3 & b_3 & c_3\end{pmatrix}e_4\\ = (b_1c_2d_3 + b_2c_3d_1 + b_3c_1d_2 - b_2c_1d_3 - b_1c_3d_2 - b_3c_2d_1)e_1\\ -(a_1c_2d_3 + a_2c_3d_1 + a_3c_1d_2 - a_2c_1d_3 - a_1c_3d_2 - a_3c_2d_1)e_2\\ +(a_1b_2d_3 + a_2b_3d_1 + a_3b_1d_2 - a_2b_1d_3 - a_1b_3d_2 - a_3b_2d_1)e_3\\ -(a_1b_2c_3 + a_2b_3c_1 + a_3b_1c_2 - a_2b_1c_3 - a_1b_3c_2 - a_3b_2c_1)e_4. $$
And as before, this will give a $v_4$ that is orthogonal to all three of $v_1,v_2,v_3$. And if they are independent, then $v_4$ will be nonzero.
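The same cofactor expansion works verbatim for $n-1$ vectors in $\mathbb{R}^n$. A sketch of the general routine (the function name and test vectors are hypothetical, my own choices):

```python
import numpy as np

def generalized_cross(*vectors):
    """Given n-1 vectors in R^n, return a vector orthogonal to all of
    them: the cofactor expansion of the determinant whose (symbolic)
    first row is the standard basis e1, ..., en."""
    A = np.vstack(vectors)                   # (n-1) x n matrix
    n = A.shape[1]
    assert A.shape[0] == n - 1, "need exactly n-1 vectors in R^n"
    return np.array([(-1) ** j * np.linalg.det(np.delete(A, j, axis=1))
                     for j in range(n)])

# Hypothetical independent example in R^4 (my own numbers).
v1 = np.array([2.0, 0.0, 0.0, 1.0])
v2 = np.array([0.0, 3.0, 0.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0, 0.0])

v4 = generalized_cross(v1, v2, v3)
print(v4)   # orthogonal to v1, v2, v3
print(all(abs(v4 @ v) < 1e-9 for v in (v1, v2, v3)))   # True
```

If the inputs are independent, at least one minor is nonzero, so the result is nonzero, matching the remark above.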