3

"Let $u_1$, $u_2$ be to vectors in $\mathbb{R}^4$ $$u_1=(1,0,1,1) \text{ and } u_2=(1,1,0,3)$$

Provide a real vector which is orthogonal to both $u_1$ and $u_2$."

So, I kind of guessed a vector $u_3=(1,-1,-1,0)$ which must be orthogonal to both since $$u_1 \cdot u_3 = 0 \text{ and } u_2 \cdot u_3=0$$

My question is: how should it be done if a suitable vector can't immediately be guessed? In $\mathbb{R}^3$ one could just take the cross product of the two vectors, but the cross product isn't defined in other dimensions.

Alex5207
  • 605
  • Pick a random vector. It almost surely won't be orthogonal, and almost surely will be linearly independent. Find its projection onto the previous vectors, then subtract out these projections to obtain an orthogonal vector. This is the Gram-Schmidt process. – anon Dec 07 '17 at 16:13
  • Nonzero binary cross products exist only in three and seven dimensions. See here: https://en.wikipedia.org/wiki/Seven-dimensional_cross_product#Generalizations – Hector Blandin Dec 07 '17 at 16:27
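The Gram-Schmidt suggestion from the first comment can be sketched numerically. This is not part of the original thread; it assumes NumPy, and the starting vector $(1,0,0,0)$ is an arbitrary choice:

```python
import numpy as np

u1 = np.array([1.0, 0.0, 1.0, 1.0])
u2 = np.array([1.0, 1.0, 0.0, 3.0])

# Arbitrary starting vector (almost surely independent of u1 and u2).
u3 = np.array([1.0, 0.0, 0.0, 0.0])

# u1 and u2 are not orthogonal to each other, so first orthogonalize
# u2 against u1, then subtract both projections from u3.
e1 = u1
e2 = u2 - (u2 @ e1) / (e1 @ e1) * e1
v = u3 - (u3 @ e1) / (e1 @ e1) * e1 - (u3 @ e2) / (e2 @ e2) * e2

print(v @ u1, v @ u2)  # both (numerically) zero
```

Here $v$ comes out proportional to $(11, 1, -7, -4)$, another valid answer; the method does not single out one particular orthogonal vector.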

3 Answers

1

You can use the Gauss-Jordan method to solve the linear system $$\langle (x,y,z,t),u_1\rangle=0$$ $$\langle (x,y,z,t),u_2\rangle=0$$ that is, $$ x+z+t=0 $$ $$ x+y+3t=0$$ so the matrix of your system of linear equations is $$ A=\left[ \begin{array}{cccc} 1&0&1&1\\ 1&1&0&3 \end{array} \right] $$ The rref of $A$ is $$ \mathrm{rref}(A) = \left[ \begin{array}{cccc} 1&0&1&1\\ 0&1&-1&2 \end{array} \right] $$ so you have $$x=-z-t$$ $$y=z-2t$$ with $z,t$ taking any real values. Then all vectors $\vec{v}=(x,y,z,t)$ that are orthogonal to both $u_1$ and $u_2$ are of the form $$\vec{v}=(-z-t,z-2t,z,t).$$
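The same null space can be found numerically. This sketch is not from the answer itself; it assumes NumPy and uses the SVD (the last rows of $V^T$ span the null space) as a numerical stand-in for reading the solution off the rref by hand:

```python
import numpy as np

# Rows of A are u1 and u2; the null space of A is exactly the set of
# vectors orthogonal to both rows.
A = np.array([[1.0, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 3.0]])

# For a rank-2 matrix with 4 columns, the last 4 - 2 = 2 right singular
# vectors form an (orthonormal) basis of the null space.
_, _, vt = np.linalg.svd(A)
null_basis = vt[2:]

for v in null_basis:
    print(A @ v)  # each product is (numerically) the zero vector
```

Any linear combination of the two basis vectors is a valid answer, matching the two-parameter family $(-z-t,\,z-2t,\,z,\,t)$ above.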

See the link below to clarify the general cross product:

Cross product in $\mathbb R^n$

  • 1
    Great - I see this works. I guess in parametric form it's equal to $$\begin{bmatrix} x \\ y \\ z \\ t \end{bmatrix} = s \cdot \begin{bmatrix} -1 \\ 1 \\ 1 \\ 0 \end{bmatrix} + t \cdot \begin{bmatrix} -1 \\ -2 \\ 0 \\ 1 \end{bmatrix}$$ where $(s,t)=(z,t)$ - right? However, could you explain exactly why it works to put the vectors in a matrix and rref? I'm not really sure why it provides the answer.. – Alex5207 Dec 07 '17 at 16:36
  • Yes, it is true because $\vec{v}=(x,y,z,t)$ will be a solution to both conditions $\langle \vec{v},u_1\rangle=0$ and $\langle \vec{v},u_2\rangle=0$. Recall that these two conditions are just two linear equations that you can solve using linear-system techniques such as Gauss-Jordan. – Hector Blandin Dec 07 '17 at 16:40
  • You are the man! Thanks very much – Alex5207 Dec 07 '17 at 16:54
  • 2
    @Alex5207 More fundamentally, the null space of a matrix is the orthogonal complement of its row space. Row-reduction doesn’t change the row space, and once you have the rref, you can read a basis for the null space from it. – amd Dec 07 '17 at 20:50
  • Yes, exactly! that's a good explanation. – Hector Blandin Dec 07 '17 at 20:52
1

Take a nonzero vector $u_3$ outside the span of $u_1$ and $u_2$. If it is orthogonal to the other two, you're done. Otherwise, compute the $4$-dimensional cross product of $u_1$, $u_2$, and $u_3$ described here.

0

Since you have only two vectors, you can work in $\mathbb R^3$. Let $u_3 = (a, b, c, 0)$, so that $u_1 \cdot u_3$ and $u_2 \cdot u_3$ only depend on the first three components of $u_1$ and $u_2$. So, call $v_i$ the vector of the first three components of $u_i$, and compute $v_3 = v_1 \times v_2 = (a, b, c)$ (this wouldn't work if $v_1$ and $v_2$ were parallel, since then $v_1 \times v_2 = 0$).

Specifically,

$$u_3 = \begin{pmatrix} \begin{vmatrix}0 & 1 \\ 1 & 0\end{vmatrix}, -\begin{vmatrix}1 & 1 \\ 1 & 0\end{vmatrix}, \begin{vmatrix}1 & 0 \\ 1 & 1\end{vmatrix}, 0 \end{pmatrix} = (-1, 1, 1, 0)$$

This differs from your solution only in sign.
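The same trick in code (not part of the answer; NumPy assumed): cross the first three components, then pad the result with a fourth component of $0$.

```python
import numpy as np

u1 = np.array([1, 0, 1, 1])
u2 = np.array([1, 1, 0, 3])

# Cross product of the first three components only, padded with 0.
v3 = np.cross(u1[:3], u2[:3])  # (-1, 1, 1)
u3 = np.append(v3, 0)          # (-1, 1, 1, 0)

print(u3, u3 @ u1, u3 @ u2)
```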

lisyarus
  • 15,517