
Suppose I have $n-1$ linearly independent vectors in $n$-dimensional space, and I wish to find a vector perpendicular to all of them. How would I go about doing this? In three dimensions, I can just take the cross product of the two given vectors. In two dimensions, I would solve the algebraic equation given by the dot product. These two methods seem entirely different, so I am not sure whether they can somehow be generalised. I could generalise the 2D approach by writing a system of $n-1$ simultaneous equations, namely the dot products of an unknown vector with each of the given vectors, and I could write this in matrix form, but then I don't know what to do with it...

Meep
  • 3,167
  • I apologise, I realise I am just asking how to find the null space of a matrix, which has been answered here: https://math.stackexchange.com/questions/88301/finding-the-basis-of-a-null-space. I would be happy for the question to be closed. – Meep Aug 12 '17 at 14:26

1 Answer


As you've noted, the problem can be reduced to finding the null space of a matrix. But it may not be obvious to everyone why this is so.

Let $v_1,\ldots,v_{n-1},x\in\mathbb{R}^n$ be column vectors, and stack the $v_i$ as the rows of a matrix: $$ A = \begin{bmatrix}v_1^T\\\vdots\\v_{n-1}^T\end{bmatrix} \;\;\;\implies\;\;\; Ax =\begin{bmatrix}v_1^T x\\\vdots\\v_{n-1}^T x\end{bmatrix} =\begin{bmatrix}v_1\cdot x\\\vdots\\v_{n-1}\cdot x\end{bmatrix} $$ So $Ax=\vec{0}$ enforces exactly $v_i\cdot x=0$ for every $i$; finding the null space of $A$ is equivalent to finding the vectors $x$ orthogonal to all of the $v_i$.
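For concreteness, here is a minimal sketch in Python/NumPy (the function name is my own). It uses the standard fact that the right singular vector belonging to the smallest singular value of $A$ spans the null space when the $v_i$ are linearly independent:

```python
import numpy as np

def perpendicular_vector(vs):
    """Given n-1 linearly independent vectors in R^n (as rows of `vs`),
    return a unit vector orthogonal to all of them."""
    A = np.asarray(vs, dtype=float)   # shape (n-1, n)
    # The last row of Vt is the right singular vector for the smallest
    # singular value; it spans null(A) when the rows are independent.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1]                      # already unit length

# Example in R^3: this should recover (a multiple of) the cross product.
v1, v2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
x = perpendicular_vector([v1, v2])
print(x)                                   # -> [0. 0. 1.] up to sign
print(np.allclose([v1 @ x, v2 @ x], 0.0))  # -> True
```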

It may also be worth mentioning the Gram-Schmidt procedure, which is a little more general.

Notice that this gives you an easy way to construct a vector $v_n$ orthogonal to a set of orthonormal vectors $v_1,\ldots,v_{n-1}$. Just choose a random vector $r\in\mathbb{R}^n$ and take: $$ v_n = r - \sum_{k=1}^{n-1} \frac{v_k\cdot r}{v_k\cdot v_k}\,v_k $$ normalizing $v_n$ at the end if you want a unit vector. If $r$ happens to lie in the span of the $v_i$ (so that $v_n=\vec{0}$), just choose a different $r$; for a random $r$, this event has probability zero.
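The same recipe as a short NumPy sketch (the function name and the numerical tolerance are mine): project $r$ off the orthonormal set and keep what remains, retrying in the probability-zero degenerate case.

```python
import numpy as np

def extend_orthonormal(vs):
    """Given orthonormal vectors v_1..v_{n-1} in R^n (rows of `vs`),
    return a unit vector v_n orthogonal to all of them."""
    V = np.asarray(vs, dtype=float)
    rng = np.random.default_rng()
    while True:
        r = rng.standard_normal(V.shape[1])
        # Subtract the projection of r onto each v_k; since the v_k are
        # orthonormal, v_k . v_k = 1 and the whole sum is V.T @ (V @ r).
        v_n = r - V.T @ (V @ r)
        norm = np.linalg.norm(v_n)
        if norm > 1e-12:   # otherwise r was (numerically) in span(v_i): retry
            return v_n / norm

vs = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])
v3 = extend_orthonormal(vs)
print(np.allclose(vs @ v3, 0.0))              # -> True
print(np.isclose(np.linalg.norm(v3), 1.0))    # -> True
```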

user3658307
  • 10,433