
I have $k$ $n$-dimensional vectors written with respect to two frames that are related by a rotation:

$X= \{\vec{x}_1,\vec{x}_2,...,\vec{x}_k\}$

and the same vectors written in the rotated frame:

$X'= \{\vec{x'}_1,\vec{x'}_2,...,\vec{x'}_k\}$

How do I find the rotation matrix that transforms all these vectors? From here I know how to find the rotation matrix from one single vector:

$\vec{x} = R\,\vec{x}'$

but how do I find the single rotation matrix that rotates the full $X'$ set all at once?

EDIT 2: From the answers below I now proceed as follows:

Build a matrix $Y$ whose columns are the unit vectors $\hat{x}_i$ and a second matrix $Y'$ from the unit vectors $\hat{x}'_i$ (both in the same order), with $i=1,...,n$. The rotation matrix can then be written as:

$R = Y (Y')^{-1}$

I have tested the method with the following simple example:

Let's suppose I have two sets of vectors, $u = \{\vec{u}_1,\vec{u}_2,\vec{u}_3,...,\vec{u}_k\}$ and $v=\{\vec{v}_1,\vec{v}_2,\vec{v}_3,...,\vec{v}_k\}$, where I know a priori that $v$ was built from a rotation of $u$ (I used this to simulate the $\vec{v}_i$).

Since we are in 3D I just need three vectors from the full set. Let's suppose the $u$ set has the following vectors:

$\vec{u}_1 = [3,5,2]$

$\vec{u}_2 = [1,2,8]$

$\vec{u}_3 = [4,3,10]$

and the respective associated vectors $\vec{v}_i$:

$\vec{v}_1 = [4.20, 3.63, 2.69]$

$\vec{v}_2 = [6.78, -2.76, 3.93]$

$\vec{v}_3 = [10.39, -1.82, 3.71]$

Then the transformation matrix $R$ is constructed by:

$Y = (\hat{u}_1,\hat{u}_2,\hat{u}_3)$

$Y' = (\hat{v}_1,\hat{v}_2,\hat{v}_3)$

$R = Y(Y')^{-1}$

Now any of the remaining $k-n$ vectors can be mapped through:

$\vec{u}_i = R \vec{v}_i$
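
For concreteness, here is a minimal NumPy sketch of this recipe using the example vectors above; the script is my own illustration of the procedure, not the linked code:

```python
import numpy as np

# Example vectors from the test above; each v_i is a rotated
# copy of u_i, rounded to two decimals.
u = np.array([[3.0, 5.0, 2.0],
              [1.0, 2.0, 8.0],
              [4.0, 3.0, 10.0]])
v = np.array([[4.20, 3.63, 2.69],
              [6.78, -2.76, 3.93],
              [10.39, -1.82, 3.71]])

# Columns of Y are the unit vectors u_i/|u_i|; likewise Y' for the v_i.
Y = (u / np.linalg.norm(u, axis=1, keepdims=True)).T
Yp = (v / np.linalg.norm(v, axis=1, keepdims=True)).T

R = Y @ np.linalg.inv(Yp)  # R = Y (Y')^{-1}

print(np.linalg.det(R))                        # close to 1: numerically a rotation
print(np.allclose(R @ v[0], u[0], atol=0.05))  # True: u_i = R v_i, up to rounding
```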

Miguel
  • This is just a guess, but for the case $k=n$ I have a feeling that doing Gram-Schmidt on each set to turn them into orthogonal matrices and then computing $Q (Q')^{-1}$ would work. Possibly the general case can be done by extending the orthogonalized partial matrices to full orthogonal matrices arbitrarily? – Erick Wong Apr 22 '16 at 22:52
  • Unless you’re in $\mathbb R^2$, it’s misleading to talk about “the” rotation to align two vectors because there are in general an infinite number of them. – amd Apr 22 '16 at 23:40
  • If I understand correctly, you already know which vectors in $X$ are paired with the ones in $X'$. I can think of a couple of ways to proceed. One is iterative: find a rotation for one pair, transform the others, map the next pair while keeping the first pair fixed, and so on. The other way is to choose a linearly-independent subset of $X$, write down the images of these vectors as the columns of the matrix, then change basis back to the standard basis. In both cases you might have an underdetermined system, so you’ll have to fill out the rest of the matrix so that it has unit determinant. – amd Apr 22 '16 at 23:43
  • @amd Unit determinant $\ne$ orthogonal? – Erick Wong Apr 22 '16 at 23:47
  • @ErickWong The OP is looking for a rotation, which will always have a determinant equal to one. There might be more constraints on the free dimensions of the transformation as well, but I haven’t thought very hard about that. Also, we’re given that the two sets of vectors are related by a rotation, so there’s a good chance that simply mapping them onto each other will get us one. – amd Apr 22 '16 at 23:49
  • @amd Ah, that makes sense. In general, $O(n)$ is a Lie group of dimension $n(n-1)/2$ while $SL(n)$ has dimension $n^2-1$. So yes, there should be a substantial number of additional constraints, at least for small values of $k$ relative to $n$. Your point about the inputs being themselves constrained by a rotation is interesting, I will think about that more :). – Erick Wong Apr 22 '16 at 23:56
  • @amd Yes, you understood correctly. I know which vectors are paired together, and I have $k = n$ vectors or even more. I'm trying to find the general rotation matrix $R$, so that when I have a new vector I just have to rotate it with this saved $R$. – Miguel Apr 23 '16 at 00:12

2 Answers


If $X$ has a subset of $n$ linearly independent vectors, then you can completely recover the transformation. Let $B$ be the matrix with these vectors as columns, and $B'$ be the matrix of their images (in the same order, of course). We know that $B'=RB$ for some rotation matrix $R$, therefore $R=B'B^{-1}$. All other vectors in $X$ are linear combinations of these chosen vectors, so $R$ will clearly map them correctly as well.
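
In code this is a one-liner; a minimal NumPy sketch (the helper name `recover_transform` is illustrative):

```python
import numpy as np

def recover_transform(B, Bp):
    """B: n x n matrix whose columns are n linearly independent vectors of X.
    Bp: their images under the rotation, as columns in the same order.
    Returns R with Bp = R @ B. Assumes B is invertible (raises LinAlgError
    otherwise)."""
    return Bp @ np.linalg.inv(B)
```

For the direction the question asks about, mapping $X'$ back to $X$, swap the two arguments (see the update below).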

If you have fewer than $n$ linearly independent vectors, then it might not be possible to recover the entire original transformation, but you may not need to. If all of the vectors to be mapped are in the span of $X$, then you can arbitrarily extend the two sets of vectors to full bases for the space and proceed as above. You most likely won’t end up with a rotation matrix, but the resulting transformation will agree with the original on this subspace. Otherwise, you’ll have to use other information about the original rotation to fill in the missing pieces.
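
One possible sketch of that extension in NumPy: complete each set of columns to a basis using left singular vectors. This completion is one arbitrary choice among many, and `extend_to_basis` is an illustrative name:

```python
import numpy as np

def extend_to_basis(V):
    """V: n x r matrix with r < n linearly independent columns.
    Appends n - r orthonormal columns spanning the orthogonal
    complement of col(V), taken from the full SVD of V."""
    n, r = V.shape
    U, _, _ = np.linalg.svd(V)       # U is n x n orthogonal
    return np.hstack([V, U[:, r:]])

# With B = extend_to_basis(sources) and Bp = extend_to_basis(images),
# T = Bp @ np.linalg.inv(B) agrees with the original rotation on the
# span of X, though T itself is generally not a rotation.
```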

Update: Misread the question; it looks like you wanted the transformation that maps $X'$ back to $X$. That's obviously $B(B')^{-1}$ instead.

amd

I think the following very minor variant of Gram-Schmidt should work reasonably well:

  1. Remove the largest-magnitude vector in $X$, say $x_i$. Remove the corresponding vector $x'_i$ from $X'$.
  2. Normalize $x_i$ to unit length and add it to the set $Y$. Do the same to $x'_i$ and $Y'$.
  3. Subtract from every remaining vector in $X$ its projection onto $x_i$, so that they are all orthogonal to $x_i$. Do the same for $x'_i$ and $X'$.
  4. Repeat 1–3 until $Y$ and $Y'$ each have a full set of $n$ vectors.
  5. Write down $Y$ and $Y'$ as matrices of column vectors (naturally, in the same order as you added them). Now $R = Y(Y')^{-1}$ is a rotation matrix which maps $Y'$ to $Y$, and by rigidity it should also map the rest of $X'$ to $X$.

In Step 1, we don't really need to choose the largest-magnitude vector, just a non-zero vector. I figured choosing the longest vector would be less sensitive to rounding errors.
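
Here is a minimal NumPy sketch of Steps 1–5 (the function name `rotation_from_pairs` is illustrative, and it assumes $X$ contains $n$ linearly independent vectors):

```python
import numpy as np

def rotation_from_pairs(X, Xp):
    """X, Xp: k paired n-vectors (as rows), related by a rotation.
    Returns R mapping each x'_i back to x_i, via the modified
    Gram-Schmidt steps above. Assumes X spans R^n."""
    X = np.array(X, dtype=float)
    Xp = np.array(Xp, dtype=float)
    n = X.shape[1]
    Y, Yp = [], []
    for _ in range(n):
        # Step 1: pick the largest-magnitude remaining vector.
        i = np.argmax(np.linalg.norm(X, axis=1))
        # Step 2: normalize it and its partner, and collect them.
        Y.append(X[i] / np.linalg.norm(X[i]))
        Yp.append(Xp[i] / np.linalg.norm(Xp[i]))
        # Step 3: subtract the projections onto the chosen directions
        # (the chosen pair itself deflates to zero, so it is never re-picked).
        X -= np.outer(X @ Y[-1], Y[-1])
        Xp -= np.outer(Xp @ Yp[-1], Yp[-1])
    # Step 5: matrices of column vectors, in insertion order.
    Y, Yp = np.array(Y).T, np.array(Yp).T
    return Y @ Yp.T  # (Y')^{-1} = (Y')^T because Y' is orthogonal
```

Note that the last line uses a transpose rather than an explicit matrix inverse, since $Y'$ has orthonormal columns by construction (a point raised in the comments below).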

Erick Wong
  • Ok, thanks @ErickWong! I'm going to try your solution and then report back. – Miguel Apr 23 '16 at 17:41
  • Please have a look at my edit. I tried your recipe but it wasn't giving the correct results. I then tried to use the unit vectors only, and it worked. Any idea why? – Miguel Apr 24 '16 at 13:07
  • @Miguel I can't understand what you are describing in your edit; can you clarify? What did you do before and after? Why did you link to that Python code? It doesn't seem at all relevant to this procedure: it's only for $\mathbb R^3$, and we aren't trying to rotate around any particular axis! – Erick Wong Apr 24 '16 at 15:47
  • Ok, I'm going to clarify. I used 3D just as an example to test the method. For this simple case (3D) I can easily simulate a set of rotated vectors, and then check whether the process of finding the transformation matrix is working correctly. – Miguel Apr 24 '16 at 19:43
  • Seems like a lot of work when you can instead invert the matrix of source vectors directly and multiply that by the matrix of images. Both methods involve inverting a matrix, unfortunately, so both suffer from the associated numerical instability. For instance, in the OP’s test case, the determinant and real eigenvalue of the derived rotation matrix aren’t quite equal to 1. On the other hand, an orthonormal basis might be a better starting point if there aren’t enough linearly independent vectors in $X$ to fully recover the transformation. – amd Apr 24 '16 at 20:01
  • @amd True, I'm not convinced that this greedy method picks out a robustly independent set of vectors, more so than alternatives such as building a pseudoinverse using the entire data set. My main motivation is that this method at least gives some rotation matrix even if the data is noisy. Note that it doesn't incur the specific instability of inverting a matrix, since $(Y')^{-1}$ is just the transpose of $Y'$. – Erick Wong Apr 25 '16 at 17:51
  • Good point about the transpose. – amd Apr 25 '16 at 18:00