
This answer proves that distance and rotation can be uniquely defined in two dimensions in such a way that they satisfy the first 5 intuitive properties stated in the question it answers, as well as properties 6 and 7. I first define an origin rotation in two dimensions to be a mapping that, for some real numbers $x$ and $y$ with the distance from $(0, 0)$ to $(x, y)$ equal to 1, assigns to each point $(z, w)$ in $\mathbb{R}^2$ the point $(xz - yw, xw + yz)$. I then define a rotation to be a transformation obtained by applying a translation, then an origin rotation, then the inverse of that translation. I now have a similar question about whether origin rotation can be well defined in three dimensions. I define an elementary rotation in $\mathbb{R}^3$ to be any origin rotation in two of the three coordinates. My first question is
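As a quick illustration (a Python sketch of my own; the function name is made up), the origin rotation determined by a unit vector $(x, y)$ can be coded directly from the definition:

```python
def origin_rotation(x, y):
    """Origin rotation determined by the unit vector (x, y).

    Requires x**2 + y**2 == 1; sends each point (z, w) to
    (x*z - y*w, x*w + y*z), i.e. complex multiplication by x + y*i.
    """
    assert abs(x * x + y * y - 1) < 1e-12
    def rotate(z, w):
        return (x * z - y * w, x * w + y * z)
    return rotate

# Example: the unit vector (0, 1) gives a quarter turn.
rot90 = origin_rotation(0.0, 1.0)
print(rot90(1.0, 0.0))  # (0.0, 1.0)
```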

Is there a unique set of linear transformations of $\mathbb{R}^3$ satisfying the following properties?

  • That set with the operation of composition is a group

  • All elementary rotations belong to that set

  • For any member of that set, if it moves the point $(1, 0, 0)$ to $(x, 0, 0)$ for some nonnegative real number $x$, it is necessarily an elementary rotation

My second question is

If the answer to my first question is yes, then do all members of that set also preserve the norm? I define the norm to be the sum of the squares of the coordinates.

I could ask a third question about whether the determinant of the matrix representation of any such transformation is always 1, since that would show that any rotation of the unit cube preserves volume, but I won't bother.

Timothy
  • @Rahul I think that origin rotations do. It's just rotations in general that don't. Not all rotations are origin rotations. – Timothy Jan 09 '20 at 03:38
  • My question and answer each had a score of 1 and now they have a score of 0. Some of you may be thinking, Duh. I think this is a real question. People are just assuming the answer to the first question is yes. I think it turns out that that statement can be proven in a more rigorous way without assuming it. I myself like to have an understanding of different ways of thinking and of how to prove it that way like I did, so other people might feel the same way. – Timothy Jan 19 '20 at 19:55

1 Answer


The answer to both of those questions is yes, and I will prove it using quaternions. The quaternions are an extension of the complex numbers with additional units $j$ and $k$. Addition is defined by adding each component separately. Multiplication of the units is defined according to the following multiplication table, where the row gives the first operand and the column gives the second operand. For example, since $i \times j = k$, $k$ goes in the square in the row for $i$ and the column for $j$.

\begin{array}{|c|c|c|c|c|} \hline \times & 1 & i & j & k \\ \hline 1 & 1 & i & j & k\\ \hline i & i & -1 & k & -j\\ \hline j & j & -k & -1 & i\\ \hline k & k & j & -i & -1\\ \hline \end{array}

It's not hard to see that the table has a simple cyclic pattern: $ij = k$, $jk = i$, $ki = j$, and swapping the order of two distinct units negates the product.
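The whole multiplication can be encoded from this table. Here is a minimal sketch (my own representation, assuming quaternions are written as `(real, i, j, k)` tuples; `qmul` is a made-up name):

```python
def qmul(p, q):
    """Hamilton product of quaternions given as (real, i, j, k) tuples.

    Expands (a + bi + cj + dk)(e + fi + gj + hk) using the unit table:
    i*i = j*j = k*k = -1, i*j = k, j*k = i, k*i = j, with order swaps negated.
    """
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

# Two entries of the table: i * j = k but j * i = -k.
i, j = (0, 1, 0, 0), (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1), i.e. k
print(qmul(j, i))  # (0, 0, 0, -1), i.e. -k
```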

The products of all remaining pairs of quaternions are defined so that multiplication is left and right distributive over addition. It can be shown that quaternion multiplication is associative. I was never taught specifically which operations matrices represent, only how to multiply them, so I will state the definition I use; if it is not the conventional one, those whose job it is to represent transformations with matrices should use the conventional representation and not mine. Take the transformation that, for some real numbers $a$, $b$, $c$, $d$, moves each point $(x, y)$ in $\mathbb{R}^2$ to the point $(ax + by, cx + dy)$. I define its matrix representation to be ${\begin{bmatrix} a & b \\ c & d \end{bmatrix}}$, and similarly in higher dimensions. It can then be shown that the product of any two matrices that can be multiplied in that order represents the composition of the transformations they represent in that order. Since composition is always associative, matrix multiplication is also associative. It can also be shown that ${\begin{bmatrix} a & b \\ c & d \end{bmatrix}}{\begin{bmatrix} x \\ y \end{bmatrix}} = {\begin{bmatrix} ax + by \\ cx + dy \end{bmatrix}}$. My linear algebra textbook showed some examples of left-multiplying a column matrix by a square matrix, and I'm guessing the intent was that matrices represent linear transformations exactly as I defined them.
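One can check this convention numerically. A sketch under my own naming (`apply` and `matmul` are illustrative helpers, not a standard API):

```python
def apply(M, v):
    """Apply the 2x2 matrix ((a, b), (c, d)) to (x, y): (ax+by, cx+dy)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

def matmul(M, N):
    """Product of 2x2 matrices, row-by-column."""
    return tuple(tuple(sum(M[r][k] * N[k][c] for k in range(2))
                       for c in range(2)) for r in range(2))

# The product matrix represents the composition "apply N, then apply M".
M = ((1, 2), (3, 4))
N = ((0, -1), (1, 0))
v = (5, 7)
print(apply(matmul(M, N), v) == apply(M, apply(N, v)))  # True
```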

I'll define the conjugate of any quaternion $x$, denoted $\overline{x}$, to be the quaternion obtained by negating every component except the real component. I'll define the norm of any quaternion to be the product of that quaternion and its conjugate, and denote the norm of $x$ by $\langle x \rangle$. It can be shown that the norm is the sum of the squares of the components, and that the conjugate of a product is the product of the conjugates in the opposite order: $\overline{xy} = \overline{y}\,\overline{x}$. From this, the norm of a product is the product of the norms: for any quaternions $x$ and $y$, $\langle xy \rangle = xy\overline{xy} = xy\overline{y}\,\overline{x} = x\langle y \rangle\overline{x} = x\overline{x}\langle y \rangle = \langle x \rangle\langle y \rangle$, where $\langle y \rangle$ commutes with $\overline{x}$ because it is real. Now let's represent any quaternion $x$ by the operation that left-multiplies by $x$ and then right-multiplies by its conjugate. It is straightforward to show that the representation of a product is the composition of the representations in that order: for any quaternions $x$, $y$ and $z$, $x(yz\overline{y})\overline{x} = (xy)z\overline{y}\,\overline{x} = (xy)z\overline{xy}$.
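These identities can be sanity-checked numerically. A self-contained sketch (again with made-up names, quaternions as `(real, i, j, k)` tuples):

```python
def qmul(p, q):
    # Hamilton product of quaternions given as (real, i, j, k) tuples.
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

def conj(q):
    # Negate every component except the real one.
    a, b, c, d = q
    return (a, -b, -c, -d)

def norm(q):
    # Real part of q * conj(q); the i, j, k components cancel to 0.
    return qmul(q, conj(q))[0]

x, y = (1, 2, 3, 4), (5, -1, 0, 2)
assert norm(x) == sum(t * t for t in x)            # sum of squares
assert norm(qmul(x, y)) == norm(x) * norm(y)       # <xy> = <x><y>
assert conj(qmul(x, y)) == qmul(conj(y), conj(x))  # conjugate reverses order
```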

Now take the matrix representation of the representation of any quaternion. It can be shown that the entry in the top left corner is the first component of the norm of that quaternion, and that every other entry in the first row or first column is 0. If you delete the first row and the first column, you get the matrix representation of a 3-dimensional linear transformation. Now restrict to the quaternions whose norm is 1, apply the same process, and take the set of all resulting linear transformations on $\mathbb{R}^3$. That set indeed satisfies the three properties stated in the question, and it is unique. Each of those transformations also preserves the norm on $\mathbb{R}^3$.
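A sketch of that construction (my own code; `rotation_matrix` is a made-up name, and the quaternion below is assumed to have norm 1):

```python
import math

def qmul(p, q):
    # Hamilton product of quaternions given as (real, i, j, k) tuples.
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

def conj(q):
    a, b, c, d = q
    return (a, -b, -c, -d)

def rotation_matrix(q):
    # 3x3 matrix of v -> q v conj(q) on the pure quaternions (0, x, y, z),
    # i.e. the 4x4 representation with its first row and column deleted.
    # Assumes q has norm 1.
    cols = [qmul(qmul(q, e), conj(q))[1:]
            for e in ((0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1))]
    # The image of each basis vector is a column of the matrix.
    return [[cols[c][r] for c in range(3)] for r in range(3)]

# cos(45°) + sin(45°) i has norm 1 and gives a quarter turn about the i-axis.
q = (math.cos(math.pi / 4), math.sin(math.pi / 4), 0.0, 0.0)
M = rotation_matrix(q)
v = (0.0, 1.0, 0.0)
w = tuple(sum(M[r][c] * v[c] for c in range(3)) for r in range(3))
# w is approximately (0, 0, 1), and the norm x^2 + y^2 + z^2 is preserved.
```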

It turns out that if you take all operations on the quaternions obtained by applying a left multiplication by a quaternion of norm 1 and then a right multiplication by a quaternion of norm 1, you get a set of operations satisfying the same three properties in four dimensions, except that the third property is replaced by the property that such an operation is necessarily a 3-dimensional rotation in three of the coordinates.
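That these 4-dimensional operations preserve the norm follows from $\langle pvq \rangle = \langle p \rangle\langle v \rangle\langle q \rangle = \langle v \rangle$ when $\langle p \rangle = \langle q \rangle = 1$, and can be checked numerically (a sketch with made-up names):

```python
import math

def qmul(p, q):
    # Hamilton product of quaternions given as (real, i, j, k) tuples.
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

def unit(q):
    # Scale a nonzero quaternion to norm 1.
    n = math.sqrt(sum(t * t for t in q))
    return tuple(t / n for t in q)

# The 4D operation v -> p v q with unit p and q.
p, q = unit((1.0, 2.0, 3.0, 4.0)), unit((2.0, 0.0, -1.0, 1.0))
v = (3.0, -1.0, 4.0, 1.0)
image = qmul(qmul(p, v), q)
# The sums of squares of v and of image agree (both 27), up to rounding.
```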

I once stumbled on a web page that said some of the stuff I said here. However, I probably would have thought of this independently anyway.

Timothy