4

Background

I've been working on Exercise 1.1 in the book "An Introduction to Finite Tight Frames", which I paraphrase as follows: Let $u_1, u_2, u_3$ be any set of equally spaced unit vectors in $\mathbb{R}^2$, so that for the counterclockwise rotation matrix $R$ through the angle $2\pi/3$ we have $u_2 = R u_1$ and $u_3 = R^2 u_1$. Let $f$ be any vector in $\mathbb{R}^2$. Show that: \begin{align*} f = \frac{2}{3} \left( u_1 \langle u_1, f \rangle + u_2 \langle u_2, f \rangle + u_3 \langle u_3, f \rangle \right) \end{align*} The intuition is that the sum of the projections of $f$ onto three equally spaced unit vectors returns the original vector, scaled by $3/2$. The approach given in the solutions, which makes sense to me, is to pick some particular $\{u_1, u_2, u_3\}$, form $V = [u_1, u_2, u_3]$, and then show that for these particular $u_i$ we have $V V^* = \frac{3}{2} I$. The result then follows by noting that any rotated version $TV$ of these vectors (where $T$ is a rotation matrix) will also satisfy this equation, since $(TV)(TV)^* = TVV^*T^* = T \frac{3}{2}I T^* = \frac{3}{2}I$.
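The computation in the solutions can be spot-checked numerically; here is a quick NumPy sketch (the choice $u_1 = (1,0)^T$ is an arbitrary concrete instance, not the book's):

```python
import numpy as np

# Counterclockwise rotation by 2*pi/3.
theta = 2 * np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u1 = np.array([1.0, 0.0])                  # one particular unit vector
V = np.column_stack([u1, R @ u1, R @ R @ u1])

# V V^* should equal (3/2) I.
assert np.allclose(V @ V.T, 1.5 * np.eye(2))

# Consequently (2/3) * sum u_i <u_i, f> recovers any f.
f = np.array([0.3, -1.7])
g = (2 / 3) * V @ (V.T @ f)
assert np.allclose(g, f)
```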

Do we need to pick coordinates?

However, I ended up picking coordinates to calculate $V V^*$ for a particular $\{u_1, u_2, u_3\}$. I was hoping there would be a coordinate-free way to solve this problem. Letting $u_2 = Ru_1$, $u_3 = R^2 u_1$ and $V = [u_1, u_2, u_3]$, can we show that $V V^* = \frac{3}{2} I$ in a coordinate-free way?

An attempt at solution

We can write $V V^*$ as: \begin{align} V V^* &= u_1 u_1^* + u_2 u_2^* + u_3 u_3^*\\ &= u_1 u_1^* + Ru_1 (Ru_1)^* + R^2 u_1 (R^2 u_1)^*\\ &= u_1 u_1^* + Ru_1 u_1^* R^{-1} + R^2 u_1 u_1^* (R^2)^{-1} \end{align} (here we have used the fact that $R$ is an orthogonal matrix, so $R^* = R^{-1}$). I wasn't really sure where to go from here. It might be worth noting that if $\{I = R^0, R, R^2\}$ is the rotation group with three elements, and $\gamma_a$ denotes conjugation by $a$, then we have: \begin{align} V V^* &= \gamma_{R^0}(u_1 u_1^*) + \gamma_{R^1}(u_1 u_1^*) + \gamma_{R^2}(u_1 u_1^*)\\ &= (\gamma_{R^0} + \gamma_{R^1} + \gamma_{R^2})(u_1 u_1^*) \\ &= (\gamma_R^0 + \gamma_R^1 + \gamma_R^2)(u_1 u_1^*) \end{align} where $u_1$ is an arbitrary unit vector. However, while this looks neat, I'm not sure how to simplify from here.
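At least the conjugation-sum identity itself can be checked numerically for a random unit vector $u_1$, which supports the hope that no choice of coordinates for $u_1$ matters; a NumPy sketch:

```python
import numpy as np

theta = 2 * np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(0)
u = rng.normal(size=2)
u /= np.linalg.norm(u)             # an arbitrary unit vector u_1
P = np.outer(u, u)                 # the rank-one matrix u_1 u_1^*

# (gamma_{R^0} + gamma_{R^1} + gamma_{R^2})(u_1 u_1^*) = (3/2) I
S = sum(np.linalg.matrix_power(R, k) @ P @ np.linalg.matrix_power(R, k).T
        for k in range(3))
assert np.allclose(S, 1.5 * np.eye(2))
```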

Any thoughts appreciated.

  • More generally, if $u_1,u_2,\ldots,u_{n+1}$ are unit vectors forming a regular $n$-simplex (so that $u_1+u_2+\cdots+u_{n+1}=0$), then for any $f\in\mathbb R^n$, $$u_1(u_1\cdot f)+u_2(u_2\cdot f)+\cdots+u_{n+1}(u_{n+1}\cdot f)=\frac{n+1}{n}f,$$ which can be verified by evaluating it on the basis $\{u_1,u_2,\dots,u_n\}$ and noting that $u_i\cdot u_j=-\frac1n$ for $i\neq j$. – mr_e_man Feb 14 '20 at 00:36
  • And $u_i\cdot u_j=-\frac1n$ because, for example, $$u_1\cdot(u_1+u_2+\cdots+u_{n+1})=u_1\cdot0=0;$$ the first term is $u_1\cdot u_1=1$, and, by symmetry, all the other terms $u_1\cdot u_2=u_1\cdot u_3=\cdots=u_1\cdot u_{n+1}$ are the same, so this is $$1+n(u_1\cdot u_2)=0.$$ – mr_e_man Feb 14 '20 at 00:45
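The general simplex claim in these comments can also be spot-checked numerically. One standard construction (an assumption of this sketch, not part of the comments) realizes the simplex vectors as the centered standard basis of $\mathbb{R}^{n+1}$, working inside the hyperplane of zero-sum vectors:

```python
import numpy as np

def simplex_frame(n):
    """Unit vectors u_1..u_{n+1} of a regular n-simplex, embedded in the
    zero-sum hyperplane of R^{n+1} (an n-dimensional subspace)."""
    E = np.eye(n + 1)
    c = np.full(n + 1, 1 / (n + 1))        # centroid of the basis vectors
    return (E - c) / np.sqrt(n / (n + 1))  # rows are the unit vectors u_i

n = 4
U = simplex_frame(n)

# Norms are 1, pairwise inner products are -1/n, and the vectors sum to 0.
G = U @ U.T
assert np.allclose(np.diag(G), 1.0)
off = -1 / n * (np.ones((n + 1, n + 1)) - np.eye(n + 1))
assert np.allclose(G - np.diag(np.diag(G)), off)
assert np.allclose(U.sum(axis=0), 0.0)

# Frame identity: sum u_i (u_i . f) = ((n+1)/n) f for f in the hyperplane.
rng = np.random.default_rng(1)
x = rng.normal(size=n + 1)
f = x - x.mean()                           # project into the zero-sum hyperplane
recon = U.T @ (U @ f)
assert np.allclose(recon, (n + 1) / n * f)
```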

2 Answers

4

The "rotation for an angle $2\pi/3$ on $\mathbb R^2$" is usually defined using coordinates. If you want a coordinate-free proof for the mentioned statement, you must first clarify what a "rotation" on a $2$-dimensional real inner product space means. Depending on the definition, the proof will vary in difficulty. In the sequel, I suppose that $R$ is an orthogonal linear map (i.e. the inverse of $R$ is the adjoint of $R$ with respect to the given inner product) such that $R^2+R+I=0$.

Let us write $u,v,w$ for $u_1,u_2$ and $u_3$. The vector $u$ by definition is a unit vector. As $R$ is orthogonal, $v=Ru$ and $w=R^2u$ are also unit vectors. Since $R^2+R+I=0$, we have $u+v+w=0$. Thus \begin{aligned} \langle u,w\rangle+\langle v,w\rangle&=\langle u+v,w\rangle=\langle -w,w\rangle=-1,\\ \langle v,u\rangle+\langle w,u\rangle&=\langle v+w,u\rangle=\langle -u,u\rangle=-1,\\ \langle w,v\rangle+\langle u,v\rangle&=\langle w+u,v\rangle=\langle -v,v\rangle=-1.\\ \end{aligned} Therefore $\langle u,v\rangle=\langle v,w\rangle=\langle w,u\rangle=-\frac12$. Now let $g(f)=\frac23\left(\langle f,u\rangle u+\langle f,v\rangle v+\langle f,w\rangle w\right)$. Then $$ g(u)=\frac23\left(u-\frac12v-\frac12w\right) =\frac23\left(\frac32u-\frac{u+v+w}{2}\right)=u $$ and similarly, $g(v)=v$. Moreover, since $x^2+x+1$ does not split over $\mathbb R$, the linear map $R$ has no real eigenvalues. It follows that $au+bv=(aI+bR)u\ne0$ whenever $(a,b)\ne(0,0)$. Hence $\{u,v\}$ is a basis of $\mathbb R^2$ and $g(f)=f$ on this basis. Therefore $g(f)=f$ on the whole vector space.
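The steps of this argument can be traced numerically; a NumPy sketch, instantiating $R$ as the usual $2\pi/3$ rotation matrix (which indeed satisfies $R^2+R+I=0$):

```python
import numpy as np

theta = 2 * np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(R @ R + R + np.eye(2), 0)   # R^2 + R + I = 0
assert np.allclose(R.T @ R, np.eye(2))         # R is orthogonal

rng = np.random.default_rng(2)
u = rng.normal(size=2)
u /= np.linalg.norm(u)                         # arbitrary unit vector u
v, w = R @ u, R @ R @ u
assert np.allclose(u + v + w, 0)               # since (R^2 + R + I) u = 0

# Pairwise inner products are -1/2.
for a, b in [(u, v), (v, w), (w, u)]:
    assert np.isclose(a @ b, -0.5)

def g(f):
    return (2 / 3) * sum(x * (f @ x) for x in (u, v, w))

f = rng.normal(size=2)
assert np.allclose(g(f), f)                    # g is the identity map
```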

user1551
  • 139,064
2

In terms of complex numbers, the inner product corresponds to

$$\langle\vec a,\vec b\rangle\leftrightarrow\Re(\overline ab)=\frac{\overline ab+a\overline b}{2}$$

so your equation becomes

$$\frac32f\overset?=u_1\frac{\overline u_1f+u_1\overline f}{2}+u_2\frac{\overline u_2f+u_2\overline f}{2}+u_3\frac{\overline u_3f+u_3\overline f}{2}$$

$$=\frac12\Big(u_1\overline u_1+u_2\overline u_2+u_3\overline u_3\Big)f+\frac12\Big(u_1\!^2+u_2\!^2+u_3\!^2\Big)\overline f$$

$$=\frac12\Big(|u_1|^2+|u_2|^2+|u_3|^2\Big)f+\frac12\Big(1+R^2+R^4\Big)u_1\!^2\overline f$$

$$=\frac12\Big(1+1+1\Big)f+\frac12\Big(0\Big)\overline f$$

$$=\frac32f$$

since $R^2=R^{-1}$ is a third root of unity, and any $n$th root of unity $\zeta_n\neq1$ satisfies

$$1+\zeta_n+\zeta_n\!^2+\zeta_n\!^3+\cdots+\zeta_n\!^{n-1}=0$$

which can be seen by factoring

$$1-\zeta_n\!^n=(1-\zeta_n)(1+\zeta_n+\zeta_n\!^2+\zeta_n\!^3+\cdots+\zeta_n\!^{n-1}).$$
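This complex-number computation can be mirrored directly in Python's `cmath`, with $R=e^{2\pi i/3}$, $u_2=Ru_1$ and $u_3=R^2u_1$ (the particular values of $u_1$ and $f$ below are arbitrary):

```python
import cmath

R = cmath.exp(2j * cmath.pi / 3)        # primitive 3rd root of unity
assert abs(1 + R + R**2) < 1e-12        # 1 + zeta + zeta^2 = 0
assert abs(R**2 - 1 / R) < 1e-12        # R^2 = R^{-1}

u1 = cmath.exp(0.7j)                    # an arbitrary unit complex number
u2, u3 = R * u1, R**2 * u1
f = 1.3 - 0.4j                          # an arbitrary "vector"

# <a, b> corresponds to Re(conj(a) * b); sum the three projections.
s = sum(u * (u.conjugate() * f).real for u in (u1, u2, u3))
assert abs(s - 1.5 * f) < 1e-12         # equals (3/2) f
```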


In terms of geometric algebra, the inner product is

$$\langle a,b\rangle=\frac{ab+ba}{2}$$

so your equation becomes

$$\frac32f\overset?=u_1\frac{u_1f+fu_1}{2}+u_2\frac{u_2f+fu_2}{2}+u_3\frac{u_3f+fu_3}{2}$$

$$=\frac12\Big(u_1u_1+u_2u_2+u_3u_3\Big)f+\frac12\Big(u_1fu_1+u_2fu_2+u_3fu_3\Big)$$

$$=\frac12\Big(\lVert u_1\rVert^2+\lVert u_2\rVert^2+\lVert u_3\rVert^2\Big)f+\frac12\Big(u_1fu_1+u_1Rfu_1R+u_1R^2fu_1R^2\Big),$$

where I've used $u_2=u_1R$ and $u_3=u_1R^2$; the product of vectors $fu_1$ is a "complex number" (a scalar plus a bivector), so it commutes with the "complex number" $R$:

$$=\frac12\Big(\lVert u_1\rVert^2+\lVert u_2\rVert^2+\lVert u_3\rVert^2\Big)f+\frac12\Big(u_1fu_1+u_1fu_1R^2+u_1fu_1R^4\Big)$$

$$=\frac12\Big(1+1+1\Big)f+\frac12u_1fu_1\Big(1+R^2+R^4\Big)$$

$$=\frac32f+\frac12u_1fu_1\Big(0\Big)$$

by the same reasoning as before.
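This geometric-algebra computation can also be spot-checked numerically. An assumption of this sketch (not part of the answer itself): in 2D, the basis vectors $e_1,e_2$ admit a faithful representation by real $2\times2$ matrices with $e_1^2=e_2^2=I$ and $e_1e_2=-e_2e_1$, under which the geometric product becomes the matrix product:

```python
import numpy as np

# Matrix representation of the 2D geometric algebra's generators.
e1 = np.array([[1.0, 0.0], [0.0, -1.0]])
e2 = np.array([[0.0, 1.0], [1.0, 0.0]])
assert np.allclose(e1 @ e1, np.eye(2)) and np.allclose(e2 @ e2, np.eye(2))
assert np.allclose(e1 @ e2, -(e2 @ e1))

def vec(x, y):
    return x * e1 + y * e2

theta = 2 * np.pi / 3
I2 = e1 @ e2                                           # unit bivector, I2^2 = -I
Rot = np.cos(theta) * np.eye(2) + np.sin(theta) * I2   # "complex number" R

u1 = vec(np.cos(0.3), np.sin(0.3))            # an arbitrary unit vector
u2, u3 = u1 @ Rot, u1 @ Rot @ Rot             # u2 = u1 R, u3 = u1 R^2
f = vec(1.3, -0.4)

# sum u_k <u_k, f> with <a,b> = (ab + ba)/2 should give (3/2) f.
s = sum(u @ (u @ f + f @ u) / 2 for u in (u1, u2, u3))
assert np.allclose(s, 1.5 * f)
```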

mr_e_man
  • 5,364
  • How is this a coordinate-free proof? E.g. how do you define the product of two complex numbers without using their real and imaginary parts? – user1551 Jan 30 '20 at 20:52
  • user1551 makes a good point. How can we associate a complex number with each vector without making a selection of a coordinate system? That is, picking a real and imaginary part for each vector seems like picking a coordinate system. – David Egolf Jan 30 '20 at 21:02
  • @DavidEgolf - We don't need to identify vectors with complex numbers; see my answer. But eventually we have to refer to coordinates or a basis; otherwise we can't even define "dimension", while your question is about a $2$-dimensional space. – mr_e_man Jan 30 '20 at 23:20
  • @mr_e_man I think we can say that there exists a basis with two elements for the space (so the dimension of the space is defined), without actually committing to expressing vectors in a specific basis, so we can define the dimension without actually ever imposing coordinates. – David Egolf Jan 30 '20 at 23:42