
Show (in Cartesian coordinates) that

$\vec{r} \times (\vec{\omega}\times \vec{r})=r^2\vec{\omega}-(\vec{\omega}\cdot\vec{r})\vec{r} $

I am not really sure how to calculate this. Do I just assume that it's a 3D problem so each vector just has 3 components? What components does the angular-velocity vector have? Is it just $\omega_1, \omega_2, \omega_3$?

Thanks in advance

John Hughes
  • 93,729
qmd
  • 4,275

2 Answers

1

Yes, you assume it has three components. And do the same for $r$, and then compute away.
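If you'd like a mechanical check of that componentwise computation, here is a minimal sympy sketch (the variable names are mine, not part of the problem):

```python
import sympy as sp

# Components of omega (w1, w2, w3) and r (r1, r2, r3) as free real symbols.
w1, w2, w3, r1, r2, r3 = sp.symbols('w1 w2 w3 r1 r2 r3', real=True)
w = sp.Matrix([w1, w2, w3])
r = sp.Matrix([r1, r2, r3])

lhs = r.cross(w.cross(r))          # r x (omega x r)
rhs = r.dot(r) * w - w.dot(r) * r  # r^2 omega - (omega . r) r

print(sp.simplify(lhs - rhs))      # Matrix([[0], [0], [0]])
```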

Alternatively, you can draw a few pictures and realize that $r', (\omega \times r)',$ and $(\omega \times r)' \times r'$ form an orthonormal basis in which this statement becomes particularly simple, where primes denote unit vectors. (This doesn't handle the case $\omega \times r = 0$, but that one's easy.)
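One way to make that picture concrete (a sketch, assuming $\omega \times r \neq 0$): split $\omega$ into its components along and perpendicular to $r$, $$ \omega = \underbrace{\frac{\omega\cdot r}{r^2}\,r}_{\omega_\parallel} + \omega_\perp, \qquad\text{so}\qquad r^2\omega - (\omega\cdot r)\,r = r^2\omega_\perp. $$ Since $\omega_\parallel \times r = 0$, the left-hand side is $r\times(\omega_\perp\times r)$, a vector of length $|r|^2|\omega_\perp|$ pointing along $\omega_\perp$, i.e. $r^2\omega_\perp$ as well.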

This is also a fundamentally 3D statement, since the cross product of two vectors is only defined there.

A slightly different proof: regard the left- and right-hand sides as functions of $\omega$. Clearly both are linear, so we may restrict to the case where $\omega$ is a unit vector. A similar observation (both sides are quadratic in $r$, and both vanish when $r = 0$) reduces us to the case where $r$ is also a unit vector. Fix $r$, and consider three independent possibilities for $\omega$:

  1. $\omega = r$; in this case both sides are zero, and equality holds.

  2. $\omega$ is a unit vector $v$ perpendicular to $r$. In this case the rightmost term is zero, and a simple geometric argument (spelled out after this proof) shows that the left-hand side equals the first term on the right.

  3. $\omega$ is $v \times r$; then $\omega$ is again perpendicular to $r$, so the same argument holds.

Since both sides are linear functions of $\omega$ and they agree on a basis, they agree everywhere.
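Here is one way to spell out the geometric argument in case 2 (my reading of it, using nothing beyond the right-hand rule): with $|r| = |v| = 1$ and $v \perp r$, the vector $v \times r$ is a unit vector perpendicular to both, so $r \times (v \times r)$ is a unit vector perpendicular to $r$ and to $v \times r$, hence equal to $\pm v$; checking orientation (e.g. $r = \mathbf e_3$, $v = \mathbf e_1$ gives $v \times r = -\mathbf e_2$ and $\mathbf e_3 \times (-\mathbf e_2) = \mathbf e_1 = v$) yields $$ r \times (v \times r) = v = |r|^2\, v - (v \cdot r)\, r, $$ which is the identity in this case.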

Not a coordinate in sight! :)

A final proof: cross products and dot products are both invariant under rotations, so if the equality holds for $\omega_0$ and $r_0$, it also holds for $\omega = R \omega_0$ and $r = R r_0$, where $R$ is any rotation.

Now to prove the theorem, assume (as in the reduction above) that $r$ is a unit vector, and let $R$ be a rotation that takes $\mathbf e_3$ to $r$ and some vector $v$ in the $\mathbf e_1\mathbf e_3$ plane to $\omega$, say $v = x\mathbf e_1 + z\mathbf e_3$. We'll show the theorem's true with $\mathbf e_3$ and $v$ playing the roles of $r$ and $\omega$, and be done.

In this case, $$ \text{LHS} = \mathbf e_3 \times \big( (x \mathbf e_1 + z \mathbf e_3) \times \mathbf e_3 \big) = \mathbf e_3 \times ( -x \mathbf e_2 ) = x \mathbf e_1, $$ while $$ \text{RHS} = 1 \cdot (x \mathbf e_1 + z \mathbf e_3) - \big(\mathbf e_3 \cdot (x \mathbf e_1 + z \mathbf e_3)\big) \mathbf e_3 = (x \mathbf e_1 + z \mathbf e_3) - z \mathbf e_3 = x \mathbf e_1, $$ and we see the two sides are equal.

John Hughes
  • 93,729
  • Thanks! I'll try to calculate it. Seems pretty easy now. I should have known that the cross-product is only defined in 3D. – qmd Dec 07 '14 at 19:55
  • The cross product of two vectors is only defined in 3D; in general, the cross product in $n$ dimensions is defined for $n-1$ vectors, so that $\times \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} -y \\ x \end{bmatrix}$, and the cross product in $\mathbb R^4$ takes three vectors. A nice way to define it is to say that for any $u$, $u \cdot \left({\Large \times}_{i = 1}^{n-1} v_i\right) = \det A$, where $A$ is the matrix with rows $v_1, \ldots, v_{n-1}, u$. (You have to prove that there is such a vector, of course!) – John Hughes Dec 07 '14 at 21:11
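To illustrate the determinant definition in the last comment, here is a small sympy sketch; the helper name `generalized_cross` is mine:

```python
import sympy as sp

def generalized_cross(*vectors):
    """Cross product of n-1 vectors in R^n: the j-th component is the
    determinant of the matrix with rows v_1, ..., v_{n-1}, e_j, so that
    u . cross(v_1, ..., v_{n-1}) = det of the matrix with rows v_1, ..., v_{n-1}, u."""
    n = len(vectors) + 1
    rows = [list(v) for v in vectors]
    components = []
    for j in range(n):
        e_j = [1 if k == j else 0 for k in range(n)]
        components.append(sp.Matrix(rows + [e_j]).det())
    return sp.Matrix(components)

print(generalized_cross([1, 0]).T)                # Matrix([[0, 1]]): (x, y) -> (-y, x)
print(generalized_cross([1, 0, 0], [0, 1, 0]).T)  # Matrix([[0, 0, 1]]): e1 x e2 = e3
```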
1

$$\bar{w} \times \bar{r} = \det \begin{pmatrix} \bar{i} & \bar{j} & \bar{k} \\ w_1 & w_2 & w_3 \\ r_1 & r_2 & r_3 \end{pmatrix} = \bar{i} (w_2 r_3 - w_3 r_2) - \bar{j} (w_1 r_3 - w_3 r_1) + \bar{k} (w_1 r_2 - w_2 r_1).$$ Next, compute $\bar{r} \times (\bar{w} \times \bar{r})$, which is $$\det \begin{pmatrix} \bar{i} & \bar{j} & \bar{k} \\ r_1 & r_2 & r_3 \\ w_2 r_3-w_3 r_2 & w_3 r_1 - w_1 r_3 & w_1 r_2 - w_2 r_1 \end{pmatrix}.$$ I won't expand it further. Now let's compute the right-hand side: it is a vector whose $i$-th coordinate equals $$w_i (r_1^2 + r_2^2+r_3^2) - r_i (w_1 r_1 + w_2 r_2 + w_3 r_3),$$ which is exactly the $i$-th coordinate of the vector we got above (once you expand the determinant).
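For concreteness, here is the expansion the answer leaves to the reader, done for the first ($\bar{i}$) component; the other two follow by cyclically permuting the indices: $$ r_2(w_1 r_2 - w_2 r_1) - r_3(w_3 r_1 - w_1 r_3) = w_1(r_2^2 + r_3^2) - r_1(w_2 r_2 + w_3 r_3) = w_1(r_1^2 + r_2^2 + r_3^2) - r_1(w_1 r_1 + w_2 r_2 + w_3 r_3), $$ where the last step adds and subtracts $w_1 r_1^2$. This matches the first coordinate of the right-hand side above.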

mathisfun
  • 319