
Note: This is a homework question.

After pages of attempts and failures, here I am. First, I will present the question and then state what I have tried.

The question:

Let $u$ be a non-zero vector in $\mathbb{R}^n$. Let $L = \mbox{Span}\{u\}$. Define a map, $R:\mathbb{R}^n\rightarrow\mathbb{R}^n$ by

$$R(x) = 2\operatorname{proj}_L x - x$$

for $x\in\mathbb{R}^n$.

Show that $R$ can be represented by an orthogonal matrix $Q$, state what the matrix is and show that it is orthogonal. (This matrix $Q$ will involve the vector $u$ and the identity matrix $I$.)

Attempted solution:

Since $\{u\}$ is a basis for $L$, we can rewrite $\operatorname{proj}_L x$ as $\operatorname{proj}_u x$. Then, since this seems like a problem dealing with a reflection over $L$, I used the following reflection matrix:

$$\begin{bmatrix} \cos(\theta)&\sin(\theta)\\ \sin(\theta)&-\cos(\theta)\\ \end{bmatrix}$$

Where:

$\cos(\theta) = \frac{x\cdot \operatorname{proj}_u x}{\|x\|\,\|\operatorname{proj}_u x\|} = \frac{x\cdot\frac{\langle u,x\rangle}{\langle u,u\rangle}u}{\|x\|\,\left\|\frac{\langle u,x\rangle}{\langle u,u\rangle}u\right\|} = \frac{x}{\|x\|}\cdot\frac{\langle u,x\rangle u}{\|\langle u,x\rangle u\|}$

$\sin(\theta) = \sqrt{1-\left(\frac{x\cdot \operatorname{proj}_u x}{\|x\|\,\|\operatorname{proj}_u x\|}\right)^2} = \sqrt{1-\left(\frac{x\cdot\frac{\langle u,x\rangle}{\langle u,u\rangle}u}{\|x\|\,\left\|\frac{\langle u,x\rangle}{\langle u,u\rangle}u\right\|}\right)^2} = \sqrt{1-\left(\frac{x}{\|x\|}\cdot\frac{\langle u,x\rangle u}{\|\langle u,x\rangle u\|}\right)^2}$

However, once I get here, I get a bit lost in how I might be able to continue, especially since the problem states that $Q$ will involve the identity matrix $I$. I feel there should be a better way of approaching this but I certainly can't come up with anything.

Thank you for any help you may be able to provide.

John
  • The mapping $x\mapsto x - 2\operatorname{proj}_L(x)$ is the reflection with respect to the hyperplane that has $u$ as its normal vector (see e.g. here). Reflections preserve angles and lengths, so they are represented by orthogonal matrices. Your mapping is the negative of this one, and thus orthogonal as a composition of two orthogonal maps. – Jyrki Lahtonen Oct 02 '14 at 09:31

5 Answers


Assume $x \in \mathbb{R}^{n}\setminus\{0\}$. Let $l$ be the line $\{ t x : t \in\mathbb{R}\}$. The closest point projection of $y$ onto the line $l$ is the unique point $\alpha x$ (where $\alpha \in\mathbb{R}$) such that $(y-\alpha x)\perp x$. Using inner-product $(\cdot,\cdot)$, this gives $\alpha(x,x)=(y,x)$, and $$ Py = \frac{(y,x)}{(x,x)}x. $$ The vector from $y$ to the projection $Py$ onto the line $l$ is orthogonal to the line. So $(y-Py)\perp Py$, which gives the orthogonal decomposition $y=(y-Py)+Py$. So, by the Pythagorean Theorem, $$ \begin{align} \|y\|^{2} & = \|(y-Py)+Py\|^{2} \\ & = \|y-Py\|^{2}+\|Py\|^{2} \\ & = \|(Py-y)+Py\|^{2}=\|(2P-I)y\|^{2}. \end{align} $$ Your operator is $A=2P-I$, and the above shows that this operator is isometric and, therefore, has an orthogonal matrix representation. This operator is $I$ on the one-dimensional space spanned by $x$ and is $-I$ on the vectors which are orthogonal to $x$.

You can write $A=UDU^{T}$ where $U$ is orthogonal and $D$ is the diagonal matrix with a $1$ in the upper left corner and $-1$'s on the rest of the diagonal, provided you choose the first column of $U$ to be $x/\|x\|$, and the remaining columns of $U$ to be an orthonormal basis of the vectors orthogonal to $x$.
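The properties of $A=2P-I$ derived above can be checked numerically; this is a minimal sketch in NumPy (an added illustration, not part of the original answer), with a random $x$ standing in for the vector spanning the line:

```python
import numpy as np

# Sketch: A = 2P - I for the rank-one projection P y = ((y,x)/(x,x)) x.
rng = np.random.default_rng(0)
x = rng.standard_normal(5)

P = np.outer(x, x) / np.dot(x, x)   # matrix of the projection onto span{x}
A = 2 * P - np.eye(5)

assert np.allclose(A.T @ A, np.eye(5))   # A is orthogonal (isometric)
assert np.allclose(A @ x, x)             # A acts as I on span{x}

v = rng.standard_normal(5)
v -= P @ v                               # remove the component along x
assert np.allclose(A @ v, -v)            # A acts as -I on the orthogonal complement
```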

Disintegrating By Parts
  • @T.A.E., I give you $x=[1,2,5,7]^T$. Can you obtain $A$ in a few seconds? If not, have a look at my post below. The most important fact is that the matrix $A$ does not depend on the chosen orthonormal basis of the orthogonal complement of $x$. –  Oct 05 '14 at 09:37
  • @loupblanc : For any orthogonal projection $P$--regardless of the dimension of the range of $P$--the argument above shows $A=2P-I$ is orthogonal. On the range of $P$, one has $P=I$ and, so, $A=I$; on the orthogonal complement of the range of $P$, one has $P=0$ and hence $A=-I$. You shouldn't be surprised that the representation of the identity operator on a subspace does not vary with how you choose a basis for the subspace. The most important thing for the splitting is choosing an orthonormal basis where each basis element is either in the range of $P$ or in its orthogonal complement. – Disintegrating By Parts Oct 05 '14 at 11:07
  • @T.A.E. I agree with the first part of your comment, and that is exactly why your method is not practical: in particular, no choice of orthonormal basis needs to be made at all. Read my post below. –  Oct 05 '14 at 11:36

Just restating Jyrki's comment as an answer with a memorable name: this is (the negative of) a reflection in the hyperplane normal to $u$, and its matrix is the negated Householder matrix, which is easily shown to be orthogonal.
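A quick numeric sketch of this identification (an added illustration, not from the original answer): $R = 2uu^T/\|u\|^2 - I$ is exactly $-H$, where $H = I - 2uu^T/\|u\|^2$ is the Householder reflector with normal $u$.

```python
import numpy as np

# Sketch: R is the negated Householder reflector -H, hence orthogonal.
rng = np.random.default_rng(42)
u = rng.standard_normal(5)

P = np.outer(u, u) / np.dot(u, u)   # orthogonal projection onto span{u}
R = 2 * P - np.eye(5)
H = np.eye(5) - 2 * P               # Householder reflector with normal u

assert np.allclose(R, -H)                # R is the negated Householder matrix
assert np.allclose(R.T @ R, np.eye(5))   # and therefore orthogonal
```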

rych

I am not sure how you are defining your $proj$, but choose a basis that starts with $u$ and whose other elements are orthogonal to $u$. Then the matrix in this basis is

$$\begin{pmatrix} 1&0&0\\ 0&-1&0\\ 0&0&-1\\ \end{pmatrix}$$

  • This matrix is not orthogonal, and you also seem to assume that $n=3$. – Santiago Canez Oct 02 '14 at 01:16
  • @SantiagoCanez $n=3$ is just for ease of writing. – Rene Schipperus Oct 02 '14 at 01:20
  • @SantiagoCanez Oh, I see what you mean; it was just a typo. Now it is orthogonal. – Rene Schipperus Oct 02 '14 at 01:24
  • Thank you Rene but I'm not sure how you came up with the matrix, nor how I would generalize it to $\mathbb{R}^n$. – John Oct 02 '14 at 01:57
  • In higher dimensions there are just more $-1$'s. Proj fixes the first basis element $u$ and maps all the other elements of the basis to zero; you can now calculate how your function operates. – Rene Schipperus Oct 02 '14 at 01:59
  • Hello Rene, thanks again but I'm afraid I'm completely lost on this still and have absolutely no idea how to go about it. I'm sure I'm missing something somewhat trivial. Any extra help would be really appreciated. – John Oct 02 '14 at 02:54

Let $y=R(x)=2\operatorname{proj}_L(x)-x$. Then $(x+y)/2=\operatorname{proj}_L(x)$, and $R$ is the orthogonal symmetry with respect to $span(u)$ (in the direction of $orthog(u)$). Finally $R\in O(n)$ and $\det(R)=(-1)^{n-1}$ (because $R=id$ on $span(u)$ and $R=-id$ on $orthog(u)$).

EDIT 1: if you want an explicit form: Let $u=[u_1,\cdots,u_n]^T$, where $u$ is assumed to be a unit vector. Then $R=2P-I$ where $P=[p_{i,j}]$ with $p_{i,j}=u_iu_j$, i.e. $P=uu^T$. Note that $R$ is a symmetric matrix.

EDIT 2: for instance, let $u=[1,2,5,7]^T$. Then $R=2P-I$ where $P=\dfrac{1}{||u||^2}\begin{pmatrix}1&2&5&7\\2&4&10&14\\5&10&25&35\\7&14&35&49\end{pmatrix}$, with $||u||^2=1+4+25+49=79$.
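This explicit formula can be verified numerically for the example $u=[1,2,5,7]^T$; the following NumPy sketch (an added check, not part of the original answer) confirms the properties claimed above:

```python
import numpy as np

# Check of the explicit formula R = 2P - I with P = u u^T / ||u||^2
# (the division by ||u||^2 makes the formula valid for non-unit u).
u = np.array([1.0, 2.0, 5.0, 7.0])
P = np.outer(u, u) / np.dot(u, u)   # here u . u = 79
R = 2 * P - np.eye(4)

assert np.allclose(R.T @ R, np.eye(4))        # R is orthogonal
assert np.allclose(R, R.T)                    # R is symmetric
assert np.allclose(R @ u, u)                  # R fixes span{u}
assert np.isclose(np.linalg.det(R), -1.0)     # det(R) = (-1)^(n-1), n = 4
```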


Let $M$ be the orthogonal complement of the one-dimensional subspace $L$. Then ${\rm dim}(M)=n-1$.

Any $x\in{\mathbb R}^n$ can be decomposed in a unique way as $x=x'+x''$ with $x'\in L$, $x''\in M$. Furthermore one has $$R(x')=2{\rm proj}_L(x') -x'=2x'-x'={\rm id}(x')\qquad(x'\in L)$$ and $$R(x'')=2{\rm proj}_L(x'') -x''=0-x''=-{\rm id}(x'')\qquad(x''\in M)\ .$$ With respect to an orthonormal basis of ${\mathbb R}^n$ with $e_1\in L$ and $e_k\in M$ $\>(2\leq k\leq n)$ the map $R$ therefore has the matrix $D:={\rm diag}(1,-1,-1,\ldots, -1)$, which is certainly orthogonal. With respect to the standard basis of ${\mathbb R}^n$ the matrix of $R$ is then given by $$[R]=T\>D\>T'\ ,$$ where $T'$ denotes the transpose of $T$, and the standard coordinates of the vectors $e_i$ are written in the columns of $T$. Since $T$ is also an orthogonal matrix, so is $[R]$.
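The change-of-basis construction above can be sketched numerically. In this added illustration, the orthonormal basis with $e_1\in L$ is produced via a QR factorization (an assumption of this sketch, not part of the answer), and the result is compared with the closed form $2uu^T/\|u\|^2 - I$:

```python
import numpy as np

# Sketch: build T column-by-column from an orthonormal basis whose first
# vector spans L = span{u}, then form [R] = T D T^T.
rng = np.random.default_rng(1)
n = 6
u = rng.standard_normal(n)

# QR of a matrix whose first column is u yields an orthonormal basis whose
# first vector is parallel to u (up to sign, which does not affect T D T^T).
M = np.column_stack([u, rng.standard_normal((n, n - 1))])
T, _ = np.linalg.qr(M)

D = np.diag([1.0] + [-1.0] * (n - 1))
R = T @ D @ T.T

# The result agrees with the basis-independent closed form.
R_closed = 2 * np.outer(u, u) / np.dot(u, u) - np.eye(n)
assert np.allclose(R, R_closed)
assert np.allclose(R.T @ R, np.eye(n))   # [R] is orthogonal
```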