
Let $\langle \cdot , \cdot\rangle :\mathbb{R}^n\times\mathbb{R}^n \rightarrow \mathbb{R}$ denote the standard inner product on $\mathbb{R}^n$.

For any nonzero $w\in\mathbb{R}^n$, define $T_w:\mathbb{R}^n\rightarrow\mathbb{R}^n$ by

$$T_w(v)=v-\frac{2 \langle v,w \rangle }{\langle w,w \rangle}w, \quad \text{for } v\in\mathbb{R}^n.$$ Which of the following are true?

1) $\det(T_w)=1$

2) $\langle T_w(v_1),T_w(v_2)\rangle = \langle v_1,v_2 \rangle$ for all $v_1,v_2\in \mathbb{R}^n$

3) $T_w=T_w^{-1}$

4) $T_{2w}=2T_w$

The answers, according to the solution manual, are 2) and 3).

I have no idea how one can connect the inner product and the determinant. Help, please!

  • What are $det$ and the inverse "$^{-1}$" defined in 1) and 3)? – Math-fun Jun 18 '19 at 14:00
  • When you wrote "stranded inner product" did you mean "standard inner product"? – J. W. Tanner Jun 18 '19 at 14:01
  • Option 1) you can see the det .. in 3 the inverse is for w –  Jun 18 '19 at 14:02
  • Related : https://math.stackexchange.com/questions/1537104/let-u-be-a-real-n-times-1-vector-satisying-ut-u-1-and-a-i-2uut-t – Arnaud D. Jun 18 '19 at 14:04
  • @Math-fun I'd say the determinant is just the determinant of any matrix representation. Then you can probably show that it's bijective and so the inverse is well defined. – Paulo Mourão Jun 18 '19 at 14:07
  • The case of the determinant is also treated here : https://math.stackexchange.com/questions/504199/prove-that-the-determinant-of-a-householder-matrix-is-1 – Arnaud D. Jun 18 '19 at 14:10

2 Answers


Let $e_1=w$ and let $\{e_2,\ldots,e_n\}$ be a basis of $w^\perp$. Then $T_w(e_1)=-e_1$ and $T_w(e_k)=e_k$ for $k>1$. Therefore, the matrix of $T_w$ with respect to the basis $\{e_1,e_2,\ldots,e_n\}$ is$$\begin{bmatrix}-1&0&0&\ldots&0\\0&1&0&\ldots&0\\0&0&1&\ldots&0\\\vdots&\vdots&\vdots&\ddots&\vdots\\0&0&0&\ldots&1\end{bmatrix},$$whose determinant is $-1$.
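A quick numeric sanity check of this determinant: build the matrix of $T_w$ in the standard basis of $\mathbb{R}^3$ for a concrete $w$ and compute its determinant. (A minimal pure-Python sketch; the helper names are mine, not from the answer.)

```python
def inner(u, v):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def reflect(w, v):
    """T_w(v) = v - 2<v,w>/<w,w> * w."""
    c = 2 * inner(v, w) / inner(w, w)
    return [vi - c * wi for vi, wi in zip(v, w)]

def det3(m):
    """Determinant of a 3x3 matrix (list of rows), by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

w = [1.0, 2.0, 2.0]
# The columns of the matrix of T_w are the images of the standard basis vectors.
cols = [reflect(w, e) for e in ([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0])]
M = [[cols[j][i] for j in range(3)] for i in range(3)]
print(det3(M))  # close to -1.0, up to floating-point rounding
```

Any other nonzero choice of $w$ gives the same determinant, which is the basis-independence the answer exploits.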


Recall that $\frac{\langle v, w \rangle}{\langle w, w \rangle} w$ is the projection of $v$ onto $w$ (or, more precisely, onto the line spanned by $w$). Therefore, $v - \frac{\langle v, w \rangle}{\langle w, w \rangle} w$ is the component of $v$ orthogonal to $w$; in particular, it is the projection of $v$ onto $\operatorname{span}(w)^\perp$. Subtracting $\frac{\langle v, w \rangle}{\langle w, w \rangle} w$ from $v$ takes us to this projected point, so subtracting the same vector once more takes us to the reflection of $v$ in the hyperplane $\operatorname{span}(w)^\perp$. Geometrically speaking, this is what $T_w$ does.

This tells us everything we need to know in the question.

1) The determinant must always be $-1$. A reflection has two eigenvalues: $1$ and $-1$. The eigenspace corresponding to $1$ is the "mirror" (in this case, $\operatorname{span}(w)^\perp$; note its dimension is $n - 1$). The eigenspace corresponding to $-1$ is the perpendicular complement to the "mirror" (in this case, $\operatorname{span}(w)$, of dimension $1$). Thus, we have the eigenvalue $1$ with multiplicity $n - 1$, and the eigenvalue $-1$ with multiplicity $1$. Thus, the determinant is $$\operatorname{det}(T_w) = 1^{n - 1}(-1)^1 = -1.$$

2) Reflections preserve distances, and hence inner products (well, when they're orthogonal, like this one).
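For completeness, the algebraic check behind 2) is short: expand by bilinearity of the inner product,

$$\begin{aligned}\langle T_w(v_1),T_w(v_2)\rangle &= \left\langle v_1-\frac{2\langle v_1,w\rangle}{\langle w,w\rangle}w,\; v_2-\frac{2\langle v_2,w\rangle}{\langle w,w\rangle}w\right\rangle\\ &= \langle v_1,v_2\rangle - \frac{2\langle v_1,w\rangle\langle v_2,w\rangle}{\langle w,w\rangle} - \frac{2\langle v_1,w\rangle\langle v_2,w\rangle}{\langle w,w\rangle} + \frac{4\langle v_1,w\rangle\langle v_2,w\rangle}{\langle w,w\rangle^2}\langle w,w\rangle\\ &= \langle v_1,v_2\rangle,\end{aligned}$$

since the last three terms cancel ($-2-2+4=0$).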

3) Reflections are their own inverse; undoing a reflection is as simple as doing it again.
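Algebraically: since $\langle T_w(v),w\rangle = \langle v,w\rangle - \frac{2\langle v,w\rangle}{\langle w,w\rangle}\langle w,w\rangle = -\langle v,w\rangle$, applying $T_w$ twice gives

$$T_w(T_w(v)) = T_w(v) - \frac{2\langle T_w(v),w\rangle}{\langle w,w\rangle}w = v - \frac{2\langle v,w\rangle}{\langle w,w\rangle}w + \frac{2\langle v,w\rangle}{\langle w,w\rangle}w = v.$$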

4) Since $w$ is used only as a normal vector for the hyperplane mirror $\operatorname{span}(w)^\perp$, scaling $w$ by $2$ changes nothing: $\operatorname{span}(w) = \operatorname{span}(2w)$, so $T_{2w} = T_w$. Note that no factor of $2$ appears.
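This can also be read off the formula directly; every factor of $2$ cancels:

$$T_{2w}(v) = v - \frac{2\langle v,2w\rangle}{\langle 2w,2w\rangle}\,2w = v - \frac{8\langle v,w\rangle}{4\langle w,w\rangle}\,w = v - \frac{2\langle v,w\rangle}{\langle w,w\rangle}\,w = T_w(v).$$

The same computation shows $T_{\lambda w} = T_w$ for any scalar $\lambda \neq 0$.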

These can all be verified algebraically as well (and arguably more simply), but I think it's instructive to have the geometric perspective too.

Theo Bendit
  • 50,900