
I'm studying a problem from linear algebra:

For a bilinear form $b: V \times V \longrightarrow \mathbb{R}$ and a subspace $U$ of $V$ we set $U^{\perp}:=\{v \in V \mid b(u, v)=0 \text { for all } u \in U\}.$ Now let $A=\left(\begin{array}{cc}1 & -1 \\ -1 & 1\end{array}\right)$. We consider the bilinear form $b: \mathbb{R}^{2} \times \mathbb{R}^{2} \longrightarrow \mathbb{R}, \quad(x, y) \mapsto x^{t} A y$. Prove or disprove (by giving a counterexample): for every $\mathbb{R}$-subspace $U$ of $\mathbb{R}^{2}$ we have $\left(U^{\perp}\right)^{\perp}=U$.

I expected this to be true, since we are only dealing with finite-dimensional vector spaces.

Here is my proof:

Let $u \in U$. Since $A$ is symmetric, $b$ is a symmetric bilinear form, so $b(u, w)=0$ for all $w \in U^{\perp}$, and hence $u \in U^{\perp \perp}$.

Consequently $ U \subseteq U^{\perp \perp} $.

Since we know that $V$ is the direct sum of $U$ and $U^{\perp}$, we get $$\operatorname{dim}(U) = \operatorname{dim}(V)-\operatorname{dim}\left(U^{\perp}\right) =\operatorname{dim}(V)-\left(\operatorname{dim}(V)-\operatorname{dim}\left(U^{\perp \perp}\right)\right)=\operatorname{dim} \left(U^{\perp \perp}\right),$$ and it follows that $U=U^{\perp \perp}$.

  • Your assumption that $V$ is a direct sum of $U$ and $U^\perp$ is not true for every $U$. One hint I can give is that the matrix $A$ defining the form is singular. You might think about what happens for a subspace spanned by a vector in the kernel of $A$ (e.g. (1,1)) – Sam Ballas Oct 05 '23 at 11:25
  • @SamBallas Let $U = \text{span}\left(\begin{bmatrix} 1 \\ 1 \end{bmatrix}\right)$. By the definition $U^{\perp}:=\{v \in V \mid b(u, v)=0 \text { for all } u \in U\}$ I think that $U^{\perp} = \text{span}\left(\begin{bmatrix} -1 \\ 1 \end{bmatrix}\right)$ and $\left(U^{\perp}\right)^{\perp} = \left(\text{span}\left(\begin{bmatrix} -1 \\ 1 \end{bmatrix}\right)\right)^{\perp} = \text{span}\left(\begin{bmatrix} 1 \\ -1 \end{bmatrix}\right) \neq U$ – Marius Lutter Oct 05 '23 at 14:17
  • I found this problem referring to my question https://math.stackexchange.com/questions/636517/is-it-true-that-the-whole-space-is-the-direct-sum-of-a-subspace-and-its-orthogon

    But I want to use the example given in the task

    – Marius Lutter Oct 05 '23 at 14:18
  • Not quite. In this case $U^\perp=\mathbb{R}^2$, since it contains the vectors $(1,-1)$ AND $(1,1)$. In fact the vector $(1,1)$ is in the kernel of $A$ and so it is orthogonal to EVERY vector in $\mathbb{R}^2$, so whatever $(U^\perp)^\perp$ is it will contain $(1,1)$. – Sam Ballas Oct 05 '23 at 15:57
  • @SamBallas Right! Just posted my answer, let me know what you think – Marius Lutter Oct 06 '23 at 08:06

1 Answer


Disproof!

The specific bilinear form you propose allows for the canonical counterexample $\,U = \{0\}\,$, since the defining matrix $A$ is singular, which means the associated bilinear form is degenerate.
We always have $\,U^\perp = \{0\}^\perp = V$, but here $$\left(U^{\perp}\right )^{\perp}\;=\; V^\perp\;=\; \langle{1\choose 1}\rangle \;\neq\;\{0\}=U.$$ Indeed, a vector $v$ lies in $V^\perp$ exactly when $u^t A v = 0$ for all $u$, i.e. when $Av = 0$, so $V^\perp$ is the kernel of $A$. Going into more detail:
The defining matrix $A$ has the eigenvalues $\{0,2\}$ with respective (non-normalised) eigenvectors ${1\choose 1}$ and ${1\choose -1}\,$. As a statement about diagonalisation of $A$ this may be written as $$\begin{pmatrix}1 & -1 \\ -1 & 1\end{pmatrix} \;\frac1{\sqrt 2}\begin{pmatrix}1& 1\\ 1& -1\end{pmatrix} \;=\; \frac1{\sqrt 2}\begin{pmatrix}1& 1\\ 1& -1\end{pmatrix} \begin{pmatrix} 0& 0\\ 0& 2\end{pmatrix}$$
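Both the counterexample and the diagonalisation can be double-checked numerically; the following is a small sanity-check sketch using numpy:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

# b(u, v) = u^T A v, so v is orthogonal to ALL of R^2
# exactly when A v = 0, i.e. when v lies in ker A.
# The kernel is spanned by (1, 1):
assert np.allclose(A @ np.array([1.0, 1.0]), 0)

# Check the eigenvalues {0, 2} and the diagonalisation A P = P D,
# with the (normalised) eigenvectors as columns of P:
P = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
D = np.diag([0.0, 2.0])
assert np.allclose(A @ P, P @ D)

print(np.sort(np.linalg.eigvalsh(A)))  # [0. 2.]
```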

Regarding your proof attempt:
Your first conclusion $U\subseteq U^{\perp \perp}$ is true.

But if the form is degenerate, $V$ is in general not the direct sum of a subspace and its orthogonal complement; here, for example, $V$ fails to be the direct sum of $U^{\perp}$ and $U^{\perp\perp}\,$.
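For the concrete counterexample $U = \{0\}$, the failure of the direct-sum decomposition can also be seen from a dimension count; a short numpy sketch:

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

# For U = {0}: U^perp is all of R^2, so dim(U^perp) = 2.
dim_U_perp = 2

# (U^perp)^perp = V^perp = ker A; its dimension follows from the rank.
dim_ker = 2 - np.linalg.matrix_rank(A)  # = 1

# A direct sum would require the dimensions to add up to dim V = 2,
# but they sum to 3 -- indeed (1,1) lies in both subspaces.
print(dim_U_perp + dim_ker)  # 3
```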

Hanno