6

The book Linear and Geometric Algebra explains the following theorem in a way that I haven't been able to figure out:

If $\mathbf{A}$ and $\mathbf{B}$ are subspaces of a vector space $\mathbf{V}$, then the set of all sums $\mathbf{a} + \mathbf{b}$ such that $\mathbf{a} \in \mathbf{A}$ and $\mathbf{b} \in \mathbf{B}$ is called the span of $\mathbf{A}$ and $\mathbf{B}$, written $\text{span}(\mathbf{A}, \mathbf{B})$.

Furthermore, let $\mathbf{U}^{\bot}$ be the subspace consisting of all vectors orthogonal to a subspace $\mathbf{U}$, in the sense that $\mathbf{u} \in \mathbf{U}^{\bot}$ if and only if $\mathbf{u} \perp \mathbf{v}$ for all vectors $\mathbf{v} \in \mathbf{U}$.

I have of course been able to prove that $\mathbf{U}^{\bot}$ is indeed a subspace for all subspaces $\mathbf{U}$, if this turns out to be useful.

The theorem I want to prove is: if $\mathbf{U}$ is a subspace of $\mathbf{V}$ then $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) = \mathbf{V}$.

The book mentioned above proves it as follows: If $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) \neq \mathbf{V}$ then there is a nonzero $\mathbf{u} \perp \text{span}(\mathbf{U}, \mathbf{U}^{\perp})$ because any orthonormal basis for a subspace of an inner product space can be extended into an orthonormal basis for the entire inner product space. In particular $\mathbf{u} \perp \mathbf{U}$, i.e. $\mathbf{u} \in \mathbf{U}^{\perp}$, a contradiction.

I understand how an orthonormal basis for a subspace of an inner product space can be extended to an orthonormal basis for the whole inner product space, essentially using Gram-Schmidt orthogonalisation. What I don't understand is how this process lets you go from $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) \neq \mathbf{V}$ to the existence of a nonzero $\mathbf{u} \in \mathbf{V}$ with $\mathbf{u} \perp \text{span}(\mathbf{U}, \mathbf{U}^{\perp})$. So my question is: how does this implication work?
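To make the extension step concrete, here is a small pure-Python sketch of my understanding (a toy example in $\mathbb{R}^3$; the helper names `extend_orthonormal`, `residual` etc. are my own, not from the book). Starting from an orthonormal basis of a proper subspace $W$ and feeding in extra candidate vectors, Gram-Schmidt appends new unit vectors that are orthogonal to everything already in the list, which is exactly why the appended vector is orthogonal to $W$ by construction:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def sub(x, y):
    return [a - b for a, b in zip(x, y)]

def scale(c, x):
    return [c * a for a in x]

def norm(x):
    return dot(x, x) ** 0.5

def extend_orthonormal(basis, candidates, tol=1e-10):
    """Extend an orthonormal list `basis` to a larger orthonormal list
    using `candidates` (the Gram-Schmidt process)."""
    result = list(basis)
    for v in candidates:
        w = list(v)
        for e in result:           # subtract the component lying in span(result)
            w = sub(w, scale(dot(w, e), e))
        if norm(w) > tol:          # v contributed something new: normalise, keep
            result.append(scale(1.0 / norm(w), w))
    return result

# W = span{(1,0,0), (0,1,0)} is a proper subspace of R^3.
W = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
standard = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
full = extend_orthonormal(W, standard)

u = full[2]                        # the appended vector, e_{n+1} in the proof
print(u)                           # → [0.0, 0.0, 1.0]
print([dot(u, e) for e in W])      # → [0.0, 0.0]: orthogonal to all of W
```

If `W` here were $\text{span}(\mathbf{U}, \mathbf{U}^{\perp})$ and it were a proper subspace, the same construction would produce exactly the nonzero orthogonal $\mathbf{u}$ the book's proof needs.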

11Kilobytes
  • 1,099
  • Possible duplicate of http://math.stackexchange.com/questions/878438/finite-dimensional-subspaces-of-inner-product-spaces-are-orthogonally-complement – Surb Aug 13 '14 at 13:34
  • 1
    It seems to me that such statement holds only if $U$ is closed -- this is satisfied if everything is finite dimensional, for example.. – Peter Franek Aug 13 '14 at 13:35
  • Here's a counter-example for infinite dimensional spaces: http://math.stackexchange.com/questions/636517/is-it-true-that-the-whole-space-is-the-direct-sum-of-a-subspace-and-its-orthogon – Surb Aug 13 '14 at 13:37
  • Sure, the book I'm using is only on finite dimensional vector spaces. – 11Kilobytes Aug 13 '14 at 13:37
  • I don't think this question is a duplicate of that one because the proof in question is quite different, that one is constructive, this one is a proof by contradiction. – 11Kilobytes Aug 13 '14 at 13:41

2 Answers

4

You pick an orthonormal basis for $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) \neq \mathbf{V}$, say $e_j$, $1\leq j\leq n$. Extend this to an orthonormal basis for $\mathbf{V}$, $e_j$, $1\leq j\leq m$. Since $\text{span}(\mathbf{U}, \mathbf{U}^{\perp}) \neq \mathbf{V}$, $n<m$. But then $u=e_{n+1}\perp\text{span}(\mathbf{U}, \mathbf{U}^{\perp})$, by construction, and thus $u\perp \mathbf{U}$, i.e. $u\in \mathbf{U}^\perp$, and thus $u\perp u$, i.e. $\|u\|=0$. This is a contradiction, since $\|u\|=1$.
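A numerical illustration of why the contradiction bites (my own toy example, not from the answer): in $\mathbb{R}^3$ take $U = \text{span}\{(1,1,0)\}$, so $U^\perp = \text{span}\{(1,-1,0), (0,0,1)\}$. Any candidate $u$ orthogonal to $\text{span}(U, U^\perp)$ has zero Gram-Schmidt residual against that span, so the only such $u$ is $\mathbf{0}$, i.e. no unit vector $e_{n+1}$ can exist and in fact $n = m$:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def residual(v, ortho_basis):
    """Component of v orthogonal to span(ortho_basis) (one Gram-Schmidt pass;
    the basis vectors must be pairwise orthogonal, not necessarily unit)."""
    w = list(v)
    for e in ortho_basis:
        c = dot(w, e) / dot(e, e)
        w = [wi - c * ei for wi, ei in zip(w, e)]
    return w

# U = span{(1,1,0)} and U^perp = span{(1,-1,0), (0,0,1)} together:
span_U_Uperp = [[1.0, 1.0, 0.0], [1.0, -1.0, 0.0], [0.0, 0.0, 1.0]]

# Every candidate's residual against span(U, U^perp) is (numerically) zero,
# so only u = 0 is orthogonal to it: span(U, U^perp) is already all of R^3.
for v in [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [2.0, -3.0, 5.0]]:
    r = residual(v, span_U_Uperp)
    print(dot(r, r) ** 0.5 < 1e-9)   # → True for every candidate
```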

0

Note that if $A$ is a matrix whose rows span $U$, then $\mathrm{Null}(A) = U^\perp$. By the rank-nullity theorem, $\mathrm{rank}(A) + \dim \mathrm{Null}(A) = n$, i.e. $\dim U + \dim U^\perp = n$, where $n$ is the dimension of $\mathbb{R}^n$.

Now use basis extension to combine a basis for $U$ with a basis for $U^\perp$. Since $U + U^\perp$ is a subspace of $\mathbb{R}^n$, and the combined list has $n$ vectors (by rank-nullity above) which are linearly independent because $U \cap U^\perp = \{\mathbf{0}\}$, the Dimension Theorem says this list is also a basis for $\mathbb{R}^n$: any $n$ linearly independent vectors in $\mathbb{R}^n$ span it. Therefore $U + U^\perp = \mathbb{R}^n$.

I'm not sure how rigorous this proof is; I like the one above better as well. I had forgotten this result and looked up a proof, but also wanted to find my own after reading Jonas'. This idea definitely works and I thought I'd share it as an alternative.
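The dimension count can be checked on a concrete example (my own numbers, chosen for illustration): in $\mathbb{R}^3$ take $U = \text{span}\{(1,1,0)\}$, so $U^\perp = \text{span}\{(1,-1,0), (0,0,1)\}$. The dimensions add to $3$, and the combined basis vectors have nonzero determinant, hence are linearly independent and span $\mathbb{R}^3$:

```python
basis_U = [[1.0, 1.0, 0.0]]                        # dim U = 1
basis_Uperp = [[1.0, -1.0, 0.0], [0.0, 0.0, 1.0]]  # dim U^perp = 2

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

combined = basis_U + basis_Uperp
print(len(combined))      # → 3, matching dim U + dim U^perp = n (rank-nullity)
print(det3(combined))     # → -2.0, nonzero: the 3 vectors are a basis of R^3
```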

Goob
  • 391