
I'm confused about orthogonal bases with respect to a symmetric bilinear form $\phi$.

Consider the quadratic form

$Q: \mathbb{R}^4 \rightarrow \mathbb{R}$ defined as:

$Q(\vec{x})=-x_2^2+2x_1x_3+x_4^2$

which is associated with the symmetric bilinear form $\phi$ defined by:

$\phi(\vec{x},\vec{y})=-x_2y_2+x_4y_4+x_1y_3+x_3y_1$

Then consider the subspace $W= \Big\{ (x_1,x_2,x_3,x_4) \in \mathbb{R}^4 \mid x_4=2x_1-x_2+4x_3=0 \Big\} = \mathscr{L}\big((1,2,0,0),(0,4,1,0)\big)$

Determine a basis $\mathscr{C}=(\vec{c_1},\vec{c_2})$ of $W$ orthogonal with respect to $\phi$.

(The answer is $\mathscr{C}=((1,2,0,0),(-\frac{7}{4},\frac{1}{2},1,0))$)

I started by taking $\vec{c_1}=(1,2,0,0)$, but then I don't know how to determine a second vector of $W$ that is orthogonal to it.

My idea was to apply the Gram-Schmidt process using $\phi$ instead of the standard dot product, but it doesn't work and I can't understand why!

In general what is the method to determine an orthogonal basis of a subspace with respect to $\phi$?

Thanks a lot for your help

Gianolepo
  • You say that the Gram-Schmidt process didn’t work. I’d be interested in seeing the details, since it worked for me, so it’s likely you made a mistake somewhere along the way. – amd Dec 08 '15 at 19:43

2 Answers


Just use the definitions. You are looking for a vector

$$ \vec{c}_2 = t(1,2,0,0) + s(0,4,1,0) $$

such that

$$\begin{align} 0 = \phi(\vec{c}_1, \vec{c}_2) &= \phi\big((1,2,0,0),\, t(1,2,0,0) + s(0,4,1,0)\big) \\ &= t\,\phi\big((1,2,0,0),(1,2,0,0)\big) + s\,\phi\big((1,2,0,0),(0,4,1,0)\big) \\ &= -4t - 7s. \end{align}$$

Hence, $4t + 7s = 0$. Taking $s = 1$, we get $t = -\frac{7}{4}$ and

$$ \vec{c}_2 = \left( -\frac{7}{4}, -\frac{7}{2}, 0, 0 \right) + (0,4,1,0) = \left( -\frac{7}{4}, \frac{1}{2}, 1, 0 \right). $$
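As a quick sanity check (an illustration added here, not part of the original answer): writing $\phi(\vec{x},\vec{y}) = \vec{x}^{\,T} \Phi\, \vec{y}$, where $\Phi$ is the matrix of the form in the standard basis, the orthogonality can be verified numerically. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Matrix of phi in the standard basis of R^4:
# phi(x, y) = -x2*y2 + x4*y4 + x1*y3 + x3*y1 = x^T Phi y
Phi = np.array([[0,  0, 1, 0],
                [0, -1, 0, 0],
                [1,  0, 0, 0],
                [0,  0, 0, 1]])

c1 = np.array([1.0, 2.0, 0.0, 0.0])
c2 = np.array([-7/4, 1/2, 1.0, 0.0])

print(c1 @ Phi @ c2)  # 0.0  -> c1 and c2 are phi-orthogonal
print(c1 @ Phi @ c1)  # -4.0
print(c2 @ Phi @ c2)  # -3.75, i.e. -15/4
```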

In general, there is an algorithm that, given a symmetric bilinear form on $\mathbb{F}^n$ (with $\mathrm{char}\, \mathbb{F} \neq 2$), generates an orthogonal basis with respect to the form. It works by applying simultaneous row and column operations to the symmetric matrix representing the bilinear form and is, in some sense, a generalization of the usual Gram-Schmidt procedure. However, since in your case you are working with a two-dimensional space, that would be overkill.
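To make that algorithm concrete, here is a minimal sketch of diagonalization by congruence via simultaneous row and column operations (an illustration added here, not part of the original answer; it assumes every pivot it meets is nonzero, so it omits the repair step needed when a diagonal entry vanishes):

```python
import numpy as np

def congruence_diagonalize(A):
    """Return (P, D) with P invertible and P.T @ A @ P = D diagonal.
    Sketch only: assumes each pivot D[i, i] is nonzero when it is needed."""
    D = A.astype(float).copy()
    n = D.shape[0]
    P = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            if D[j, i] == 0:
                continue
            m = D[j, i] / D[i, i]
            D[j, :] -= m * D[i, :]   # row operation R_j -> R_j - m*R_i ...
            D[:, j] -= m * D[:, i]   # ... and the matching column operation
            P[:, j] -= m * P[:, i]   # accumulate the column operation in P
    return P, D

# Gram matrix of phi restricted to W in the basis ((1,2,0,0), (0,4,1,0)):
A = np.array([[-4.0, -7.0],
              [-7.0, -16.0]])
P, D = congruence_diagonalize(A)
print(P)  # second column (-7/4, 1): coordinates of c_2 in that basis
print(D)  # diag(-4, -15/4)
```

The second column $\left(-\frac74, 1\right)$ of $P$ encodes $-\frac74 \cdot (1,2,0,0) + (0,4,1,0) = \left(-\frac74, \frac12, 1, 0\right)$, which matches $\vec{c}_2$ above.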

levap
  • To see an example of the algorithm in action, read here: http://math.stackexchange.com/questions/329304/bilinear-form-diagonalisation – levap Dec 08 '15 at 16:47
  • Thanks so much for the answer! I'm used to finding an orthogonal basis with respect to $\phi$ using eigenvectors and the spectral theorem (so diagonalization by similarity); the problem here was that I had to find an orthogonal basis not of the vector space where $Q$ is defined (which is $\mathbb{R}^4$) but of a subspace of it. I read the question, but (from what I understood) it talks about finding a basis for the whole vector space (with diagonalization by congruence). What I cannot understand is how these processes (preferably diagonalization by similarity) can also be used for subspaces. – Gianolepo Dec 08 '15 at 17:19
  • 1
    Another way of solving this question (which is an overkill, but is worth to understand in order to apply in more complicated situations) is to write down the matrix $A$ representing the bilinear form $\phi$ restricted to $W$. It will be a $2 \times 2$ symmetric matrix with entries $\phi(\vec{v}_i, \vec{v}_j)$ where $B = (v_i)$ is some ordered basis for $W$. Then, you can diagonalize $A$ by similarity and write $P^T A P = D$. The columns of $P$ will then represent an orthogonal basis of $W$ - they will be the coordinate vectors with respect to the basis $B$ of a $\phi$-orthogonal basis. – levap Dec 08 '15 at 17:32
  • 1
    This is very similar to how one diagonalizes a linear operator $T \colon V \rightarrow V$. You can choose some arbitrary (but preferably as nice as possible) basis $B$ for $V$, write the matrix $A$ representing $T$ with respect to $B$, diagonalize $A$ and write $P^{-1}AP = D$. Then, the columns of $P$ will represent the eigenvectors of $T$ with respect to the basis $B$. – levap Dec 08 '15 at 17:34
  • Thanks again, very kind! So in this case I can take the matrix of the restriction of $\phi$ to $W$, $A=\begin{bmatrix} -4 & -7 \\ -7 & -16 \end{bmatrix}$, diagonalize it using the spectral theorem, and then the basis $B$ with respect to which $A$ is diagonal is a $\phi$-orthogonal basis of $W$. Is this right? (Of course your method was better in this case!) – Gianolepo Dec 08 '15 at 17:46
  • 3
    There are three notions you shouldn't confuse - orthogonal diagonalization, diagonalization and diagonalization by congruence. Diagonalizing $A$ means finding invertible $P$ such that $P^{-1}AP$ is diagonal. Orthogonally diagonalizing means finding orthogonal $O$ such that $O^{-1}AO = O^TAO$ is diagonal. Finally, diagonalizing $A$ by congruence means finding invertible $P$ such that $P^TAP$ is diagonal. The notion that is relevant in your case is diagonalization by congruence. – levap Dec 08 '15 at 17:54
  • 1
    Now, performing diagonalization by congruence is much easier than performing orthogonal diagonalizaion. You don't need to solve an equation to find the eigenvalues, then solve a system of linear equations to find a basis of eigenvectors. You just perform simultaneous row and column operations. You can diagonalize $A$ orthogonally if you want, it will work, but it is not necessary and in more complex cases, will be generally impossible to perform precisely (you won't even be able to find the roots of the characteristic polynomial to determine the eigenvalues). – levap Dec 08 '15 at 17:57
  • 1
    Finally, if $P^TAP = D$ is diagonal, then the columns of $P$ won't be an orthogonal basis for $W$. It doesn't make sense directly - the columns of $P$ are $2 \times 1$ vectors and elements in $W$ are $1 \times 4$ vectors. However, the columns of $P$ represent a basis of $W$. If the first column if $(3,4)$ then the corresponding vector in $W$ will be $3 \cdot (1,2,0,0) + 4(0,4,1,0)$. – levap Dec 08 '15 at 18:00
  • I'm sorry for the confusion; by "using the spectral theorem" I meant orthogonal diagonalization: I find an orthogonal $O$ such that $D=O^{-1}AO=O^TAO$ where $D$ is diagonal. That is the process I always use (and the only one that I actually know). – Gianolepo Dec 08 '15 at 18:06
  • Anyway, I think that's all clear now, thanks!! Indeed you are right: I tried to calculate the eigenvalues and eigenvectors of $A$ with Mathematica and very ugly numbers came out, so that's probably not the easiest way to solve the problem! – Gianolepo Dec 08 '15 at 18:10
  • One last thing, if I may: supposing I had to find an orthonormal basis, that would be impossible, since $Q$ is not positive definite and neither is its restriction to $W$, right? – Gianolepo Dec 08 '15 at 18:15

Gram-Schmidt using $\phi$ as the scalar product should have worked for you.

Let $v_1=u_1=(1,2,0,0)^T$ and $v_2=(0,4,1,0)^T$, so that $\phi(u_1,u_1)=-4$ and $\phi(u_1,v_2)=-7$. Then we have $$\begin{align} u_2 &= v_2-{\phi(u_1,v_2)\over\phi(u_1,u_1)}u_1 \\ &=(0,4,1,0)^T-{-7\over-4}(1,2,0,0)^T \\ &=\left(-\frac74,\frac12,1,0\right)^T. \end{align}$$
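For completeness, here is the same computation as a short script (an added illustration, not the answerer's code): Gram-Schmidt with $\phi(\vec x, \vec y) = \vec x^{\,T} \Phi\, \vec y$ in place of the dot product. It is a sketch only, assuming $\phi(u,u) \neq 0$ for each vector $u$ it produces (no handling of isotropic vectors):

```python
import numpy as np

def gram_schmidt_phi(vectors, Phi):
    """Gram-Schmidt with phi(x, y) = x @ Phi @ y instead of the dot product.
    Sketch only: assumes phi(u, u) != 0 for every vector u produced."""
    basis = []
    for v in vectors:
        u = v.astype(float)
        for b in basis:
            u -= (b @ Phi @ u) / (b @ Phi @ b) * b  # subtract phi-projection onto b
        basis.append(u)
    return basis

# phi(x, y) = -x2*y2 + x4*y4 + x1*y3 + x3*y1 as a matrix:
Phi = np.array([[0,  0, 1, 0],
                [0, -1, 0, 0],
                [1,  0, 0, 0],
                [0,  0, 0, 1]])

v1 = np.array([1, 2, 0, 0])
v2 = np.array([0, 4, 1, 0])
print(gram_schmidt_phi([v1, v2], Phi))
# [array([1., 2., 0., 0.]), array([-1.75, 0.5, 1., 0.])]
```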

amd
  • I've done the same thing, but I tried to compute $\operatorname{vers}(v_1)=\frac{v_1}{||v_1||}$ with $||v_1||=\sqrt{\phi(v_1,v_1)}$, which is not defined here since $\phi(v_1,v_1)=-4<0$ ($\phi$ is indefinite). Doing it your way avoids the square root and it works, so that was the solution to the problem in the end, thanks! – Gianolepo Dec 08 '15 at 20:46