4

I am studying some linear algebra and I've learned an algorithm to transform a symmetric bilinear form into a diagonal matrix, i.e., to write it as a sum of squares. To do this, I use a kind of Gram-Schmidt process. Concretely:

Take a basis $\{e_1,\ldots,e_n\}$ for a vector space $V$, and consider a symmetric bilinear form $f : V\times V \to \mathbb R$. How does one find a basis in which $f$ is a sum of squares?

What I've learned is:

Take $e_1$ and set:

$v_1 = e_1.$

Then let

$$v_2 = e_2 - \frac{f(v_1,e_2)}{f(v_1,v_1)}v_1,$$

$$v_3 = e_3 - \frac{f(v_1,e_3)}{f(v_1,v_1)}v_1 - \frac{f(v_2,e_3)}{f(v_2,v_2)}v_2$$

and proceed in the same way (assuming each $f(v_i,v_i)\neq 0$). Then we get the desired basis.
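The steps above can be sketched numerically. This is my own illustration, assuming $f$ is given by a symmetric matrix $A$ via $f(x,y)=x^\top A y$ and that no denominator $f(v_i,v_i)$ vanishes (the degenerate case needs extra care):

```python
import numpy as np

def diagonalize_form(A):
    """Gram-Schmidt-style diagonalization of the bilinear form f(x,y) = x^T A y."""
    n = A.shape[0]
    f = lambda x, y: x @ A @ y
    vs = []
    for j in range(n):
        v = np.zeros(n)
        v[j] = 1.0                            # start from the basis vector e_j
        for w in vs:
            v = v - (f(w, v) / f(w, w)) * w   # subtract the f-projection onto w
        vs.append(v)
    P = np.column_stack(vs)                   # columns are the new basis vectors
    return P, P.T @ A @ P                     # P^T A P is diagonal

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
P, D = diagonalize_form(A)                    # D comes out as diag(2, 2.5)
```

In the new basis the off-diagonal entries of $P^\top A P$ vanish, so $f$ becomes a weighted sum of squares of the new coordinates.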

But my question is: how does one carry out this procedure using only the matrix of $f$? Is it equivalent to finding a basis of eigenvectors?

  • No. See http://math.stackexchange.com/questions/395634/given-a-4-times-4-symmetric-matrix-is-there-an-efficient-way-to-find-its-eige/1392600#1392600 – Will Jagy Jul 07 '16 at 20:52
  • @WillJagy Isn't it? He asked about eigenvectors, not eigenvalues. – Llohann Jul 07 '16 at 21:55
  • @Llohann I'm not sure what he is asking. I wish he would give a matrix together with the kind of answer he wants, then another matrix where he has trouble achieving that. He says he wants to do, well, something using a matrix. Then he asks whether that something is the same as finding eigenvectors; if he wants to solve $P^T A P = D$ with $D$ diagonal, it is not the same thing. If he wants to copy the Gram-Schmidt process using just the matrix.... – Will Jagy Jul 07 '16 at 22:01
  • I probably have some idea of what he wants. I will post some answer and let us see. – Llohann Jul 07 '16 at 22:08
  • @WillJagy, would you please see my comment at Llohann's answer? I think it will make things clearer. – L.F. Cavenaghi Jul 07 '16 at 23:50

1 Answer

2

Leonardo,

As Will Jagy's comment points out, one possibly subtle point in your question is 'the matrix of $f$': given a square matrix $A$ on $\mathbb R^n$ (let us stay over the reals for simplicity), you get a bilinear form $$f_A(x,y)=\langle Ax,y\rangle,$$ where $\langle\cdot,\cdot\rangle$ is the standard Euclidean product on $\mathbb R^n$. Conversely, given a symmetric bilinear form $f:\mathbb R^n\times\mathbb R^n\to \mathbb R$, one gets a unique (symmetric) matrix $A=\{a_{ij}\}$ by setting $$a_{ij}=f(e_i,e_j)$$ ($\{e_i\}$ being the standard basis of $\mathbb R^n$). In this case it is easy to see that $f=f_A$. In particular, if $\{v_i\}$ is a set of orthonormal eigenvectors of $A$, $Av_i=\lambda_i v_i$, then $$f(v_i,v_j)=\langle Av_i,v_j\rangle=\lambda_i\langle v_i,v_j\rangle=\lambda_i\delta_{ij}.$$
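The last identity can be checked numerically. A small sketch with numpy (the symmetric example matrix is my own choice): orthonormal eigenvectors of $A$ diagonalize $f_A$, with the eigenvalues on the diagonal.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)     # columns of V: orthonormal eigenvectors of A
# Collect f_A(v_i, v_j) = <A v_i, v_j> into a matrix: this is V^T A V
F = V.T @ A @ V
# F equals diag(eigvals), matching f(v_i, v_j) = lambda_i * delta_ij
```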

When you proceed as in Gram-Schmidt, you are actually treating $f$ as a (possibly indefinite or degenerate) inner product.
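In matrix terms, this Gram-Schmidt process amounts to congruence diagonalization, $P^\top A P = D$: eliminate each pivot's row and column with the same operation, accumulating the column operations into $P$. This is a sketch of a standard equivalent formulation (not necessarily what your course covers), assuming all pivots are nonzero:

```python
import numpy as np

def congruence_diagonalize(A):
    """Symmetric elimination on A alone, producing P with P^T A P diagonal."""
    n = A.shape[0]
    D = A.astype(float).copy()
    P = np.eye(n)
    for k in range(n):
        for i in range(k + 1, n):
            c = D[i, k] / D[k, k]     # assumes pivot D[k, k] != 0
            D[i, :] -= c * D[k, :]    # row operation
            D[:, i] -= c * D[:, k]    # matching column operation
            P[:, i] -= c * P[:, k]    # record the column operation in P
    return P, D

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
P, D = congruence_diagonalize(A)      # columns of P reproduce the v_i above
```

Note that $D$ here is generally *not* the diagonal of eigenvalues: congruence ($P^\top A P$) preserves only the signs of the diagonal entries (Sylvester's law of inertia), whereas orthogonal diagonalization ($P^{-1}AP$ with $P$ orthogonal) yields the eigenvalues.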

Does this answer your question? I can't tell what you mean by a sum of squares.

Llohann
  • thank you! What I am asking is: is the procedure I stated the correct one to diagonalize the matrix of a bilinear form? Or are there calculations involving only the matrix $A$, such as taking determinants, etc.? In practice, what is the way to diagonalize a bilinear form? – L.F. Cavenaghi Jul 07 '16 at 23:49
  • @LeonardoFranciscoCavenaghi you really should take a careful look at http://math.stackexchange.com/questions/395634/given-a-4-times-4-symmetric-matrix-is-there-an-efficient-way-to-find-its-eige/1392600#1392600 which shows what to do with a symmetric matrix on its own merits. After that you can probably make an understandable question. – Will Jagy Jul 08 '16 at 17:23