
Let $q:\mathbb{R}^n\to\mathbb{R}$ be a quadratic form: $$q(x_1,\dots,x_n)=\sum_{i=1}^{n} x_i^2+\sum_{1\leq i < j \leq n} x_i x_j$$

I must find the diagonal form of $q$.

My attempt: I tried rewriting $q$ so that the cross terms are split symmetrically:

$$q(x_1,\dots,x_n)=\sum_{i=1}^{n} x_i^2+\sum_{i \neq j} \frac{1}{2} x_i x_j$$

Then the matrix $[q]_e$ (in the standard basis):

$$[q]_e=\begin{pmatrix} 1 & \frac{1}{2} & \frac{1}{2} & \dots & \frac{1}{2}\\ \frac{1}{2} & 1 & \frac{1}{2} & \dots & \frac{1}{2}\\ \vdots & & \ddots & & \vdots\\ \frac{1}{2} & \frac{1}{2} & \dots & \frac{1}{2} & 1 \end{pmatrix}$$

Then what I typically do is perform the same operations on the rows and the columns to obtain a congruent diagonal matrix. However, it gets very messy here and I fail to recognize any pattern. I also tried working with the polynomial itself using Lagrange's method, but again it got very messy. Maybe I'm missing something simple? Any hints?
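A minimal numerical sanity check (mine, not part of the original post) that the symmetrized matrix above really represents $q$. It uses the observation that $[q]_e=\frac12(I+J)$, where $J$ is the all-ones matrix:

```python
import numpy as np

n = 5
# Matrix of q in the standard basis: 1 on the diagonal, 1/2 off it,
# i.e. A = (I + J)/2 with J the all-ones matrix.
A = (np.eye(n) + np.ones((n, n))) / 2

rng = np.random.default_rng(0)
x = rng.standard_normal(n)

# q(x) computed straight from the polynomial definition.
q_direct = np.sum(x**2) + sum(x[i] * x[j]
                              for i in range(n) for j in range(i + 1, n))

assert np.isclose(x @ A @ x, q_direct)
```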

  • You are making progress. Given the symmetric matrix representation, you should be able to recognize the eigenvalues of the matrix, and these in turn will give "the diagonal form of $q$". Finding the eigenvalues of a matrix like $[q]_e$ has been addressed in several previous Questions. – hardmath May 14 '15 at 18:53 (a numerical check of this eigenvalue route appears after these comments)
  • @hardmath - I haven't learned that method yet. I only know about Lagrange method and making row-column operations on a symmetric matrix. – user239753 May 14 '15 at 18:59
  • @hardmath - as I understand it, you propose diagonalizing the matrix via similarity. Even if I do that and obtain the matrix $[q]_w$ (where $w$ is the new basis), I must somehow know the new basis (in order to write the new representation of $q$ explicitly, as a polynomial). When diagonalizing via congruence, if you perform the same operations on $I$ (as you do on $[q]_e$) you obtain the so-called "change of basis matrix" (or rather its transpose), which allows you to actually retrieve $w$. In the "usual" diagonalization, how would one obtain $w$? – user239753 May 14 '15 at 19:12
  • In the case of a real symmetric matrix, there will be an orthogonal basis of eigenvectors, and the change of basis corresponds to the similarity matrix. In this particular case the change of variables is actually simplified because all but one of the eigenvalues are the same, so it only really matters to identify the eigenvector for the odd eigenvalue. – hardmath May 14 '15 at 19:14
  • Here's an "abstract duplicate" that links together many variations on this sort of problem – hardmath May 14 '15 at 19:36
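A quick numerical sketch of the eigenvalue route suggested in the comments above (mine, not from the thread). Since $[q]_e=\frac12(I+J)$, and the all-ones matrix $J$ has eigenvalue $n$ with eigenvector $(1,\dots,1)$ and eigenvalue $0$ with multiplicity $n-1$, the eigenvalues of $[q]_e$ should be $\frac{n+1}{2}$ once and $\frac12$ repeated $n-1$ times:

```python
import numpy as np

n = 5
A = (np.eye(n) + np.ones((n, n))) / 2

# eigh is for symmetric matrices; eigenvalues come back in ascending order.
vals, vecs = np.linalg.eigh(A)

assert np.allclose(vals[:-1], 0.5)            # 1/2, repeated n-1 times
assert np.isclose(vals[-1], (n + 1) / 2)      # (n+1)/2, once

# The "odd" eigenvalue belongs to the all-ones direction.
assert np.allclose(np.abs(vecs[:, -1]), 1 / np.sqrt(n))
```

In the orthonormal eigenbasis this gives the diagonal form $q=\frac{n+1}{2}\,y_1^2+\frac12\left(y_2^2+\cdots+y_n^2\right)$.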

1 Answer


Hint: Use Gauß's method to write the quadratic form as a sum of squares of linear forms, with coefficients $\pm1$.

– Bernard
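A sketch illustrating the hint (mine, not part of Bernard's answer): the symmetric row-and-column reduction the asker describes amounts to an $LDL^\top$ factorization of $[q]_e$, which sympy can compute exactly. For small $n$ the pivots follow the pattern $d_k=\frac{k+1}{2k}$; they are all positive, so rescaling each square by $\sqrt{d_k}$ makes every coefficient $+1$, consistent with the hint:

```python
from sympy import Rational, eye, ones

n = 4
A = (eye(n) + ones(n, n)) / 2   # matrix of q: 1 on the diagonal, 1/2 elsewhere

# A = L*D*L.T with L unit lower triangular -- exactly what the
# symmetric row-and-column operations (congruence) produce.
L, D = A.LDLdecomposition()

assert A == L * D * L.T
# Pivots: 1, 3/4, 2/3, 5/8, ...  i.e. d_k = (k+1)/(2k).
assert list(D.diagonal()) == [Rational(k + 1, 2 * k) for k in range(1, n + 1)]
```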