
Let $V$ be a vector space of dimension $n$ and let $b\colon V \times V \to \mathbb{R}$ be a symmetric bilinear form.

Sylvester's theorem says that there exists a basis of $V$ with respect to which the matrix of $b$ is a diagonal matrix with only $1$, $-1$ and $0$ on the diagonal.

My question is: in order to apply Sylvester's argument, do I have to start from a matrix that is already diagonal (obtained by other means, for instance Lagrange's method), and only then find the basis that gives just $\pm 1$ and $0$ on the diagonal?
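
For concreteness, the kind of final step I have in mind is the following (my own toy example, assuming the form has already been diagonalised):

$$D=\begin{pmatrix}4&0\\0&-9\end{pmatrix},\qquad Q=\begin{pmatrix}1/2&0\\0&1/3\end{pmatrix},\qquad Q^{T}DQ=\begin{pmatrix}1&0\\0&-1\end{pmatrix}.$$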

M. Vinay
  • 9,004
  • Hint: $M_{ij} = b(e_i, e_j)$. What can you say about $M$? – reuns May 30 '16 at 08:59
  • $M$ is a symmetric matrix, so $M$ is congruent to a diagonal matrix $D$; in other words, there exists a matrix $P$ such that $P^TAP=D$. My question is: how do I find the matrix $D$ and the matrix $P$? – user343294 May 30 '16 at 09:09
  • it is the spectral theorem, or the SVD – reuns May 30 '16 at 09:11
  • Sorry... what do you mean by SVD? – user343294 May 30 '16 at 09:14
  • Maybe I understood something... From the spectral theorem I know that the matrix $P$ is orthogonal, which means that $P^T=P^{-1}$; so to find the matrix $D$ it is enough to find the eigenvalues of $A$, and to find the matrix $P$ it is enough to find the (normalised) eigenvectors of $A$... Is that right? – user343294 May 30 '16 at 09:16
  • Yes; then you have to answer the main question of the exercise. – reuns May 30 '16 at 09:29
  • So... in order to find the Sylvester basis I have to find first the matrix $D$ through the eigenvalues and the matrix $P$ through the eigenvectors, and after that I can transform the matrix $D$ into Sylvester's form... Is this correct? Thank you very much! – user343294 May 30 '16 at 10:18
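
A minimal sketch of the eigenvalue route discussed in the comments above (an illustration added here, not code from the thread): NumPy's `eigh` gives the orthogonal diagonalisation, and rescaling each eigenvector column by $1/\sqrt{|\lambda|}$ then lands on diagonal entries $1$, $-1$, $0$.

```python
import numpy as np

# Illustration: orthogonally diagonalise A via the spectral theorem, then
# rescale each eigenvector column by 1/sqrt(|eigenvalue|) so that the
# diagonal entries of P.T @ A @ P become +1, -1 or 0.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])            # matrix of the bilinear form in some basis

eigvals, eigvecs = np.linalg.eigh(A)  # valid because A is symmetric
scale = np.array([1.0 / np.sqrt(abs(l)) if abs(l) > 1e-12 else 1.0
                  for l in eigvals])
P = eigvecs * scale                   # multiplies column j by scale[j]

D = P.T @ A @ P
print(np.round(D, 10))                # diag(-1, 1): signature (1, 1)
```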

1 Answer


Existence in this case is guaranteed by theorems. In case you wish to perform the task for an actual matrix...

It is not necessary to use eigenvalues in this case. Sylvester's Law of Inertia concerns this: $P^T AP = D,$ where $A$ is symmetric, $P$ is nonsingular (but NOT necessarily orthogonal) and $D$ is diagonal. This relationship is often called "congruence," especially if we arrange that the determinant of $P$ be equal to one.
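
For instance (a small worked example of such a congruence, added for illustration; it is not in the original answer):

$$A=\begin{pmatrix}1&2\\2&1\end{pmatrix},\qquad P=\begin{pmatrix}1&-2\\0&1\end{pmatrix},\qquad P^{T}AP=\begin{pmatrix}1&0\\0&-3\end{pmatrix}=D.$$

Here $P$ is nonsingular but not orthogonal, and rescaling the second basis vector by $1/\sqrt{3}$ turns the $-3$ into $-1$.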

The easiest method is essentially "completing the square," associated with Hermite more or less. There is a method that is entirely cookbook; I collected several examples on this site (see the links and the short code sketch below). Note that I did eventually find a recent book that has this, called Linear Algebra Done Wrong by Treil. Oh: at the end, you have a diagonal matrix $D$. One further step: for each nonzero diagonal entry $\lambda$ of $D$, make another diagonal matrix $Q$ whose matching entry is $1/\sqrt{|\lambda|}$ (and $1$ elsewhere). Then $Q^T D Q = Q D Q$ has diagonal entries all $1$, $0$, $-1$.

  • reference for linear algebra books that teach reverse Hermite method for symmetric matrices
  • Bilinear Form Diagonalisation
  • Given a $4\times 4$ symmetric matrix, is there an efficient way to find its eigenvalues and diagonalize it?
  • Find the transitional matrix that would transform this form to a diagonal form.
  • Writing an expression as a sum of squares
  • Determining matrix $A$ and $B$, rectangular matrix
  • Method of completing squares with 3 variables
  • Using Lagrange's diagonalization on degenerate linear forms
  • a question about a canonical form of a quadratic form using Gauss theorem
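
To make the cookbook concrete, here is a minimal sketch (my own, not taken from any of the linked posts) of the row-and-column reduction: every column operation is immediately matched by the same row operation, the operations are accumulated in $P$, and the nonzero pivots are rescaled by $1/\sqrt{|\lambda|}$ at the end.

```python
import numpy as np

def sylvester_form(A, tol=1e-12):
    """Return (P, D) with P nonsingular and D = P.T @ A @ P diagonal,
    diagonal entries in {+1, -1, 0}, using symmetric row-and-column reduction."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    assert np.allclose(A, A.T), "A must be symmetric"
    P = np.eye(n)

    for i in range(n):
        # If the pivot is zero, create a nonzero one by a congruence move.
        if abs(A[i, i]) < tol:
            # Prefer swapping with a later index whose diagonal entry is nonzero.
            j = next((k for k in range(i + 1, n) if abs(A[k, k]) > tol), None)
            if j is not None:
                A[:, [i, j]] = A[:, [j, i]]
                A[[i, j], :] = A[[j, i], :]
                P[:, [i, j]] = P[:, [j, i]]
            else:
                # All later diagonal entries vanish; use an off-diagonal entry:
                # adding column j to column i (and row j to row i) makes A[i,i] = 2*A[i,j].
                j = next((k for k in range(i + 1, n) if abs(A[i, k]) > tol), None)
                if j is None:
                    continue  # row/column i is entirely zero from position i on
                A[:, i] += A[:, j]
                A[i, :] += A[j, :]
                P[:, i] += P[:, j]
        # Clear the rest of row i and column i (same operation on rows and columns).
        for j in range(i + 1, n):
            c = A[j, i] / A[i, i]
            A[:, j] -= c * A[:, i]
            A[j, :] -= c * A[i, :]
            P[:, j] -= c * P[:, i]

    # Rescale each nonzero pivot lambda by 1/sqrt(|lambda|) to reach +1 or -1.
    for i in range(n):
        d = A[i, i]
        if abs(d) > tol:
            s = 1.0 / np.sqrt(abs(d))
            A[:, i] *= s
            A[i, :] *= s
            P[:, i] *= s
    return P, A  # A now holds D = P.T @ (original A) @ P

# Example: the form with matrix [[1, 2], [2, 1]] (i.e. x^2 + 4xy + y^2).
A0 = [[1.0, 2.0], [2.0, 1.0]]
P, D = sylvester_form(A0)
print(np.round(D, 10))                       # diag(1, -1)
print(np.round(P.T @ np.array(A0) @ P, 10))  # the same matrix, as a check
```

On the example above this returns $P^T A P = \operatorname{diag}(1,-1)$; the elimination steps alone would give $\operatorname{diag}(1,-3)$ before the final rescaling.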

Will Jagy
  • 139,541