4

This is an old question, and the proof is here.

The proof there assumes distinct eigenvalues, each with its own eigenvector.

My question is about repeated roots: how can one guarantee that there is not just a single independent eigenvector, so that the eigenvectors can still form an orthogonal basis of the vector space?

sleeve chen
  • 8,281
  • I don't understand your question. However, on the matter of eigenvalues not being distinct, eigenvectors with the same eigenvalue are certainly not always orthogonal. If you have two orthogonal eigenvectors with the same eigenvalue, then every linear combination of them is another eigenvector with that same eigenvalue, and is not generally orthogonal to the two you started with. – Michael Hardy Jul 21 '15 at 18:17
  • @Michael Hardy My question is just to check whether geometric multiplicity can be less than algebraic multiplicity in the case of a symmetric matrix. – sleeve chen Jul 21 '15 at 21:42

3 Answers

8

There are really three things going on here:

  1. Eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
  2. A symmetric matrix is diagonalizable whether it has distinct eigenvalues or not. @A.G. proved this just fine already.
  3. Given a subspace whose dimension is greater than $1$, one can choose a basis of the subspace consisting of orthogonal elements. This is usually proven constructively by applying Gram-Schmidt.

Thus, it is not the case that all pairs of non-parallel eigenvectors of every symmetric matrix are orthogonal to each other. Rather, one can choose an orthogonal basis such that the matrix is diagonal in that basis. Nonetheless, for a symmetric matrix with a repeated eigenvalue, one can also choose a non-orthogonal basis such that the matrix is diagonal in that basis.
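
For a concrete check of all three points, here is a minimal numerical sketch (assuming NumPy; the particular matrix and vectors are chosen only for illustration): a symmetric matrix whose eigenvalue $1$ is repeated, two non-orthogonal eigenvectors for that eigenvalue, and an orthonormal eigenbasis as returned by `np.linalg.eigh`.

```python
# Minimal sketch (assumes NumPy; example matrix/vectors are illustrative only):
# a symmetric matrix with a repeated eigenvalue, two non-orthogonal eigenvectors
# for that eigenvalue, and nonetheless an orthonormal eigenbasis.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])              # eigenvalues: 1 (twice) and 3

v1 = np.array([1.0, -1.0, 0.0])              # eigenvector for eigenvalue 1
v2 = np.array([1.0, -1.0, 1.0])              # another eigenvector for eigenvalue 1
print(np.allclose(A @ v1, v1), np.allclose(A @ v2, v2))  # True True
print(v1 @ v2)                               # 2.0: non-parallel eigenvectors, not orthogonal

w, Q = np.linalg.eigh(A)                     # symmetric eigendecomposition
print(w)                                     # [1. 1. 3.]
print(np.allclose(Q.T @ Q, np.eye(3)))       # True: the chosen eigenbasis is orthonormal
print(np.allclose(Q @ np.diag(w) @ Q.T, A))  # True: A is diagonal in that basis
```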

Ian
  • 101,645
4

Assume that for a symmetric matrix $A$ there exists a Jordan block of size greater than one for an eigenvalue $\lambda$. Then there exist at least two linearly independent generalized eigenvectors $x$ and $y$ with $By=x$ and $Bx=0$, where $B=A-\lambda I$. Now compute $x^TBy$ in two ways. Since $B$ is symmetric, $x^TBy=(Bx)^Ty=0^Ty=0$; on the other hand, $x^TBy=x^T(By)=x^Tx=\|x\|^2$. Hence $x=0$, which contradicts the linear independence of the vectors. Therefore all chains of generalized eigenvectors have length one, i.e. they are eigenvectors of $A$.

Addendum: As @Ian correctly noticed, one has to add to the proof that the basis of the corresponding eigen-subspace for $\lambda$ can be chosen orthogonal.
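
A minimal sketch of that orthogonalization step (assuming NumPy; the matrix and vectors are only illustrative): within the eigenspace of a repeated eigenvalue, Gram–Schmidt, here carried out via a QR factorization, yields an orthonormal basis whose vectors are still eigenvectors, because any linear combination stays inside the eigenspace.

```python
# Sketch of the addendum (assumes NumPy; example matrix/vectors are illustrative):
# orthogonalize a basis of the 2-dimensional eigenspace of the repeated eigenvalue 1.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])            # eigenvalue 1 has a 2-dimensional eigenspace

V = np.column_stack(([1.0, -1.0, 0.0],     # two independent, non-orthogonal
                     [1.0, -1.0, 1.0]))    # eigenvectors for eigenvalue 1

Q, _ = np.linalg.qr(V)                     # Gram-Schmidt in matrix form
print(np.allclose(Q.T @ Q, np.eye(2)))     # True: the new basis is orthonormal
print(np.allclose(A @ Q, Q))               # True: its columns are still eigenvectors (A q = 1·q)
```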

A.Γ.
  • 29,518
  • I honestly don't see what this has to do with the question. It seems to be a (correct) proof that a symmetric matrix is diagonalizable, but to say nothing about orthogonality. – Ian Jul 21 '15 at 17:11
  • @Ian Sorry, I forgot to mention that one can orthogonalize within the corresponding eigen-subspace. – A.Γ. Jul 21 '15 at 17:20
3

An alternative approach to the proof (not using the inner-product method in the question you reference) is to use Schur's Theorem.

Schur's Theorem: Every square matrix $A$ has a factorization of the form $A=QTQ^{\ast}$ where $Q$ is a unitary matrix and $T$ is upper triangular.

Then, if $A$ is symmetric, $T$ must also be symmetric (and hence diagonal). The columns of $Q$ are the eigenvectors of $A$ (easy to check), $T$ contains the eigenvalues (easy to check), and since $Q$ is unitary, all the columns are orthonormal.
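
As a quick numerical illustration of this (a sketch assuming NumPy and SciPy; the random symmetric matrix is just an example), `scipy.linalg.schur` returns a diagonal $T$ and a $Q$ with orthonormal columns when the input is symmetric:

```python
# Sketch (assumes NumPy/SciPy; the random symmetric matrix is only an example):
# for symmetric A, the Schur factor T is diagonal and Q has orthonormal columns.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                                  # a random symmetric matrix

T, Q = schur(A)                              # A = Q T Q^T with Q orthogonal
print(np.allclose(T, np.diag(np.diag(T))))   # True: T is diagonal, so it holds the eigenvalues
print(np.allclose(Q.T @ Q, np.eye(4)))       # True: columns of Q are orthonormal
print(np.allclose(A @ Q, Q @ T))             # True: each column of Q is an eigenvector of A
print(np.allclose(Q @ T @ Q.T, A))           # True: the factorization reconstructs A
```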

TravisJ
  • 7,426
  • If you want a reference, I have on my desk: "Numerical Linear Algebra" by Trefethen and Bau (published by SIAM). Lecture 24 covers eigenvalue problems and has this result. – TravisJ Jul 21 '15 at 17:13