
In the appendix of a book on linear and nonlinear optimization, I saw the following statement on symmetric matrices:

[Image: excerpt from the book's appendix listing properties of symmetric matrices; the third statement asserts that $E^n$ has an orthogonal basis consisting of eigenvectors of $A$.]

In these notes, $A$ is an $n \times n$ square matrix and $E^n$ is the $n$-dimensional Euclidean space. I already know that repeated eigenvalues of a symmetric matrix correspond to linearly independent eigenvectors, but these eigenvectors are not necessarily orthogonal. How can the third statement be correct, then, if $A$ has repeated eigenvalues? I think this is a misstatement, but I wonder if there is something I am missing.

  • $I$ has all repeated eigenvalues, but $\mathbb R^n$ still has an orthogonal basis containing only eigenvectors of $I$... – 5xum Apr 06 '18 at 10:46
  • Then are the eigenvectors corresponding to repeated eigenvalues orthogonal as well, for a symmetric matrix? – Ufuk Can Bicici Apr 06 '18 at 10:57
  • Ah ok, the eigenvectors for the same eigenvalue are linearly independent and constitute a subspace whose dimension equals the eigenvalue's multiplicity. We can pick orthogonal vectors in this subspace such that they are still eigenvectors of the matrix $A$. – Ufuk Can Bicici Apr 06 '18 at 11:04
  • Exactly! And the subspace is orthogonal to any of the eigenvectors belonging to other eigenvalues. – 5xum Apr 06 '18 at 11:17
  • Related: https://math.stackexchange.com/questions/482599/why-are-real-symmetric-matrices-diagonalizable – Hans Lundmark Apr 19 '21 at 06:40
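
A quick numerical check of the point made in the comments above (a minimal sketch, assuming NumPy; the matrix is an illustrative choice, not from the book): `numpy.linalg.eigh` returns an orthonormal set of eigenvectors for a symmetric matrix even when an eigenvalue repeats.

```python
import numpy as np

# Symmetric matrix with a repeated eigenvalue (eigenvalues are 1, 3, 3).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh returns eigenvalues in ascending order and orthonormal eigenvectors
# as the columns of Q, even for the repeated eigenvalue 3.
w, Q = np.linalg.eigh(A)

print(w)                                   # [1. 3. 3.]
print(np.allclose(Q.T @ Q, np.eye(3)))     # True: columns are orthonormal
print(np.allclose(A @ Q, Q @ np.diag(w)))  # True: columns are eigenvectors
```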

1 Answer


I understood what I was missing, so I am writing a self-answer. Let an eigenvalue $\lambda$ of a symmetric matrix $A$ have multiplicity $m$. We have $m$ linearly independent eigenvectors $x_1, \dots, x_m$ corresponding to this eigenvalue. In the subspace spanned by $x_1, \dots, x_m$ we can produce orthogonal vectors $v_1, \dots, v_m$, each a linear combination of $x_1, \dots, x_m$ (by Gram–Schmidt orthogonalization, for example). These $v_1, \dots, v_m$ are still eigenvectors of $A$: pick any $v_i$; then $v_i = \sum_{j=1}^{m} a_j x_j$ with not all $a_j = 0$, and $Av_i = A\left(\sum_{j=1}^{m} a_j x_j\right) = \sum_{j=1}^{m} a_j A x_j = \sum_{j=1}^{m} a_j \lambda x_j = \lambda v_i$. Since eigenvectors of a symmetric matrix that belong to distinct eigenvalues are automatically orthogonal, repeating this within each eigenspace yields an orthogonal basis of $E^n$ consisting of eigenvectors of $A$.
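
A minimal sketch of this construction, assuming NumPy and the same illustrative matrix as in the earlier snippet (not taken from the book): two linearly independent but non-orthogonal eigenvectors of the repeated eigenvalue are orthogonalized with Gram–Schmidt and remain eigenvectors of $A$.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])   # eigenvalue 3 has multiplicity 2
lam = 3.0

# Two linearly independent, but NOT orthogonal, eigenvectors for lambda = 3.
x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, 1.0, 1.0])
assert np.allclose(A @ x1, lam * x1) and np.allclose(A @ x2, lam * x2)

# Gram-Schmidt within the eigenspace: v1, v2 are linear combinations of x1, x2.
v1 = x1
v2 = x2 - (x2 @ v1) / (v1 @ v1) * v1

print(np.isclose(v1 @ v2, 0.0))        # True: orthogonal
print(np.allclose(A @ v1, lam * v1))   # True: still an eigenvector
print(np.allclose(A @ v2, lam * v2))   # True: still an eigenvector
```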