The spectral theorem for Hermitian matrices states that, for an $n\times n$ Hermitian matrix $A$: a) all eigenvalues are real, b) eigenvectors corresponding to distinct eigenvalues are orthogonal, and c) there exists an orthogonal basis of the whole space consisting of eigenvectors.
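(For what it is worth, the statement is easy to check numerically; the sketch below uses numpy on an arbitrary matrix I made up, just to illustrate what a), b) and c) promise.)

```python
import numpy as np

# An arbitrary Hermitian example: A = (M + M*)/2 for a random complex matrix M.
rng = np.random.default_rng(42)
M = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
A = (M + M.conj().T) / 2

w, V = np.linalg.eig(A)                          # generic eigensolver, symmetry not used
print(np.allclose(w.imag, 0))                    # a) True: the eigenvalues are real
print(np.allclose(V.conj().T @ V, np.eye(5)))    # b)+c) True: here all eigenvalues are
                                                 # distinct, so the unit eigenvectors are
                                                 # mutually orthogonal and form a basis
```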
I can prove a) and b), and I can prove that there exists a basis of the whole space consisting of eigenvectors; I have even added a proof of the latter to this old post. But something is missing: the proof of orthogonality covers only the case of distinct eigenvalues, not the case where several eigenvectors correspond to the same eigenvalue.
I guess that if we apply Gram–Schmidt we get an orthonormal basis, but the new vectors will not necessarily be eigenvectors of $A$ any more (correct me if I am wrong). However, I found this other proof here, which addresses the orthogonality issue:
Let $λ_1$ be an eigenvalue, and $x_1$ an eigenvector corresponding to $λ_1$ (every square matrix has an eigenvalue and an eigenvector). Let $V_1$ be the set of all vectors orthogonal to $x_1$. Then $A$ maps $V_1$ into itself: for every $x\in V_{1}$ we also have $Ax\in V_1$. Indeed, $x\in V_{1}$ means that $(x_1, x) = 0$; then, using (1), we have $$(x_1, Ax) = (Ax_1, x) = λ_1(x_1, x) = 0,$$ so $Ax\in V_{1}$. Now the linear operator $L(x) = Ax$, when restricted to $V_1$, is also Hermitian, and it has an eigenvalue $λ_2$ and an eigenvector $x_2\in V_1$. By definition of $V_1$, $x_2$ is orthogonal to $x_1$. Let $V_2$ be the orthogonal complement of the span of $x_1, x_2$. Then $A$ also maps $V_2$ into itself, as before. Continuing this way, we find a sequence $λ_k$, $x_k$ and subspaces $V_k$ containing $x_k$ such that $V_k$ is orthogonal to $x_1, \dots, x_{k-1}$. The sequence must terminate at the $n$-th step because $\dim V_k = n - k$: at every step the dimension decreases by $1$. This completes the proof.
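For what it is worth, this construction can be mimicked numerically. The sketch below is my own (not from the linked source), and it assumes, which is precisely the point I ask about at the end, that the restriction of $A$ to $V_k$ is represented by $Q^*AQ$, where the orthonormal columns of $Q$ span $V_k$:

```python
import numpy as np

def orthonormal_eigenbasis(A):
    """Mimic the inductive step: at each stage take one eigenvector of the
    operator restricted to the orthogonal complement of those found so far."""
    n = A.shape[0]
    found = []
    # The columns of Q are an orthonormal basis of the current subspace V_k
    # (initially the whole space).
    Q = np.eye(n, dtype=complex)
    for _ in range(n):
        # Matrix of the restriction of x -> Ax in the basis given by Q; note that
        # (Q* A Q)* = Q* A* Q = Q* A Q, so it is again Hermitian.
        B = Q.conj().T @ A @ Q
        w, V = np.linalg.eigh(B)
        x = Q @ V[:, 0]            # an eigenvector of A lying inside V_k
        found.append(x)
        # Pass to the next subspace: the part of V_k orthogonal to x, spanned by
        # the remaining (orthonormal) columns of Q V.
        Q = Q @ V[:, 1:]
    return np.column_stack(found)

# Example with a degenerate eigenvalue: A = U diag(3, 3, 3, 7) U* for a random
# unitary U, so the eigenvalue 3 has multiplicity three.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
A = U @ np.diag([3.0, 3.0, 3.0, 7.0]) @ U.conj().T

X = orthonormal_eigenbasis(A)
print(np.allclose(X.conj().T @ X, np.eye(4)))      # True: the columns are orthonormal
for k in range(4):
    x = X[:, k]
    lam = (x.conj() @ A @ x).real                  # Rayleigh quotient = eigenvalue
    print(np.allclose(A @ x, lam * x))             # True: each column is an eigenvector
```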
The proof seems right, and I can visualise it like this: let $A$ be $3\times 3$ and let the original space be $\mathbb{R}^3$. There is at least one eigenvalue and a corresponding eigenvector $x_1$. Then $V_1$, the set of all vectors orthogonal to $x_1$, is a plane perpendicular to $x_1$. Then $A$, restricted to the plane, acts in the same way as some $2\times 2$ matrix, which is also Hermitian, since it behaves like $A$; thus it has an eigenvalue (not necessarily different from the previous one) and an eigenvector which belongs to the plane. (Actually, it is easy to prove that the eigenvectors of any $2\times 2$ Hermitian matrix form an orthogonal basis.) Intuitively, we can extrapolate this view to $\mathbb{R}^n$ and to $\mathbb{C}^n$.
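To make the picture concrete, here is a small numerical sketch of exactly this $3\times 3$ situation; the matrix $A$ and the basis of the plane are a hand-picked example of mine, chosen so that an eigenvalue repeats:

```python
import numpy as np

# Hand-picked real symmetric example with a repeated eigenvalue:
# the eigenvalues of A are 1, 3, 3.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

x1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # eigenvector of A: A x1 = 3 x1
print(np.allclose(A @ x1, 3 * x1))            # True

# An orthonormal basis (q1, q2) of the plane V1 = {x : (x1, x) = 0}.
q1 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
q2 = np.array([0.0, 0.0, 1.0])
Q = np.column_stack([q1, q2])                 # 3x2, columns span V1

# The 2x2 matrix of the restricted operator in this basis: B[i, j] = (q_i, A q_j).
B = Q.T @ A @ Q
print(B)                                      # [[1. 0.], [0. 3.]]
print(np.allclose(B, B.T))                    # True: the restriction is again Hermitian

# Its eigenvectors, pushed back into R^3 via Q, are eigenvectors of A lying in the
# plane and orthogonal to x1 -- even though q2 shares the eigenvalue 3 with x1.
w, V = np.linalg.eigh(B)
X = Q @ V                                     # columns: eigenvectors of A inside V1
print(np.allclose(A @ X, X * w))              # True
print(np.allclose(x1 @ X, 0))                 # True: both are orthogonal to x1
```

Here I simply took the entries $(q_i, Aq_j)$ as the "$2\times 2$ matrix" of the restriction; the question below is exactly why this is legitimate and why such a matrix must again be Hermitian.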
But how can we make this rigorous? How do we prove that the linear operator $L(x) = Ax$, when restricted to $V_1$, can be represented by an $(n-1)\times (n-1)$ matrix which is also Hermitian?
Any references to other proofs of the orthogonality of the eigenvector basis of a Hermitian matrix in the case of degenerate eigenvalues will probably also help me.