
The spectral theorem for Hermitian matrices states that, for an $n\times n$ Hermitian matrix $A$: (a) all eigenvalues are real, (b) eigenvectors corresponding to distinct eigenvalues are orthogonal, and (c) there exists an orthogonal basis of the whole space consisting of eigenvectors.
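(As a quick numerical sanity check, my own illustration and not part of any proof: `numpy.linalg.eigh`, the routine for Hermitian matrices, returns real eigenvalues and orthonormal eigenvectors even when an eigenvalue is repeated. The test matrix below is a hypothetical choice.)

```python
import numpy as np

# Hypothetical real symmetric (hence Hermitian) test matrix;
# its eigenvalues are 1, 3 and 3, so the eigenvalue 3 is degenerate.
A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])

eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)                          # [1. 3. 3.] -- all real
print(np.allclose(eigenvectors.T @ eigenvectors,
                  np.eye(3)))               # True: columns are orthonormal
```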

I can prove (a) and (b), and I can prove that there exists a basis of the whole space consisting of eigenvectors; I have even added a proof of the latter to this old post. But something is missing: the proof of orthogonality covers only eigenvectors corresponding to distinct eigenvalues, not eigenvectors corresponding to the same eigenvalue.

I guess that if we apply Gram-Schmidt we get an orthonormal basis, but the new vectors will not necessarily be eigenvectors of $A$ any more (correct me if I am wrong). However, I found this other proof here, which covers the orthogonality issue:

Let $\lambda_1$ be an eigenvalue, and $x_1$ an eigenvector corresponding to $\lambda_1$ (every square matrix has an eigenvalue and an eigenvector). Let $V_1$ be the set of all vectors orthogonal to $x_1$. Then $A$ maps $V_1$ into itself: for every $x\in V_{1}$ we also have $Ax\in V_1$. Indeed, $x\in V_{1}$ means that $(x_1, x) = 0$; then, using the Hermitian property of $A$ and the fact that $\lambda_1$ is real, $$(x_1, Ax) = (Ax_1, x) = \lambda_1(x_1, x) = 0,$$ so $Ax\in V_{1}$. Now the linear operator $L(x) = Ax$, when restricted to $V_1$, is also Hermitian, and it has an eigenvalue $\lambda_2$ and an eigenvector $x_2\in V_1$. By definition of $V_1$, $x_2$ is orthogonal to $x_1$. Let $V_2$ be the orthogonal complement of the span of $x_1, x_2$. Then $A$ also maps $V_2$ into itself, as before. Continuing this way, we find a sequence $\lambda_k$, $x_k$ and subspaces $V_k$ containing $x_k$ such that $V_k$ is orthogonal to $x_1, \dots, x_{k-1}$. The sequence must terminate at the $n$-th step because $\dim V_k = n - k$: at every step the dimension decreases by $1$. This completes the proof.
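(Since the argument is constructive, it can be mirrored numerically; here is a minimal sketch of my own, not from the post, in which the function name and test matrix are hypothetical. At each step the restriction of $A$ to $V_k$ is represented by $B^*AB$, where the columns of $B$ form an orthonormal basis of $V_k$, and one eigenvector of that restriction is lifted back to the full space.)

```python
import numpy as np

def orthonormal_eigenbasis(A):
    """Orthonormal eigenbasis of a Hermitian matrix A by deflation,
    mirroring the inductive proof: pick one eigenvector, restrict A to
    the orthogonal complement, and repeat."""
    n = A.shape[0]
    B = np.eye(n, dtype=complex)    # columns: orthonormal basis of the current V_k
    basis = []
    for _ in range(n):
        R = B.conj().T @ A @ B      # matrix of A restricted to V_k (again Hermitian)
        _, v = np.linalg.eigh(R)    # the restriction has an eigenvector
        basis.append(B @ v[:, 0])   # lift it back to the full space
        B = B @ v[:, 1:]            # orthonormal basis of V_{k+1} inside V_k
    return np.column_stack(basis)

# Hypothetical test matrix with a repeated eigenvalue (1, 1 and 3).
A = np.array([[2, 1j, 0], [-1j, 2, 0], [0, 0, 1]])
Q = orthonormal_eigenbasis(A)
D = Q.conj().T @ A @ Q
print(np.allclose(Q.conj().T @ Q, np.eye(3)))   # True: columns are orthonormal
print(np.allclose(D, np.diag(np.diag(D))))      # True: they diagonalise A
```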

The quoted proof seems right, and I can visualise it like this: let $A$ be $3\times 3$ and let the original space be $\mathbb{R}^3$. There is at least one eigenvalue and a corresponding eigenvector $x_1$. Then $V_1$, the set of all vectors orthogonal to $x_1$, is a plane perpendicular to $x_1$. Then $A$, restricted to the plane, acts in the same way as some $2\times 2$ matrix, which is also Hermitian, since it behaves like $A$; thus it has an eigenvalue (not necessarily different from the previous one) and an eigenvector that belongs to the plane. (Actually, it is easy to prove that the eigenvectors of any $2\times 2$ Hermitian matrix form an orthogonal basis.) Intuitively, we can extrapolate this view to $\mathbb{R}^n$ and to $\mathbb{C}^n$.

But how can we make this more rigorous? How do we prove that the linear operator $L(x) = Ax$, when restricted to $V_1$, can be represented by an $(n-1)\times (n-1)$ matrix which is also Hermitian?

Any references to other proofs of the orthogonality of the eigenvector basis of a Hermitian matrix in the case of degenerate eigenvalues would probably also help me.

  • Apply Gram-Schmidt to each eigenspace associated with a particular eigenvalue. It is automatic that an eigenvector $x$ associated with one eigenvalue $\lambda$ is orthogonal to an eigenvector $y$ associated with a different eigenvalue $\mu$ (this follows because $A$ is Hermitian). – Disintegrating By Parts Sep 09 '23 at 11:26
  • OK, I got it wrong. I thought that, since the Gram-Schmidt procedure alters the direction of the vectors, they would not be eigenvectors any more. But this has been clarified here. – Aris Makrides Sep 09 '23 at 12:11
  • "(every square matrix has an eigenvalue and an eigenvector)". This is not always true. It is true if the matrix is defined over the complex field, because it is algebraically closed, but not necessarily true, if it is defined over the reals. However, it is true (over the real field), for Hermitian matrices. – Aris Makrides Sep 10 '23 at 10:10

2 Answers


This is really easy if you consider $A$ as a linear transformation rather than just a matrix. We claim that for every finite dimensional inner product space $V$ of dimension $n$ and every Hermitian transformation $A : V \to V$ there exists an orthogonal basis consisting of eigenvectors of $A$. We do this by induction on $n$. It's trivially true for $n = 1$. Now suppose it's true for vector spaces of dimension $\leq n-1$. Take any one eigenvector $v$ with eigenvalue, say, $a$. Let $W$ be the subspace of $V$ orthogonal to $v$. If we can prove that $W$ is stable under the action of $A$, then by the induction hypothesis it has an orthogonal basis consisting of eigenvectors of $A$. Add $v$ to it and you are done. Now, to show that $W$ is $A$-stable, take any $w\in W$. We want to show $Aw\in W$, which is equivalent to saying that $Aw$ is orthogonal to $v$. By the adjoint property of inner products, $\langle v, Aw \rangle = \langle A^*v, w\rangle = \langle Av, w \rangle = \langle av, w \rangle = a \langle v, w\rangle = 0$. We are done.


Let $\mathcal{H}$ be a finite dimensional Hilbert space. The matrix $A$ of a linear transformation $T: \mathcal{H} \to \mathcal{H}$ with respect to an orthonormal basis is Hermitian if and only if $$\langle Ty, x \rangle =\langle y, T x \rangle \tag{1} $$ for all $x,y \in \mathcal{H}$ (this is the better definition of Hermitian).

From this it is obvious that equation (1) also holds on any subspace, which is all we need for the proof. We do not need to use the matrix of the linear transformation at all.
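(To spell out the matrix formulation asked about in the question, here is a sketch in my own notation: let $u_1,\dots,u_{n-1}$ be an orthonormal basis of the subspace $V_1$, and let $B$ be the matrix of the restriction of $T$, with entries $b_{ij} = \langle u_i, T u_j\rangle$. By conjugate symmetry of the inner product and equation (1), $$\overline{b_{ji}} = \overline{\langle u_j, T u_i \rangle} = \langle T u_i, u_j \rangle = \langle u_i, T u_j \rangle = b_{ij},$$ so $B$ is an $(n-1)\times(n-1)$ Hermitian matrix.)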

If we have $V = \bigoplus_i V_i$, where the $V_i$ are the orthogonal eigenspaces of the Hermitian transformation, then we can apply Gram-Schmidt to each $V_i$ to obtain an orthogonal basis of each $V_i$. In this way an orthogonal basis of $V$ is obtained.
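(A small numerical illustration of this step, my own sketch with a hypothetical matrix: orthonormalising a non-orthogonal basis of one eigenspace, here via the QR factorisation as a stand-in for Gram-Schmidt, produces vectors that still lie in that eigenspace and hence are still eigenvectors.)

```python
import numpy as np

# Hermitian matrix whose eigenvalue 1 has a two-dimensional eigenspace.
A = np.diag([1., 2., 1.])

# A non-orthogonal basis of that eigenspace: both satisfy A v = 1 * v.
v1 = np.array([1., 0., 0.])
v2 = np.array([1., 0., 1.])

# QR orthonormalises the columns (the effect of Gram-Schmidt).
Q, _ = np.linalg.qr(np.column_stack([v1, v2]))

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: orthonormal
print(np.allclose(A @ Q, 1.0 * Q))       # True: columns are still eigenvectors
```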
