
The standard proof that a real symmetric matrix can be diagonalized, even when it has eigenvalues with algebraic multiplicity greater than one, proceeds by induction. My boss came up with an alternate proof that might be a little simpler. I wanted to post it here and see if people can help validate it.


If $A$ is an $n \times n$ real symmetric matrix, we can choose $n$ of its eigenvectors to be mutually orthogonal.

Suppose, for contradiction, that we have only $k<n$ such orthogonal eigenvectors $v_1, v_2, \dots, v_k$. We then complete an orthogonal basis for $\Bbb R^n$ with a different set of orthogonal vectors $u_1, u_2, \dots, u_{n-k}$.
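(Not part of the argument, just a minimal numerical sketch of this completion step, assuming `numpy`; the $v_i$ below are a randomly generated stand-in, and one full QR factorization supplies the $u_j$.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2

# Hypothetical stand-in for the k orthogonal eigenvectors v_1, ..., v_k:
# a random n x k matrix with orthonormal columns.
V, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Complete to an orthogonal basis of R^n: the remaining n-k columns of a
# full QR factorization are orthonormal and orthogonal to every column of V.
Q, _ = np.linalg.qr(V, mode="complete")
U = Q[:, k:]                                # plays the role of u_1, ..., u_{n-k}

assert np.allclose(U.T @ U, np.eye(n - k))  # the u_j are orthonormal
assert np.allclose(V.T @ U, 0)              # and orthogonal to every v_i
```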

Every vector in the span of $v_1, v_2, \dots, v_k$ is orthogonal to every vector in the span of $u_1, u_2, \dots, u_{n-k}$. Since $n>k$, there is at least one such $u_j$, and by assumption no eigenvector of $A$ lies in span $u_1, u_2, \dots, u_{n-k}$: any eigenvector there would be orthogonal to all of $v_1, \dots, v_k$ and would give a $(k+1)$-st orthogonal eigenvector. We will show that this creates a contradiction.

First, write any $v$ in span $v_1, v_2, \dots, v_k$ as $v = c_1 v_1 + c_2 v_2 + \dots + c_k v_k$. Then, for every such $v$ and every $u$ in span $u_1, u_2, \dots, u_{n-k}$, transposing the scalar $v^t A u$ and using $A^t = A$ gives

$$v^t A u = u^t A^t v = u^t A v = u^t( c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \dots + c_k \lambda_k v_k) = 0,$$

since $u$ is orthogonal to each $v_i$.

So $Au$ is orthogonal to span $v_1, v_2, \dots, v_k$ and therefore lies in span $u_1, u_2, \dots, u_{n-k}$. So we can define a linear transformation that does exactly what $A$ does but is defined from span $u_1, u_2, \dots, u_{n-k}$ to span $u_1, u_2, \dots, u_{n-k}$.
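As a quick numerical sanity check of this invariance (a sketch assuming `numpy`; the symmetric matrix and the split into $v_i$ and $u_j$ are randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2

# A random real symmetric matrix A.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Take k orthonormal eigenvectors of A as the v_i ...
_, eigvecs = np.linalg.eigh(A)
V = eigvecs[:, :k]

# ... and complete to an orthonormal basis; the last n-k columns are the u_j.
Q, _ = np.linalg.qr(V, mode="complete")
U = Q[:, k:]

# v^t A u = 0 for all v in span(v_i) and u in span(u_j), so A maps
# span(u_1, ..., u_{n-k}) into itself.
assert np.allclose(V.T @ A @ U, 0)
```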

Since the domain and codomain of this transformation are the same finite-dimensional space, it must have at least one eigenvector, and this eigenvector is in span $u_1, u_2, \dots, u_{n-k}$. The eigenvector and eigenvalue can't be complex since the original matrix is symmetric.

This eigenvector is also an eigenvector of $A$, which is a contradiction. So we must have $k=n$: $A$ has a full set of orthogonal eigenvectors and is diagonalizable.
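As a sanity check of the claim itself (not of this argument), `numpy`'s `eigh`, which returns an orthonormal set of eigenvectors for a real symmetric matrix, confirms the conclusion on a random example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

# A random real symmetric matrix.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

eigvals, Q = np.linalg.eigh(A)    # columns of Q are orthonormal eigenvectors of A
assert np.allclose(Q.T @ Q, np.eye(n))             # a full orthonormal eigenbasis
assert np.allclose(Q.T @ A @ Q, np.diag(eigvals))  # Q^t A Q is diagonal
```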

Rohit Pandey
  • Your argument is relying on the following assertion: If $T:V\rightarrow V$ is linear and $W\subset V$ is finite dimensional and invariant under $T$, then $T|_W$ has at least one eigenvector. This, in general, is false. – Matthew H. Jan 15 '22 at 02:41
  • But won't the characteristic polynomial have at least one solution and hence at least one eigenvalue and eigenvector pair? – Rohit Pandey Jan 15 '22 at 02:47
  • If $V=\mathbb{R}^3$ and $\dim W=2$ then $T|_W$ could be a nontrivial rotation in $W$, which wouldn't have any eigenvectors [see the sketch after these comments]. Your proof attempt may be able to be modified to take into account that the standard matrix $A$ of $T$ is symmetric, but in general what you're saying isn't true, and that's the crux of your whole argument. – Matthew H. Jan 15 '22 at 02:52
  • I'm not sure I completely follow the notation, but I assume this means that in the remaining 2-dimensional space, $T$ represents a rotation. Even a rotation has eigenvalues and eigenvectors, just that they are complex. In this case, that is impossible since the original $T$ was a symmetric matrix, and so all its eigenvalues must be real (and corresponding real eigenvectors must exist). – Rohit Pandey Jan 15 '22 at 02:57
  • I agree that all eigenvalues of $T$ are real since $T$ is symmetric, but I'm wondering if it's possible to prove that there is some ordered basis $\beta$ of $W$ so that the $(n-k)\times (n-k)$ matrix of $T|_W$ with respect to $\beta$ is symmetric. This would guarantee that $T|_W$ has a real eigenvalue. Here, $W$ is the span of your $u$ vectors. – Matthew H. Jan 15 '22 at 03:18
  • Isn't it true in this case that, since $T$ is a symmetric linear transformation on the whole of $\mathbb{R}^n$, it is also symmetric on the subspace? – Rohit Pandey Jan 18 '22 at 19:58
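For concreteness, here is a minimal `numpy` sketch of the rotation counterexample raised in the comments; the matrix below is deliberately not symmetric, which is exactly the point under discussion:

```python
import numpy as np

# T rotates the xy-plane W = span(e1, e2) by 90 degrees and fixes the z-axis,
# so W is invariant under T, but T is not symmetric.
T = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

# The restriction T|_W is the top-left 2x2 block; its eigenvalues are ±i,
# so T|_W has no real eigenvector even though W is invariant under T.
block = T[:2, :2]
print(np.linalg.eigvals(block))   # approximately [1j, -1j] (order may vary)
```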

0 Answers