We have the standard proof, via induction, that a symmetric matrix can be diagonalized even if it has eigenvalues with algebraic multiplicity greater than one. My boss came up with an alternative proof of this that might be a little simpler. Wanted to post it here and see if people can help validate it.
If $A$ is an $n \times n$ real symmetric matrix, we can choose its eigenvectors to be mutually orthogonal and to span $\Bbb R^n$.
Suppose, for contradiction, that a maximal set of mutually orthogonal eigenvectors $v_1, v_2, \dots, v_k$ of $A$ has only $k < n$ elements. We complete these to an orthogonal basis of $\Bbb R^n$ with a different set of orthogonal vectors $u_1, u_2, \dots, u_{n-k}$.
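(Not part of the proof, but here is a minimal numerical sketch of that completion step, assuming NumPy; the helper name `complete_orthonormal_basis` and the use of QR are my own choices for illustration, and I normalize the vectors, which is harmless.)

```python
import numpy as np

def complete_orthonormal_basis(V):
    """Given an n x k matrix V with orthonormal columns (k < n), return an
    n x (n-k) matrix U with orthonormal columns, each orthogonal to every
    column of V, so that the columns of [V | U] form an orthogonal basis of R^n."""
    n, k = V.shape
    # QR of [V | I]: the first k columns of Q span the same space as V,
    # and the remaining n - k columns are an orthonormal completion.
    Q, _ = np.linalg.qr(np.hstack([V, np.eye(n)]))
    return Q[:, k:]

# Example: one unit vector in R^3, completed by two orthogonal vectors.
V = np.array([[1.0], [0.0], [0.0]])
U = complete_orthonormal_basis(V)
print(np.round(V.T @ U, 12))   # ~0: every u_j is orthogonal to every v_i
```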
Every vector in the span of $v_1, v_2, \dots, v_k$ is orthogonal to every vector in the span of $u_1, u_2, \dots, u_{n-k}$. Because the set $v_1, \dots, v_k$ is maximal, no eigenvector of $A$ lies in the span of $u_1, u_2, \dots, u_{n-k}$: such an eigenvector would be orthogonal to every $v_i$ and could be added to the set. We will show that this creates a contradiction.
First, we see that for every vector $v = c_1 v_1 + c_2 v_2 + \dots + c_k v_k$ in the span of $v_1, v_2, \dots, v_k$ and every vector $u$ in the span of $u_1, u_2, \dots, u_{n-k}$ we have
$$v^t A u = u^t A^t v = u^t A v = u^t( c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \dots + c_k \lambda_k v_k) = 0,$$
where the first equality holds because a scalar equals its own transpose, and the last because $u$ is orthogonal to each $v_i$.
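As a quick numerical sanity check of this identity (not a substitute for the proof, and assuming NumPy): take a random symmetric $A$, use a few of the eigenvectors `np.linalg.eigh` returns as the $v_i$ (so this only illustrates the identity, since `eigh` already produces a full eigenbasis), complete them to an orthogonal basis, and check that $v^t A u$ vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3

A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # random real symmetric matrix

_, eigvecs = np.linalg.eigh(A)
V = eigvecs[:, :k]                     # k orthonormal eigenvectors, the v_i

Q, _ = np.linalg.qr(np.hstack([V, np.eye(n)]))
U = Q[:, k:]                           # orthonormal completion, the u_j

v = V @ rng.standard_normal(k)         # arbitrary v in span(v_1, ..., v_k)
u = U @ rng.standard_normal(n - k)     # arbitrary u in span(u_1, ..., u_{n-k})

print(abs(v @ A @ u))                  # ~1e-16, matching v^t A u = 0
```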
The identity above shows that $Au$ is orthogonal to the span of $v_1, v_2, \dots, v_k$ and therefore lies in the span of $u_1, u_2, \dots, u_{n-k}$ (the two spans are orthogonal complements in $\Bbb R^n$). So we can define a linear transformation, from the span of $u_1, u_2, \dots, u_{n-k}$ to itself, that does exactly what $A$ does on that span.
Since this transformation maps a nonzero finite-dimensional space to itself, its characteristic polynomial has a root, so it has at least one eigenvalue with a corresponding eigenvector in the span of $u_1, u_2, \dots, u_{n-k}$. Neither the eigenvalue nor the eigenvector needs to be complex: any such eigenvector is also an eigenvector of $A$, the eigenvalues of the real symmetric matrix $A$ are real, and for a real eigenvalue the real or imaginary part of the eigenvector is itself a real eigenvector, still lying in the span of $u_1, \dots, u_{n-k}$.
This eigenvector is also an eigenvector of $A$ lying in the span of $u_1, u_2, \dots, u_{n-k}$, which contradicts the observation above that no eigenvector of $A$ lies in that span. So we must have $k = n$: $A$ has a full set of $n$ mutually orthogonal eigenvectors and is diagonalizable.
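Here is a numerical sketch of those last steps under the same assumptions as the sketches above (NumPy, vectors normalized, variable names mine): collect the $u_j$ as columns of a matrix $U$, so the restriction of $A$ to their span has matrix $U^t A U$ in that basis; check that $A$ maps the span into itself, that the restricted matrix is again symmetric, and that an eigenvector of the restriction lifts to an eigenvector of $A$ orthogonal to all the $v_i$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3

A = rng.standard_normal((n, n))
A = (A + A.T) / 2                          # random real symmetric matrix

_, eigvecs = np.linalg.eigh(A)
V = eigvecs[:, :k]                         # the v_i (orthonormal eigenvectors)
Q, _ = np.linalg.qr(np.hstack([V, np.eye(n)]))
U = Q[:, k:]                               # the u_j (orthonormal completion)

# A maps span(u_1, ..., u_{n-k}) into itself: A @ U has no component along the v_i.
print(np.max(np.abs(V.T @ A @ U)))         # ~1e-16

# Matrix of the restricted transformation in the basis u_1, ..., u_{n-k};
# it is again symmetric, so eigh gives real eigenvalues and eigenvectors.
B = U.T @ A @ U
lam, W = np.linalg.eigh(B)

# Lift an eigenvector of the restriction back to R^n.
w = U @ W[:, 0]
print(np.max(np.abs(A @ w - lam[0] * w)))  # ~1e-15: w is an eigenvector of A
print(np.max(np.abs(V.T @ w)))             # ~1e-16: w is orthogonal to every v_i
```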