I'm aware of
- How to prove that eigenvectors from different eigenvalues are linearly independent
- Finite sum of eigenspaces (with distinct eigenvalues) is a direct sum
- Proof of "Eigenvectors corresponding to different eigenvalues are linearly independent."
- Eigenvectors are linearly independent?
and likely more. However, I'd like to have the following proof checked; I'll gladly delete this and/or post it as an answer to whichever of those questions it isn't a duplicate of, if desired.
Theorem: If $\vec a$ and $\vec b$ are eigenvectors of $A$ corresponding to distinct eigenvalues $x$ and $y$, then $\vec a$ and $\vec b$ are linearly independent.
Proof: We have $A\vec a=x\vec a$ and $A\vec b=y\vec b$ with $x-y\neq0$. Hence
$$\vec a\cdot\vec b=\frac{x-y}{x-y}\;(\vec a\cdot\vec b)=\frac{x(\vec a\cdot\vec b)-y(\vec b\cdot\vec a)}{x-y}=\frac{(A\vec a)\cdot\vec b-(A\vec b)\cdot\vec a}{x-y}=\frac{A(\vec a\cdot\vec b-\vec b\cdot\vec a)}{x-y}=0$$ which implies linear independence.
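For my own checking, here is a minimal numpy sketch of how the conclusion $\vec a\cdot\vec b=0$ could be probed numerically; the $2\times2$ matrix is just a made-up example with distinct eigenvalues, not anything from the argument above.

```python
import numpy as np

# Hypothetical example: a non-symmetric matrix with two distinct real
# eigenvalues, to test whether its eigenvectors come out orthogonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # eigenvalues 2 and 3 (distinct)

eigvals, eigvecs = np.linalg.eig(A)
a, b = eigvecs[:, 0], eigvecs[:, 1]  # eigenvectors, one per column

print("eigenvalues:", eigvals)
print("a . b =", np.dot(a, b))       # the quantity the proof concludes is 0
```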
As an aside, I'm curious whether a step such as $\vec a\cdot(A\vec b)=(\vec a A)\cdot\vec b$ is valid; I was originally experimenting with such expressions. I was treating matrix-vector multiplication as if it were commutative, which would give $\vec a\cdot(A\vec b)=(A\vec a)\cdot\vec b$. Is this a valid identity?
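For what it's worth, here is a tiny numpy sketch one could run to compare the quantities in the aside; the matrix and vectors are arbitrary random choices, and I am interpreting the row-vector product $\vec a A$ as $A^\top\vec a$.

```python
import numpy as np

# Compare a . (A b) with (A a) . b and with (A^T a) . b
# for a generic (almost surely non-symmetric) matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
a = rng.standard_normal(3)
b = rng.standard_normal(3)

print("a . (A b)   =", np.dot(a, A @ b))
print("(A a) . b   =", np.dot(A @ a, b))
print("(A^T a) . b =", np.dot(A.T @ a, b))   # the 'a A' reading
```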