
"Eigenvectors corresponding to different eigenvalues are linearly independent."

My professor told us this during a lecture, but gave no proof or explanation.

jacob

1 Answer

Suppose that $Av=\lambda v$ and $Aw=\mu w$, $v,w\not=0$.

Assume that $v,w$ are linearly dependent. Then $v=c\cdot w$ for some scalar $c$, and $c\not=0$ because $v\not=0$. Hence

$$\lambda v=Av=cAw=c \mu w = \mu v$$

Since $v\not=0$, this forces $\lambda=\mu$. Equivalently: if $\lambda\not=\mu$, then $v$ and $w$ are linearly independent.
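
For a concrete illustration: take $A=\begin{pmatrix}2&0\\0&3\end{pmatrix}$, $v=\begin{pmatrix}1\\0\end{pmatrix}$, $w=\begin{pmatrix}0\\1\end{pmatrix}$, so $\lambda=2$ and $\mu=3$. If $v=c\cdot w$ held for some scalar $c\not=0$, the chain above would give $2v=Av=cAw=3cw=3v$, i.e. $2=3$ since $v\not=0$; so these two eigenvectors cannot be linearly dependent.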

J.R.
  • You surely mean dependent? – alex Mar 02 '14 at 15:47
  • @DanielFischer, alex Thank you! – J.R. Mar 02 '14 at 15:50
  • The theorem is more general than this. If $\lambda_1,\ldots,\lambda_m$ are distinct eigenvalues and $v_1,\ldots,v_m$ are corresponding eigenvectors, then $v_1,\ldots,v_m$ are linearly independent. (A numerical check of this general statement appears after this thread.) – caffeinemachine Mar 02 '14 at 15:51
  • @caffeinemachine It should be obvious how to generalize the technique in the answer... – gt6989b Mar 02 '14 at 16:06
  • @gt6989b I disagree, but maybe that's just me. – caffeinemachine Mar 02 '14 at 16:10
  • @caffeinemachine here's one way to do so. I agree that there doesn't seem to be a particularly "obvious" approach. – Ben Grossmann Mar 02 '14 at 17:39
  • @caffeinemachine Maybe I don't understand something. If you let $v_1$ be a linear combination of the others, i.e. $$v_1 = \sum_{k=2}^m c_k v_k$$ (choosing a relation with as few terms as possible, so that $v_2,\ldots,v_m$ are themselves linearly independent), then $$\sum_{k=2}^m c_k \lambda_k v_k = A \sum_{k=2}^m c_k v_k = A v_1 = \lambda_1 v_1 = \lambda_1 \sum_{k=2}^m c_k v_k = \sum_{k=2}^m \lambda_1 c_k v_k$$ so $$\sum_{k=2}^m \lambda_1 c_k v_k = \sum_{k=2}^m c_k \lambda_k v_k,$$ which implies that $\lambda_1 c_k = c_k \lambda_k$ for each $k$, since the $v_k$ are linearly independent. So for each $k$, either $c_k = 0$ or $\lambda_1 = \lambda_k$. – gt6989b Mar 02 '14 at 17:51
  • @gt6989b You are right. That does make it seem obvious. The way I did it was quite convoluted compared to this. – caffeinemachine Mar 02 '14 at 18:27
  • @gt6989b Two things: a) I don't get how you go from $\sum c_k v_k \lambda_k$ to $A \sum c_k v_k$; can you explain that? b) What is the conclusion? I don't understand how this proves it. – jacob Mar 07 '14 at 20:14
  • @jacob $v_k \lambda_k = A v_k$, and so $$\sum c_k v_k \lambda_k = \sum c_k A v_k = \sum A c_k v_k = A \sum c_k v_k$$ since matrix multiplication commutes with scalar multiplication. – gt6989b Mar 10 '14 at 12:10
  • @jacob If all $c_k = 0$, then $v_1 = \vec{0}$, which cannot happen since eigenvectors are nonzero. Otherwise $\lambda_1 = \lambda_k$ for some $k$ with $c_k \not= 0$, so two of the eigenvectors would correspond to the same eigenvalue, contradicting distinctness. Either way the dependence is impossible. – gt6989b Mar 10 '14 at 12:12
  • Oh wow, this is nicer than the usual induction proof. – Nishant Dec 09 '14 at 06:12
  • Brilliant Proof! – Dude156 Oct 19 '20 at 20:17
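
Following up on the general statement discussed in the comments above, here is a quick numerical sanity check. This is only an illustrative sketch: the particular matrix, the rounding in the distinctness test, and the use of a rank computation are choices made for the example, not anything prescribed by the thread.

    import numpy as np

    # Upper-triangular matrix, so its eigenvalues are the diagonal
    # entries 3, 2, 1 -- all distinct.
    A = np.array([[3.0, 1.0, 0.0],
                  [0.0, 2.0, 1.0],
                  [0.0, 0.0, 1.0]])

    # np.linalg.eig returns the eigenvalues and a matrix V whose
    # columns are corresponding eigenvectors.
    eigenvalues, V = np.linalg.eig(A)
    assert len(set(np.round(eigenvalues, 8))) == len(eigenvalues), \
        "eigenvalues should be distinct for this check"

    # Distinct eigenvalues force the columns of V to be linearly
    # independent, i.e. V has full rank.
    print("eigenvalues:", eigenvalues)             # 3, 2, 1 (in some order)
    print("rank of V:", np.linalg.matrix_rank(V))  # expect 3

With any matrix whose eigenvalues are distinct, the eigenvector matrix should come out with full rank; for a defective matrix (e.g. a single Jordan block, where eigenvalues repeat) the reported rank can drop below the matrix size.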