
Prove that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then

1) there are $n$ eigenvectors $\{\bar c_i\}$, one corresponding to each of those eigenvalues, and

2) that $\{\bar c_1,\bar c_2, \dots,\bar c_n\}$ are all linearly independent.


1 Answer


I will prove part 1: if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then there are $n$ eigenvectors $\{\bar c_i\}$, one corresponding to each of those eigenvalues.

Definition: the algebraic multiplicity $a(\lambda_i)$ of an eigenvalue $\lambda_i$ is the power to which $(\lambda - \lambda_i)$ divides the characteristic polynomial.

So if an eigenvalue $\lambda_j$ is repeated $k$ times as a root of the characteristic polynomial, then $a(\lambda_j) = k$.

So in the case of distinct eigenvalues $\{\lambda_1,\ldots,\lambda_n\}$, we can conclude that for each of the $n$ eigenvalues the algebraic multiplicity is 1:

$$a(\lambda_i) = 1 \quad \forall\, \lambda_i$$
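As a concrete illustration (this $3\times 3$ matrix is my own toy example, not part of the question), SymPy can factor the characteristic polynomial and report the algebraic multiplicities directly:

```python
from sympy import Matrix, symbols, factor

lam = symbols('lambda')
A = Matrix([[2, 1, 0],
            [0, 3, 1],
            [0, 0, 5]])  # upper triangular, so the eigenvalues 2, 3, 5 are distinct

# Characteristic polynomial det(lambda*I - A), in factored form
print(factor(A.charpoly(lam).as_expr()))  # (lambda - 2)*(lambda - 3)*(lambda - 5)

# eigenvals() returns {eigenvalue: algebraic multiplicity}
print(A.eigenvals())                      # {2: 1, 3: 1, 5: 1}
```

Each linear factor appears to the first power, so $a(\lambda_i) = 1$ for every eigenvalue of this matrix.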

Now what will the geometric multiplicities of these eigenvalues be?

Definition: the geometric multiplicity $g(\lambda_i)$ of an eigenvalue is the dimension of the eigenspace $E_{\lambda_i} = N(A - \lambda_i I)$ corresponding to $\lambda_i$, where $N(A - \lambda_i I)$ denotes the null space of $A - \lambda_i I$.

The eigenspace $E_{\lambda_i}$ is best understood as the vector space spanned by all the eigenvectors that correspond to the eigenvalue $\lambda_i$, i.e. the collection of all vectors $\bar v$ that satisfy $A\bar v = \lambda_i\bar v$ forms the eigenspace.

An eigenspace has dimension greater than zero by definition: $\lambda$ is an eigenvalue of $A$ only if $Ax = \lambda x$ for some $x \neq 0$, so $E_{\lambda_i}$ contains a nonzero vector. Since only the trivial space $\{0\}$ has dimension zero, we can conclude $0 < g(\lambda_i)$, i.e. $1 \leq g(\lambda_i)$.
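Continuing the same toy example, the geometric multiplicity can be computed as the dimension of the null space of $A - \lambda_i I$; SymPy's `nullspace()` returns a basis of that space, so its length is $g(\lambda_i)$:

```python
from sympy import Matrix, eye

A = Matrix([[2, 1, 0],
            [0, 3, 1],
            [0, 0, 5]])

for lam_i in A.eigenvals():                   # keys are the eigenvalues
    basis = (A - lam_i * eye(3)).nullspace()  # basis of the eigenspace E_{lam_i}
    print(lam_i, len(basis))                  # g(lam_i) = 1 for every eigenvalue here
```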

There is a standard argument for why $g(\lambda_i) \leq a(\lambda_i)$: it shows that the characteristic polynomial has $(\lambda - \lambda_i)^{g(\lambda_i)}$ at least as a factor, as the following sketch illustrates.
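Sketch (a basis-extension argument, filled in here rather than taken from the original post): write $g = g(\lambda_i)$ and extend a basis $\{v_1,\dots,v_g\}$ of $E_{\lambda_i}$ to a basis of the whole space. In that basis $A$ is represented by a block-triangular matrix, and the determinant of a block-triangular matrix is the product of the determinants of its diagonal blocks:

$$A \sim \begin{pmatrix} \lambda_i I_g & B \\ 0 & C \end{pmatrix}, \qquad \det(\lambda I_n - A) = (\lambda - \lambda_i)^{g}\,\det(\lambda I_{n-g} - C),$$

so $(\lambda - \lambda_i)^{g}$ divides the characteristic polynomial and $g(\lambda_i) \leq a(\lambda_i)$.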

$$1 \leq g(\lambda_i) \leq a(\lambda_i)$$

Thus in the case of distinct eigenvalues,

$$1 \leq g(\lambda_i) \leq 1 \quad \forall\, \lambda_i$$

$$g(\lambda_i) = 1 \quad \forall\, \lambda_i$$

$g(\lambda_i)$ is also, equivalently, the number of linearly independent eigenvectors associated with $\lambda_i$: to span a vector space (the eigenspace) of dimension $g(\lambda_i)$, we need exactly that many independent eigenvectors.

Thus we have proved that associated with each of the $n$ distinct eigenvalues is a single independent eigenvector (each eigenspace is a line), which establishes part 1.
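As a quick sanity check (again with my illustrative matrix rather than one from the question), SymPy's `eigenvects()` returns, for each eigenvalue, its algebraic multiplicity together with a basis of its eigenspace, and both come out as 1 here:

```python
from sympy import Matrix

A = Matrix([[2, 1, 0],
            [0, 3, 1],
            [0, 0, 5]])  # distinct eigenvalues 2, 3, 5

for lam_i, alg_mult, vecs in A.eigenvects():
    # exactly one independent eigenvector per distinct eigenvalue
    assert alg_mult == 1 and len(vecs) == 1
    print(lam_i, vecs[0].T)
```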

  • Answering your own question is acceptable here. But that usually happens only when you didn't know an answer at the time you asked, and found one after quite a while. Here you seem to have posted your answer right after asking. Why? If you are unsure of your proof you can put the proof in your question, flag the place where you have doubts, and use the proof-verification tag. – Ethan Bolker Jul 21 '18 at 20:04
  • I only answered part 1, I actually wanted to answer this question - https://math.stackexchange.com/questions/29371/how-to-prove-that-eigenvectors-from-different-eigenvalues-are-linearly-independe using the method I gave above. That is why I posted half the solution. I hope to develop the answer later and provide a geometric intuition for part 2 of the question. – Aditya P Jul 21 '18 at 20:11
  • That question is seven years old and already has several really nice answers. – Ethan Bolker Jul 21 '18 at 20:25
  • Oh, let me go through them again then. I didn't manage to grasp the intuition for why distinct eigenspaces imply independence but not the converse, even though the solutions are algebraically elegant. I wanted to visualise something like: since each of the $n$ eigenspaces is restricted to a line, we should be able to say that all $n$ eigenvectors are linearly independent, since together they should span $F^n$? – Aditya P Jul 21 '18 at 20:40