
Theorem:

If an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, then $A$ is diagonalisable.

Proof:

Let $A \in \mathbb{R}^{n \times n}$. Suppose $A$ is not diagonalisable. Then, by definition, there exist no invertible matrix $P \in \mathbb{R}^{n \times n}$ and diagonal matrix $D \in \mathbb{R}^{n \times n}$ such that $P^{-1}AP = D$.

Any hint(s) to assist me would be helpful.

Thanks in advance.

JimmyK4542

1 Answer


Hint: I would prefer to prove it directly.

Eigenvectors corresponding to distinct eigenvalues are linearly independent, so the $n$ eigenvectors form a basis for $\mathbb{R}^n$ consisting of eigenvectors. Simply let $P$ be the matrix whose columns are these basis vectors.

Then $P^{-1}AP=D$, where $D$ is diagonal, and the entries on the diagonal are the eigenvalues.
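The construction in the hint can be checked numerically. The following sketch (using NumPy, with a small hypothetical example matrix) builds $P$ from the eigenvectors and verifies that $P^{-1}AP$ equals the diagonal matrix of eigenvalues:

```python
import numpy as np

# A 2x2 example matrix with distinct eigenvalues 2 and 3
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigvals, eigvecs = np.linalg.eig(A)

P = eigvecs            # columns are the eigenvector basis
D = np.diag(eigvals)   # diagonal matrix of eigenvalues

# Verify P^{-1} A P = D (up to floating-point tolerance)
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
```

Since the eigenvalues $2$ and $3$ are distinct, the eigenvectors are linearly independent, so $P$ is invertible and the check succeeds.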

  • Don't you need symmetry for linearly independent eigenvectors? – LinAlg Dec 16 '18 at 03:09
  • No. See this: https://math.stackexchange.com/a/29374 –  Dec 16 '18 at 03:20
  • @ChrisCuster By construction, the $j$th column of the matrix $P$ is the $j$th eigenvector in the linearly independent set $S$ of eigenvectors. Clearly, $S$ spans the column space of $P$. I'd like to show that every column of $P$ has a pivot position. Can you give me a hint on this? – Mathematicing Dec 16 '18 at 03:28
  • Hmm. Clearly the columns are independent, as they are the elements of a basis. To get pivots you would have to take the transpose and row-reduce, say. –  Dec 16 '18 at 03:32
  • @ChrisCuster Satisfied. – Mathematicing Dec 16 '18 at 03:35