
How can I prove that over an algebraically closed field $k$ any invertible matrix of finite order is semi-simple (diagonalizable)?

I thought in the following direction: any polynomial has a root in $k$ since $k$ is algebraically closed $\Rightarrow$ the characteristic polynomial splits into linear factors $\Rightarrow$ there exists a basis of eigenvectors $\Rightarrow$ the matrix is diagonalizable.

But this isn't true, and a counterexample is very simple: $\begin{pmatrix} 1 & 1\\ 0 & 1 \end{pmatrix}$. The second implication fails: a split characteristic polynomial does not by itself guarantee a basis of eigenvectors.

So how should I use the fact that the matrix has finite order?

Hasek

1 Answer


I assume finite order means that $A^n=I$ for some $n\geq 1$ (writing $n$ for the order, since $k$ already names the field). Consider the polynomial $p(X)=X^n-1$ and notice that $p(A)=0$. It follows that the minimal polynomial of $A$ divides $X^n-1$. Provided the characteristic of $k$ does not divide $n$, the derivative $nX^{n-1}$ shares no root with $X^n-1$, so $X^n-1$ has no repeated roots and splits into distinct linear factors over the algebraically closed field $k$. Hence the minimal polynomial of $A$ is also a product of distinct linear factors, and a matrix whose minimal polynomial splits into distinct linear factors is diagonalizable.

See, for example, Theorem 4.11 in this text.
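As a quick numerical illustration (not part of the proof, and assuming `numpy` is available): the rotation matrix $\begin{pmatrix} 0 & -1\\ 1 & 0 \end{pmatrix}$ has order $4$ and diagonalizes over $\mathbb{C}$ with eigenvalues $\pm i$, while the Jordan block from the question has no eigenbasis.

```python
import numpy as np

# A is a rotation by 90 degrees, so A has finite order: A^4 = I.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.allclose(np.linalg.matrix_power(A, 4), np.eye(2))

# Its eigenvalues are i and -i; the eigenvector matrix P is invertible,
# so A = P D P^{-1} with D diagonal, i.e. A is diagonalizable over C.
w, P = np.linalg.eig(A)
D = np.diag(w)
assert np.allclose(P @ D @ np.linalg.inv(P), A)

# The counterexample [[1, 1], [0, 1]] is not of finite order (its powers
# are [[1, m], [0, 1]]), and indeed its eigenvectors do not span C^2:
# the eigenvector matrix returned by eig is numerically singular.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
wB, PB = np.linalg.eig(B)
assert np.linalg.matrix_rank(PB) < 2  # defective: no eigenbasis
```

The first check verifies the hypothesis $A^n=I$, the second exhibits the diagonalization $A=PDP^{-1}$ that the minimal-polynomial argument guarantees.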

    First: Don't you think the OP should think a little for themselves? They will not learn much this way. Second: The point is that there are no repeated linear factors in the minimal polynomial. That's what makes $A$ diagonalizable. – Friedrich Philipp Mar 15 '17 at 21:37
  • First of all, you added that bit about the minimal polynomial to your comment later, so I missed it, as I had already written the answer. Secondly, I don't think the above reasoning was much of an exercise given that one knows the link between diagonalizability and minimal polynomials. Either the OP doesn't know about it, or he didn't think about it. Judging by the OP's question, I believe he already spent some time on the problem and could benefit from an answer. – Mathematician 42 Mar 15 '17 at 21:44
  • In my opinion, mentioning that $p(A) = 0$ would have been enough. But never mind. – Friedrich Philipp Mar 15 '17 at 21:54