1

Say $A$ is an $n \times n$ matrix over the complex numbers such that $A^k = I$ for some positive integer $k$. How do we show that $A$ can be diagonalized?

Also, if $\alpha$ is an element of a field of characteristic $p$, how do we show that the matrix $A = \begin{pmatrix} 1 & \alpha \\ 0 & 1 \end{pmatrix}$ satisfies $A^p = I$ and cannot be diagonalized if $\alpha$ is nonzero?

Please be detailed. I really have no idea how to start on this one.

koobtseej
  • 595

3 Answers

7

Recall the following characterization of diagonalizable matrices. A matrix is diagonalizable over the field $F$ if and only if its minimal polynomial is a product of distinct linear factors over $F$.

Since $A^k = I$, $A$ is a root of the polynomial $p(x) = x^k - 1$. The minimal polynomial $m_A$ of $A$ must divide any polynomial of which $A$ is a root, so $m_A \mid x^k - 1$. Since we are working over $\mathbb{C}$, $x^k - 1$ factors into $k$ monic linear factors, and these factors are distinct because the $k$th roots of unity are $k$ distinct numbers (equivalently, $x^k - 1$ shares no roots with its derivative $kx^{k-1}$). Thus $m_A$ is a product of distinct monic linear factors, which implies that $A$ can be diagonalized.
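Not part of the proof, but as a numerical sanity check we can diagonalize a concrete example with numpy. The cyclic permutation matrix below is my own choice of a matrix satisfying $A^3 = I$:

```python
import numpy as np

# A 3x3 cyclic permutation matrix: A^3 = I, so A is a root of x^3 - 1,
# whose roots (the cube roots of unity) are distinct over C.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

assert np.allclose(np.linalg.matrix_power(A, 3), np.eye(3))

# Diagonalize: the eigenvector matrix V is invertible, so A = V D V^{-1}.
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.linalg.matrix_rank(V) == 3           # eigenvectors span C^3
assert np.allclose(V @ D @ np.linalg.inv(V), A)
```

The eigenvalues returned are exactly the three cube roots of unity, matching the factorization of $x^3 - 1$.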

6

If you have the machinery of Jordan forms and/or minimal polynomials you can settle the questions with those. When working over $\Bbb{C}$ user152558's answer points at a useful direction, and Pedro's answer shows that this won't work over the reals.

Lacking such technology I proffer the following argument. Remember that an $n\times n$ matrix (over a field $K$) can be diagonalized if and only if its eigenvectors span all of $K^n$ or, equivalently, if all the vectors in $K^n$ are linear combinations of eigenvectors.

Over $K=\Bbb{C}$ you can then do the following. Let $x\in K^n$ be arbitrary, and let $\omega=e^{2\pi i/k}$ be a primitive $k$th root of unity. The vectors $$ z_j=\frac1k\left(x+\omega^{-j}Ax+\omega^{-2j}A^2x+\cdots+\omega^{-(k-1)j}A^{k-1}x\right),\quad j=0,1,\ldots,k-1, $$ are then easily seen to be eigenvectors of $A$. Namely (you show this) $$ Az_j=\omega^jz_j. $$

Furthermore, because $\sum_{t=0}^{k-1}\omega^{-jt}=0$ for all $j$ with $0<j<k$ (you show this, too; apply the formula for a geometric sum), we see that $$ x=z_0+z_1+\cdots+z_{k-1}. $$ Therefore all the vectors of $K^n$ are linear combinations of eigenvectors, and we are done.

Note: this argument works whenever the field $K$ has a primitive $k$th root of unity and it is possible to divide by $k$, i.e. the characteristic of $K$ is not a factor of $k$ (actually the latter property follows from the former, but let's skip that).
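The construction above is easy to check numerically. The sketch below (using numpy, with the cyclic permutation matrix as an arbitrary example of a matrix with $A^k = I$) builds the $z_j$ and verifies both claimed identities:

```python
import numpy as np

k = 3
omega = np.exp(2j * np.pi / k)    # primitive kth root of unity

# An example matrix with A^k = I: a cyclic permutation of coordinates.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=complex)

rng = np.random.default_rng(0)
x = rng.standard_normal(3).astype(complex)   # an arbitrary vector

# z_j = (1/k) * sum_{t=0}^{k-1} omega^{-jt} A^t x
z = [sum(omega ** (-j * t) * np.linalg.matrix_power(A, t) @ x
         for t in range(k)) / k
     for j in range(k)]

for j in range(k):
    # Each z_j is an eigenvector (or zero): A z_j = omega^j z_j.
    assert np.allclose(A @ z[j], omega ** j * z[j])

# The geometric-sum identity makes the z_j sum back to x.
assert np.allclose(sum(z), x)
```

As the answer's final comment notes, some $z_j$ may be zero; the decomposition $x = z_0 + \cdots + z_{k-1}$ still holds, with the zero terms simply dropped.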

Your other question follows from the same characterization of diagonalizability. First, induction gives $A^m = \begin{pmatrix} 1 & m\alpha \\ 0 & 1 \end{pmatrix}$, so $A^p = I$ because $p\alpha = 0$ in characteristic $p$. Using the characteristic polynomial $(x-1)^2$ you see that $1$ is the sole eigenvalue. But when $\alpha\neq0$, the matrix $A - I = \begin{pmatrix} 0 & \alpha \\ 0 & 0 \end{pmatrix}$ has rank $1$, so the corresponding eigenspace is $1$-dimensional. Thus not all vectors of $K^2$ are linear combinations of eigenvectors and we are done.
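For concreteness, here is a small check over the prime field $\Bbb{Z}/p\Bbb{Z}$, done with plain modular arithmetic (the choices $p = 5$ and $\alpha = 2$ are arbitrary):

```python
p = 5          # a prime; we work in the field Z/pZ
alpha = 2      # any nonzero element of the field

def matmul2(X, Y, p):
    """Multiply two 2x2 matrices with entries reduced mod p."""
    return [[sum(X[i][t] * Y[t][j] for t in range(2)) % p
             for j in range(2)] for i in range(2)]

A = [[1, alpha], [0, 1]]

# A^m = [[1, m*alpha], [0, 1]], so A^p = I because p*alpha = 0 mod p.
M = [[1, 0], [0, 1]]
for _ in range(p):
    M = matmul2(M, A, p)
assert M == [[1, 0], [0, 1]]

# A - I = [[0, alpha], [0, 0]] is nonzero when alpha != 0, so it has
# rank 1 and the eigenspace of the sole eigenvalue 1 is 1-dimensional.
A_minus_I = [[0, alpha % p], [0, 0]]
assert any(entry != 0 for row in A_minus_I for entry in row)
```

The same computation works for any prime $p$ and any nonzero $\alpha$, since only the identities $m \mapsto m\alpha \bmod p$ are used.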

Jyrki Lahtonen
  • 133,153
  • Getting a strongish deja vu vibe, but didn't find a suitable duplicate, so I'm posting this for now. – Jyrki Lahtonen Dec 12 '14 at 14:23
  • Many of you will recognize discrete Fourier analysis (or representation theory of $C_k$) at work here. – Jyrki Lahtonen Dec 12 '14 at 14:36
  • man, you always come up with nice, different and simple explanation of things. – abel Dec 12 '14 at 14:59
  • @JyrkiLahtonen How do you show that $z_j$ is non-zero for some $x$? – Randall Nov 12 '23 at 16:22
  • @Randall It is perfectly possible that $z_j$ is zero for all $x$. For example, if $A=\omega I_n$, then it has only a single eigenvalue $\omega$, and $\omega^2,\omega^3$ never appear. Meaning that $z_1=x$ for all $x$, and $z_j=0$ for all $x$ when $j\neq1$. Every vector is still the sum of eigenvectors, because we can simply leave out the zero vectors. – Jyrki Lahtonen Nov 12 '23 at 20:58
0

It can be diagonalized over $\Bbb C$, but not always over $\Bbb R$: consider the rotation matrix through the angle $2\pi/3$. It satisfies $A^3 = I$, but its eigenvalues $e^{\pm 2\pi i/3}$ are not real, so it cannot be diagonalized over $\Bbb R$.
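A quick numerical illustration of this counterexample (a sketch using numpy, not part of the argument):

```python
import numpy as np

theta = 2 * np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# R has order 3: rotating three times by 2*pi/3 is the identity.
assert np.allclose(np.linalg.matrix_power(R, 3), np.eye(2))

# Its eigenvalues e^{+-2*pi*i/3} have nonzero imaginary part, so R is
# diagonalizable over C but not over R.
eigvals = np.linalg.eig(R)[0]
assert np.all(np.abs(eigvals.imag) > 0.5)
```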

Pedro
  • 122,002