
What's the best approach to solve these two tasks? $A$ is an $n\times n$ matrix over $\mathbb{C}$:

  1. If $A^5 = A$, then $A$ is diagonalizable
  2. $A - \overline A^{T}$ is diagonalizable
InsideOut
  • you asked for the "best approach". does this mean you already have a solution and you're looking for a quicker/better solution? – peek-a-boo Jul 20 '19 at 05:40
  • No, I am looking for some general ideas – Quantaurix Jul 20 '19 at 05:43
  • the reason I asked is because I was going to suggest the exact same answer which has been provided, but I was unsure whether or not you have been exposed to the idea of minimal polynomials, and the spectral theorem (I think the answer provided is one of the quickest ways of solving, but it does require some amount of machinery) – peek-a-boo Jul 20 '19 at 05:45
  • The spectral theorem is no problem for unitary matrices, symmetric matrices etc. But yes, I do not know about minimal polynomials. This task is taken from a professor's "exercise catalogue". It may be the case that this task was originally stated when minimal polynomials were treated in class. – Quantaurix Jul 20 '19 at 05:48

1 Answer


If the minimal polynomial of $A$ has distinct roots, then $A$ is diagonalizable. (This is a standard theorem in linear algebra.) So 1 is true: since $A^{5} = A$, the minimal polynomial of $A$ must divide $x^{5} -x = x(x-1)(x+1)(x-i)(x+i)$, which has distinct roots.
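A quick numerical sanity check of this (a sketch, not a proof): any matrix conjugate to a diagonal matrix whose entries are roots of $x^5 - x$, i.e. drawn from $\{0, \pm 1, \pm i\}$, satisfies $A^5 = A$ and is diagonalizable by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a matrix satisfying A^5 = A: conjugate a diagonal matrix whose
# entries are roots of x^5 - x, i.e. drawn from {0, 1, -1, i, -i}.
roots = np.array([0, 1, -1, 1j, -1j])
D = np.diag(rng.choice(roots, size=6))
P = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))
A = P @ D @ np.linalg.inv(P)

# A^5 = A holds, and A is diagonalizable by construction.
assert np.allclose(np.linalg.matrix_power(A, 5), A)
```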

For 2, you don't need the condition $A^{5} = A$. The matrix $B = A -\bar{A}^{T}$ is skew-Hermitian (i.e. it satisfies $\bar{B}^{T} = -B$), so $C = iB$ is Hermitian. Hermitian matrices are always diagonalizable by the spectral theorem, hence so is $B = -iC$.
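The skew-Hermitian argument can be checked numerically (a sketch with a random matrix, not part of the proof): $C = iB$ is Hermitian, `eigh` diagonalizes it unitarily with real eigenvalues, and the same unitary diagonalizes $B = -iC$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Any A gives a skew-Hermitian B = A - conj(A)^T; then C = iB is Hermitian
# and the spectral theorem diagonalizes it with real eigenvalues.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
B = A - A.conj().T
assert np.allclose(B.conj().T, -B)          # B is skew-Hermitian

C = 1j * B
assert np.allclose(C.conj().T, C)           # C is Hermitian

w, U = np.linalg.eigh(C)                    # unitary diagonalization of C
# B = -iC = U diag(-i w) U^*, so the same U diagonalizes B.
assert np.allclose(U @ np.diag(-1j * w) @ U.conj().T, B)
```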


Without mentioning anything about the minimal polynomial, let's try to prove it directly. (I think this proof may be adapted to any matrix with distinct eigenvalues.)

Let's define $V_{\lambda} = \{x\in \mathbb{C}^{n}\,:\, Ax = \lambda x\}$, the eigenspace of $A$ for the eigenvalue $\lambda\in \{0, \pm 1, \pm i\}$. Then diagonalizability of $A$ is equivalent to $$ \mathbb{C}^{n} = \bigoplus_{\lambda} V_{\lambda}. $$ For distinct $\lambda \neq \lambda'$, it is easy to check that $V_{\lambda} \cap V_{\lambda'} = \{0\}$. Hence we only need to show that $\sum_{\lambda} V_{\lambda} = \mathbb{C}^{n}$.

For any given $v \in \mathbb{C}^{n}$, define $$ v_{0} = v - A^{4}v\\ v_{1} = Av +A^{2}v + A^{3}v +A^{4}v\\ v_{-1} = Av - A^{2}v + A^{3}v - A^{4}v\\ v_{i} = Av - iA^{2}v - A^{3}v + iA^{4}v \\ v_{-i} = Av + iA^{2}v - A^{3}v - iA^{4}v. $$ Then $v_{\lambda}\in V_{\lambda}$ by direct computation with $A^{5} = A$, and $$ v= v_{0} + \frac{1}{4} (v_{1}- v_{-1} -iv_{i} + iv_{-i}) $$ proves the desired claim.
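The five formulas above can be verified numerically (a sketch using a matrix with $A^5 = A$ built by conjugating a diagonal of the roots $\{0,\pm 1,\pm i\}$): each $v_\lambda$ is checked to lie in $V_\lambda$, and the recombination identity reproduces $v$.

```python
import numpy as np

rng = np.random.default_rng(2)

# A matrix with A^5 = A: conjugate a diagonal with entries in {0, ±1, ±i}.
roots = np.array([0, 1, -1, 1j, -1j])
P = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = P @ np.diag(roots) @ np.linalg.inv(P)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)

Av = [np.linalg.matrix_power(A, k) @ v for k in range(5)]  # A^0 v, ..., A^4 v
v0  = v - Av[4]
v1  = Av[1] + Av[2] + Av[3] + Av[4]
vm1 = Av[1] - Av[2] + Av[3] - Av[4]
vi  = Av[1] - 1j*Av[2] - Av[3] + 1j*Av[4]
vmi = Av[1] + 1j*Av[2] - Av[3] - 1j*Av[4]

# Each v_lambda lies in the lambda-eigenspace ...
for lam, vec in [(0, v0), (1, v1), (-1, vm1), (1j, vi), (-1j, vmi)]:
    assert np.allclose(A @ vec, lam * vec)

# ... and they recombine to v exactly as in the answer.
assert np.allclose(v, v0 + (v1 - vm1 - 1j*vi + 1j*vmi) / 4)
```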

Seewoo Lee
  • I don't get your solution to 1. Are you claiming $A^5 = A$ implies $A$ has distinct eigenvalues? What about $A = I$? – mathworker21 Jul 20 '19 at 05:41
  • Thanks, but in class we have not learned about minimal polynomials, although I have seen this quite often. Is there another approach? – Quantaurix Jul 20 '19 at 05:42
  • @mathworker21 I think it was just poor choice of wording. It should say "if the minimal polynomial of $A$ splits over $\Bbb{C}$ and all its roots have algebraic multiplicity $1$" – peek-a-boo Jul 20 '19 at 05:43
  • @mathworker21 It is a typo - I mean root, not eigenvalues. Thank you. – Seewoo Lee Jul 20 '19 at 05:43
  • @Quantaurix I added another solution without using the theorem. Check it out! – Seewoo Lee Jul 20 '19 at 05:59
  • @SeewooLee Thanks! Can I ask how you derived the formulas for the $v_{1,2,3,4}$? There is no way I would have come up with this idea. Is it just that you need to incorporate all 5 eigenvectors, the first being $v_0$, so there are 4 left? Using the "cyclic" property of the sum $Av + ...+ A^4v$ you could satisfy both the eigenvalue condition and that all factors cancel out except for $A^4$. I would need a lot of trial and error to get this right. – Quantaurix Jul 20 '19 at 06:26
  • @Quantaurix It is hard to explain, but as you said, there's something "cyclic" going on here. In general, if the equation were $A^{n} = I$ (not $A^{n} = A$, just for convenience), then we can consider vectors like $$v_{j} = \sum_{k} \zeta_{n}^{kj} A^{k}v.$$ I think such an idea can also be generalized to any matrix with distinct eigenvalues, but in that case I'm not sure how to find appropriate $v_{j}$'s. (Also, I edited the answer from $v_{1, 2, 3, 4}$ to $v_{1, -1, i, -i}$.) – Seewoo Lee Jul 20 '19 at 07:40
  • It's been a long time since I learned linear algebra, so I might be remembering something wrong, but maybe you can try to check the cyclic decomposition theorem. – Seewoo Lee Jul 20 '19 at 07:43
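The general construction sketched in the comments for $A^n = I$ can also be checked numerically (a sketch, assuming a matrix built as a conjugated diagonal of $n$-th roots of unity): with $\zeta_n = e^{2\pi i/n}$, each $v_j = \sum_k \zeta_n^{kj} A^k v$ satisfies $Av_j = \zeta_n^{-j} v_j$, since multiplying by $A$ shifts the sum by one step and $A^n = I$ wraps it around.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6

# Matrix with A^n = I: conjugate a diagonal of n-th roots of unity.
zeta = np.exp(2j * np.pi / n)
D = np.diag(zeta ** rng.integers(0, n, size=4))
P = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = P @ D @ np.linalg.inv(P)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# v_j = sum_k zeta^{kj} A^k v is an eigenvector candidate for zeta^{-j}:
for j in range(n):
    vj = sum(zeta ** (k * j) * np.linalg.matrix_power(A, k) @ v
             for k in range(n))
    assert np.allclose(A @ vj, zeta ** (-j) * vj)
```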