
I'm trying to learn linear and abstract algebra on my own and have been attempting textbook exercises and problem sets I find online. I've been doing okay so far but I found this problem and I'm having a lot of trouble with it:

Let $A$ be an $n \times n$ complex matrix.

a) Show that the set of matrices commuting with $A$ is a subspace.

b) What is the dimension of this subspace?

I think I got the first part. It wasn't that bad. But I'm having trouble with the second part. I feel like this is supposed to be an easy question, but I just don't know how to start it.

I was thinking about using Jordan form somehow. If $A \sim J_A$ and $B \sim J_B$, is it true that $J_A J_B = J_B J_A$ implies $AB = BA$? If so, we would only have to look at the Jordan blocks of these and see when those commute with each other. Then the problem wouldn't be so bad, I think.

I'd love some hints.

  • Perhaps this may be helpful? Though it concerns a more specific case. – White Shirt Mar 20 '15 at 04:56
  • Supposing $B$ commutes with $A$, write out the matrix representation of each and compute $AB = BA$ in terms of their $[a_{ij}]$ and $[b_{ij}]$ representations. This will probably yield a constraint on the values that the different $b_{ij}$ will have so there will probably be fewer than $n^2$ independent entries. – Mnifldz Mar 20 '15 at 04:58
  • The dimension will depend a lot on $A.$ For a simple example: if $A = kI$ is a scalar matrix, then every matrix commutes with it. – abel Mar 20 '15 at 05:07
  • No, commutativity of Jordan forms does not imply commutativity of matrices: For any matrix with distinct eigenvalues, the Jordan form is diagonal, and any two diagonal matrices commute. Since diagonalizable matrices are dense in the set of all $n \times n$ matrices, the commutator $(A, B) \mapsto AB - BA$ would hence be zero on a dense subset of matrices and by continuity zero everywhere. But, of course, not all matrices commute, so this cannot be the case. – Travis Willse Mar 20 '15 at 05:10
  • On the other hand, the dimension of the space of matrices that commutes with a given matrix is invariant under similarity, so you may as well assume that $A$ is in Jordan Normal Form. – Travis Willse Mar 20 '15 at 05:27
  • @Travis Please give me a hint on why diagonalizable matrices are dense in the set of all $n \times n$ matrices. – Sry Mar 20 '15 at 06:20
  • @Sry In fact, matrices with distinct eigenvalues are already dense. You can see that by adding a small noise to all entries. This results in some noise being applied to the characteristic polynomial, and makes it very likely (in fact, almost surely if you choose a non-degenerate noise distribution) that its roots are distinct. – Yuval Filmus Mar 20 '15 at 06:23
  • @Sry In case it's not clear, the point of the comment was simply that one cannot assume both matrices are in Jordan form. There are several arguments; one intuitive way to do this is to take a nondiagonalizable matrix, which we may as well put in Jordan form, then perturb the diagonal entries in a way that the diagonal entries are all distinct (and we can always achieve this with an arbitrarily small perturbation). The resulting matrix is upper triangular, hence its eigenvalues are its diagonal entries. But these are all distinct by construction, so the resulting matrix is diagonalizable. – Travis Willse Mar 21 '15 at 08:22
  • (In order to use this method, one needs to show that one has some rudimental control over the change in norm imposed by the similarity transformation to Jordan form, but this isn't so bad if you're used to working with matrix norms.) – Travis Willse Mar 21 '15 at 08:24
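Travis's point that the commutant dimension is similarity-invariant can be checked numerically. The following is a minimal sketch of my own (not from the thread), computing the dimension of $\{B : AB = BA\}$ as the nullity of the linear map $B \mapsto AB - BA$, whose matrix on the column-stacked vector $\operatorname{vec}(B)$ is $I \otimes A - A^{T} \otimes I$:

```python
import numpy as np

def commutant_dim(A):
    """Dimension of {B : AB = BA}, i.e. the nullity of the linear
    map B -> AB - BA acting on n x n matrices."""
    n = A.shape[0]
    # vec(AB - BA) = (kron(I, A) - kron(A.T, I)) vec(B)  (column-stacked vec)
    L = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
    return n * n - np.linalg.matrix_rank(L)

rng = np.random.default_rng(0)
n = 4
A = np.diag([1.0, 1.0, 2.0, 3.0])   # commutant dimension 2^2 + 1 + 1 = 6
P = rng.standard_normal((n, n))     # almost surely invertible
B = P @ A @ np.linalg.inv(P)        # similar to A
print(commutant_dim(A), commutant_dim(B))  # both 6
```

The eigenvalues of the commutator map are the differences $\lambda_i - \lambda_j$, which is why the nullity only depends on the eigenvalue multiplicities and not on the choice of basis.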

2 Answers


As Travis mentions, you can assume without loss of generality that $A$ is in Jordan form. The simplest case is that $A$ is diagonal, say with entries $\lambda_1,\ldots,\lambda_n$ on the diagonal. Let $B = (b_{ij})$ be a matrix. Then $B$ commutes with $A$ if and only if $\lambda_i b_{ij} = \lambda_j b_{ij}$ for all $i,j$, as a simple calculation shows. It is instructive to write this as $(\lambda_i - \lambda_j) b_{ij} = 0$, which holds exactly when $\lambda_i = \lambda_j$ or $b_{ij} = 0$.

Suppose now that the distinct eigenvalues are $\mu_1,\ldots,\mu_m$, and that $\mu_k$ occupies the positions $I_k \subseteq \{1,\ldots,n\}$. If $i,j \in I_k$ then the condition above becomes $(\mu_k - \mu_k) b_{ij} = 0$, which always holds, whereas if $i \in I_k$ and $j \in I_\ell$ for $\ell \neq k$ then the condition $(\mu_k - \mu_\ell) b_{ij} = 0$ forces $b_{ij} = 0$. As a consequence, we deduce that the dimension is exactly $\sum_{k=1}^m |I_k|^2$.
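As a sanity check on this formula (my own illustration, not part of the original answer), one can compute the nullity of $B \mapsto AB - BA$ directly for a diagonal matrix and compare it with $\sum_k |I_k|^2$:

```python
import numpy as np
from collections import Counter

diag = [5.0, 5.0, 5.0, 7.0, 7.0, 9.0]   # eigenvalue multiplicities 3, 2, 1
A = np.diag(diag)
n = len(diag)

# Matrix of the map B -> AB - BA acting on column-stacked vec(B)
L = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
nullity = n * n - np.linalg.matrix_rank(L)

# The claimed dimension: sum of squared multiplicities, 9 + 4 + 1 = 14
formula = sum(m * m for m in Counter(diag).values())
print(nullity == formula)  # True
```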

I'll leave the more general case to you.

Yuval Filmus
  • Nice work for the diagonal matrices, but I don't think the general case would be as easy as this. – Bach Jul 25 '19 at 07:59

I collected several equivalent conditions at Given a matrix, is there always another matrix which commutes with it?

Of the conditions there, the following is probably the simplest way to put it. Every square matrix has a characteristic polynomial and a minimal polynomial.

Next, given a matrix $A,$ we know that $A$ commutes with $I, A, A^2, A^3, \ldots$; indeed, $A$ commutes with any polynomial in $A.$ By Cayley-Hamilton, any such polynomial can always be re-written as $$ a_0I + a_1 A + a_2 A^2 + \cdots + a_{n-1} A^{n-1} $$

THEOREM: if the characteristic and minimal polynomials of $A$ are the same, then any matrix that commutes with $A$ can be written as a polynomial in $A.$ The set of those is of dimension $n.$
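Here is a quick numerical check of the theorem (my own sketch, not from the answer): a single nilpotent Jordan block has characteristic and minimal polynomial both equal to $x^n$, so its commutant should have dimension exactly $n$.

```python
import numpy as np

n = 4
J = np.diag(np.ones(n - 1), k=1)   # n x n nilpotent Jordan block
# Nullity of B -> JB - BJ, via its matrix on column-stacked vec(B)
L = np.kron(np.eye(n), J) - np.kron(J.T, np.eye(n))
dim = n * n - np.linalg.matrix_rank(L)
print(dim)  # 4, as the theorem predicts
```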

If the minimal polynomial is different from the characteristic polynomial, the dimension goes up, because all polynomials in $A$ still commute with $A,$ but now there are more. It has already been mentioned that when $A=I,$ everything commutes with it, so the dimension is large, $n^2.$ Here is a middle case you should do by hand: what matrices commute with $$ A = \left( \begin{array}{ccc} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{array} \right) ? $$

What I mean by doing by hand is to write out $AB$ and $BA,$ with $$ B = \left( \begin{array}{ccc} a & b & c \\ d & e & f \\ g & h & i \end{array} \right) $$ and find all conditions necessary about the nine variables $a,b,c,d,e,f,g,h,i.$ These will be linear equations, overall a linear system.
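The hand computation above can be mirrored symbolically; the sketch below (my own, using sympy) solves the entries of $AB - BA = 0$ as a linear system. Four of the nine variables are forced to zero, leaving a space of dimension $5 = 1^2 + 2^2$:

```python
import sympy as sp

a, b, c, d, e, f, g, h, i = syms = sp.symbols('a b c d e f g h i')
A = sp.diag(1, 2, 2)
B = sp.Matrix([[a, b, c], [d, e, f], [g, h, i]])

# Each entry of AB - BA must vanish: a linear system in the nine variables
sol = sp.solve(list(A * B - B * A), syms, dict=True)[0]
print(sol)           # {b: 0, c: 0, d: 0, g: 0}
print(9 - len(sol))  # dimension 5
```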

Will Jagy