14

Let $M_n(\mathbb{C})$ denote the vector space over $\mathbb{C}$ of all $n\times n$ complex matrices. Prove that if $M$ is a complex $n\times n$ matrix then $C(M)=\{A\in M_n(\mathbb{C}) \mid AM=MA\}$ is a subspace of dimension at least $n$.

My Try:

I proved that $C(M)$ is a subspace, but how can I show that its dimension is at least $n$? I have no idea how to do it. I found similar questions posted on MSE but could not find a clear answer, so please do not mark this as a duplicate.

Can somebody please help me with this?

EDIT: None of the given answers were clear to me. I would appreciate it if somebody checked my attempt below:

If $J$ is a Jordan canonical form of $M$, then they are similar, and similar matrices should have centralizers of the same dimension. So if $C(J)$ has dimension at least $n$, then so does $C(M)$. Am I correct?

Extremal
  • 5,785

5 Answers

4

Your approach is correct. Let $PMP^{-1}=J$ with $P$ invertible and $J$ a Jordan form. Then $$ \begin{align} AM=MA &\Longleftrightarrow PAMP^{-1}=PMAP^{-1} \\ &\Longleftrightarrow PAP^{-1} PMP^{-1} = PMP^{-1} PAP^{-1}\\ &\Longleftrightarrow PAP^{-1} J= J PAP^{-1}. \end{align} $$ Thus the map $$ \phi_P : C(M) \rightarrow C(J), \qquad \phi_P(A) = PAP^{-1}, $$ is an invertible linear transformation with inverse $\phi_P^{-1}(B)= P^{-1}BP$. So $C(M)$ and $C(J)$ are isomorphic via $\phi_P$, and therefore have the same dimension over $\mathbb{C}$.

So if you prove that $C(J)$ has dimension at least $n$, the same is true for $C(M)$.
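As a numerical sanity check of this invariance (a sketch in NumPy; the matrices $M$ and $P$ below are my own ad hoc choices), one can compute $\dim C(M)$ as the nullity of the commutator map $A \mapsto AM - MA$ represented via Kronecker products, and confirm that conjugation preserves it:

```python
import numpy as np

# dim C(M) = nullity of the n^2 x n^2 matrix of A -> AM - MA, using the
# (column-major) identity vec(AM - MA) = (M^T kron I - I kron M) vec(A).
def centralizer_dim(M, tol=1e-9):
    n = M.shape[0]
    I = np.eye(n)
    K = np.kron(M.T, I) - np.kron(I, M)
    s = np.linalg.svd(K, compute_uv=False)
    return int(np.sum(s < tol))

# A Jordan-type matrix and a conjugate of it: same centralizer dimension.
M = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # invertible (det = 3)
Mc = P @ M @ np.linalg.inv(P)
print(centralizer_dim(M), centralizer_dim(Mc))  # prints: 3 3
```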

Alternatively, as I commented last year, you can also use the general formula given in Centralizer of a Matrix. This gives the following info:

Let $\mathbb{F}$ be a field and $M\in M_n(\mathbb{F})$. Denote by $C(M)=\{A\in M_n(\mathbb{F}) \mid AM=MA\}$ the centralizer of $M$. The dimension of $C(M)$ over $\mathbb{F}$ is given by $$ \mathrm{dim}_{\mathbb{F}} C(M) = \sum_p (\mathrm{deg}(p))\sum_{i,j} \min (\lambda_{p,i}, \lambda_{p,j}), $$ where $p$ runs over the irreducible polynomials dividing the characteristic polynomial of $M$, and $\lambda_p= \sum_i \lambda_{p,i}$ is the exponent of $p$ in the characteristic polynomial of $M$. Here, $\lambda_{p,i}$ are the powers of $p$ in the primary decomposition of the $\mathbb{F}[x]$-module $\mathbb{F}^n$ on which $x$ acts by left multiplication by $M$.

By taking $i=j$ only in the double sum, we obtain that $$ \mathrm{dim}_{\mathbb{F}}C(M) \geq \sum_p (\mathrm{deg}(p)) \sum_i \lambda_{p,i} = n. $$

Equality occurs if and only if the minimal polynomial and the characteristic polynomial of $M$ coincide. Moreover, in this case, $$ C(M)=\{f(M) \mid f\in \mathbb{F}[x]\} = \mathrm{span}_{\mathbb{F}} \{ I, M, \ldots , M^{n-1}\}. $$
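To illustrate the strict-inequality case of the formula (a hedged numerical sketch; $M=\operatorname{diag}(1,2,2)$ is my own example): here the minimal polynomial $(x-1)(x-2)$ differs from the characteristic polynomial $(x-1)(x-2)^2$, and the formula predicts $\dim C(M) = 1 + 4 = 5 > 3$.

```python
import numpy as np

# M = diag(1,2,2): eigenvalue 1 contributes min(1,1) = 1; eigenvalue 2 has
# two 1x1 blocks contributing min(1,1) four times; so dim C(M) = 5 > n = 3.
n = 3
M = np.diag([1.0, 2.0, 2.0])
K = np.kron(M.T, np.eye(n)) - np.kron(np.eye(n), M)
s = np.linalg.svd(K, compute_uv=False)
dim_C = int(np.sum(s < 1e-9))
print(dim_C)  # prints: 5
```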

Sungjin Kim
  • 20,102
2

HINT: A square matrix $A$ over a field $F$ commutes with every $F$-linear combination of non-negative powers of $A$.

That is, for every $a_0$, $\dots$ ,$a_n \in F$,

$$A(\sum_{k=0}^n a_kA^k) = \sum_{k=0}^n a_k A^{k+1} = (\sum_{k=0}^n a_k A^k) A.$$
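A quick check of this hint in NumPy (a sketch; the matrix and the polynomial are arbitrary choices of mine):

```python
import numpy as np

# Any polynomial in A commutes with A; here p(x) = 2x^3 - x + 5.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
p_of_A = 2 * np.linalg.matrix_power(A, 3) - A + 5 * np.eye(4)
print(np.allclose(A @ p_of_A, p_of_A @ A))  # prints: True
```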

GAVD
  • 7,296
  • 4
But the subspace $\Bbb{C}[M]$ of $C(M)$ spanned by non-negative powers of $M$ may have dimension lower than $n$; it is in fact equal to the degree of the minimal polynomial, which need not be $n$. How do we guarantee that there is more room in $C(M)$? – Sangchul Lee Jul 31 '15 at 04:26
I am not very clear on what you are trying to conclude from the commutativity. I thought you meant that the $A^k$ form a basis, but maybe I am wrong. Can you please elaborate? – Empiricist Jul 31 '15 at 04:30
@SangchulLee: You are right, the dimension of the subspace $\mathbb{C}[M]$ equals the degree of the characteristic polynomial of the matrix $M$. – GAVD Jul 31 '15 at 04:32
@GAVD, I mean, the minimal polynomial may have degree less than the characteristic polynomial as in the case $$ M = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix}. $$ In this case $M^2 - 3M + 2I = 0$ and $\Bbb{C}[M]$ has dimension 2. – Sangchul Lee Jul 31 '15 at 04:35
@SRX: the idea is simple: the dimension of $C(M)$ is $n$ when the minimal polynomial and the characteristic polynomial of $M$ coincide. Otherwise, the dimension of $C(M)$ is greater than $n$. – GAVD Jul 31 '15 at 04:36
  • 1
    I guess that we can apply this argument block-wise to the Jordan form of $M$ to reach the conclusion, since each block in the Jordan form is either scalar multiple of identity matrix or something whose minimal polynomial coincides with their characteristic polynomial. – Sangchul Lee Jul 31 '15 at 04:42
  • 1
That may be over-complicated, but a density argument could work. Matrices with $n$ distinct eigenvalues are dense in $M_n (\mathbb{C})$, so approximate $A$ by a sequence of such matrices $(A_k)$. This argument shows that $\dim C(A_k) \geq n$, and by continuity any limit point of a sequence of matrices in $C(A_k)$ belongs to $C(A)$. – D. Thomine Aug 23 '19 at 14:25
1

Observe that it is enough to prove the statement for the Jordan form of $M$; moreover, since block-diagonal matrices built from the centralizers of the individual blocks commute with the whole form, it suffices to prove it block by block.

Let $J$ be an $n\times n$ Jordan block whose diagonal entries are $\lambda$, and let $U$ be the matrix whose entries are $1$ on the superdiagonal and $0$ elsewhere. Then $\{U^0, U^1, \dots, U^{n-1}\}$, where $U^0$ is the identity matrix, is a linearly independent set of $n$ elements in $C(J)$. So $\dim(C(J))\geq n$.


The linear independence is clear, and the commutativity can be justified in the following two ways:

  1. Note that for $1\leq k\leq n-1$, $U^k = (J-\lambda I)^k$ of course commutes with $J$.
  2. Direct computation shows that for $1\leq k\leq n-1$ $$ U^k J = JU^k = \begin{pmatrix} 0 & J_{n-k} \\ 0 & 0\end{pmatrix} $$ where $J_{n-k}$ is the $(n-k)\times(n-k)$ Jordan block whose diagonal entries are $\lambda$.

I did this problem with the second approach and, after reading GAVD's hint, realized that the first approach is the reason why we have the commutativity.
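This is easy to verify numerically (a sketch for a single $4\times 4$ Jordan block of my own choosing):

```python
import numpy as np

# J = lam*I + U with U the superdiagonal shift; U^0, ..., U^{n-1} commute
# with J and are linearly independent, so dim C(J) >= n.
n, lam = 4, 2.0
U = np.eye(n, k=1)
J = lam * np.eye(n) + U
powers = [np.linalg.matrix_power(U, k) for k in range(n)]
assert all(np.allclose(P @ J, J @ P) for P in powers)
# Stacking the flattened powers gives a rank-n matrix, hence independence.
rank = np.linalg.matrix_rank(np.stack([P.ravel() for P in powers]))
print(rank)  # prints: 4
```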

WLOG
  • 1,286
0

Let $K$ be a field, $A\in M_n(K)$ and $C(A)$ be the commutant of $A$. Clearly, if $A,B$ are similar, then $\dim(C(A))=\dim(C(B))$.

Definition. $U\in M_n(K)$ is said to be cyclic if there is $u\in K^n$ s.t. $\{u,Uu,\cdots,U^{n-1}u\}$ is a basis of $K^n$. Note that, when $U$ is cyclic, the matrices $I,U,\cdots,U^{n-1}$ are linearly independent and, therefore, $\dim(C(U))\geq n$.

The key is the following proposition.

Proposition. If $A\in M_n(K)$ is not cyclic, then there are two complementary proper subspaces of $K^n$ that are $A$-invariant.

Proof. Let $m_A=p_1^{u_1}\cdots p_k^{u_k}$ be the decomposition of the minimal polynomial of $A$ into irreducibles. If $k>1$ then, by the kernel decomposition (primary decomposition) theorem, $K^n=\oplus_i \ker(p_i^{u_i}(A))$ and we are done. So we may assume that $m_A=p^u$ where $p$ is irreducible of degree $d$. If $\mathcal{B}$ is a basis of $K^n$, then there is $e\in\mathcal{B}$ whose minimal polynomial is $m_A$; thus $\{e,Ae,\cdots,A^{ud-1}e\}$ is a linearly independent set and spans $E_1$, the first $A$-invariant subspace. Note that, if $A$ is not cyclic, then $ud<n$.

EDIT. Now $A$ induces an endomorphism of $K^n/E_1$, which can be represented (as a first step) by $\pi\circ A_{|U}$, where $K^n=E_1\oplus U$ and $\pi$ is the associated projection onto $U$. Let $m_1=p^v$, $v\leq u$, be its minimal polynomial; we consider a basis $\mathcal{B}_1$ of $\ker(m_1(A))$. By construction, there is $e_1\in \mathcal{B}_1$ s.t. $m_1$ is its minimal polynomial and s.t. $\{e_1,Ae_1,\cdots,A^{vd-1}e_1\}$ spans $E_2$, a second $A$-invariant subspace whose sum with $E_1$ is direct; and so on...

Corollary. $\dim(C(A))\geq n$.

Proof. We proceed by induction on $n$. If $A$ is cyclic then, by the remark above, we are done. Otherwise, by the Proposition, there is $p\in \left\{1,\cdots,n-1\right\}$ s.t. $A$ is similar to a matrix of the form $B=\operatorname{diag}(U,V)$ where $U\in M_p(K)$, $V\in M_{n-p}(K)$. Note that $C_1=\{\operatorname{diag}(X,Y)\mid X\in M_p(K),Y\in M_{n-p}(K),UX=XU,YV=VY \}$ is a subspace of $C(B)$. By the induction hypothesis, $\dim(C_1)\geq p+(n-p)=n$ and we are done.
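The induction step can be illustrated numerically (a minimal sketch; the blocks $U$, $V$, $X$, $Y$ below are my own toy choices): any $\operatorname{diag}(X,Y)$ with $X\in C(U)$ and $Y\in C(V)$ commutes with $B=\operatorname{diag}(U,V)$.

```python
import numpy as np

# B = diag(U, V); diag(X, Y) with XU = UX and YV = VY lies in C(B),
# so dim C(B) >= dim C(U) + dim C(V).
U = np.array([[0.0, 1.0],
              [0.0, 0.0]])          # 2x2 nilpotent block
V = np.array([[3.0]])               # 1x1 block
B = np.block([[U, np.zeros((2, 1))],
              [np.zeros((1, 2)), V]])
X = np.eye(2) + 4 * U               # a polynomial in U, so XU = UX
Y = 7 * np.eye(1)                   # everything commutes in M_1(K)
D = np.block([[X, np.zeros((2, 1))],
              [np.zeros((1, 2)), Y]])
print(np.allclose(B @ D, D @ B))  # prints: True
```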

  • Nice approach, but I don't quite see what you mean by "we seek $e_1 \in \mathcal{B}_1 \setminus E_1$" (typos?). Anyway, you seem to be implicitly constructing the rational canonical form in this argument... – darij grinberg Aug 21 '19 at 21:12
@darij grinberg, i) some elements of $E_1$ may have $m_1$ as minimal polynomial; then we must add the condition $e_1\notin E_1$. ii) the RCF is underlying because I wanted to show the result over any field $K$; of course, we can also show the result over an algebraic closure of $K$ (using the Jordan form) and then show that $\dim(C(A))$ does not depend on the field containing the entries of $A$. –  Aug 23 '19 at 10:48
  • Okay, but then why will $E_2$ be a direct sum with $E_1$ ? Couldn't some linear combination of $A^i e_1$'s fall into $E_2$ even if $e_1$ itself does not? – darij grinberg Aug 23 '19 at 11:13
@darij grinberg, of course $e_1\notin E_1$ does not suffice to obtain the considered direct sum; yet, since $m_1$ is the minimal polynomial of the endomorphism induced by $A$ on $K^n/E_1$, there is $e_1$, with minimal polynomial $m_1$, s.t. the associated $E_2$ is a direct sum with $E_1$. –  Aug 23 '19 at 14:08
-1

In the case that $M$ has $n$ distinct eigenvalues, there is a fairly straightforward argument. For each eigenvalue $\lambda$, there are vectors $v_L$ and $v_R$ with $v_L^T M = \lambda v_L^T$ and $M v_R = \lambda v_R$ (the left and right eigenvectors). Construct the $n\times n$ rank-one matrix $C_\lambda = v_R v_L^T$: all of its columns are scalar multiples of $v_R$ and all of its rows are scalar multiples of $v_L^T$.

Therefore $M C_\lambda = \lambda C_\lambda = C_\lambda M$, so $C_\lambda \in C(M)$. Since right eigenvectors for distinct eigenvalues are linearly independent, the $n$ matrices $C_\lambda$ are linearly independent, giving $\dim C(M) \geq n$.

I'd expect to be able to make good headway working with generalised eigenvectors.
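A numerical sketch of this construction (my own toy example, with the convention $M v_R = \lambda v_R$, $v_L^T M = \lambda v_L^T$, and $C_\lambda = v_R v_L^T$):

```python
import numpy as np

# Right eigenvectors are the columns of P, left eigenvectors the rows of
# P^{-1}; the rank-one matrix C = v_R v_L^T satisfies M C = C M = lam*C.
M = np.array([[2.0, 1.0],
              [0.0, 5.0]])                 # distinct eigenvalues 2 and 5
w, P = np.linalg.eig(M)
L = np.linalg.inv(P)
for k, lam in enumerate(w):
    C = np.outer(P[:, k], L[k, :])
    assert np.allclose(M @ C, lam * C)
    assert np.allclose(C @ M, lam * C)
print("ok")
```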

user24142
  • 3,732