
This is part $2$ of the question that I am working on.

For part $1$, I showed that the space of $5\times 5$ matrices which commute with a given matrix $B$, over the ground field $\mathbb R$, is a vector space.

But how can I compute its dimension?

Thanks,

3 Answers


Assume $B$ is in Jordan normal form (real; the complex case is left as an exercise). Assume the block $J_i = \lambda_i I + S$ is $n_i \times n_i$ for $1\le i \le p$, where $I$ is the identity and $S$ the shift matrix of suitable size. Then the space of matrices which commute with $B$ has dimension $$ \sum_{\substack{i,j=1 \\ \lambda_i = \lambda_j}}^p \min(n_i, n_j). $$

(I hope I found all cases.)
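As a sanity check, the formula can be compared against the nullity of the commutator map, computed via the Kronecker-product identity in the proof outline. This is a sketch in NumPy/SciPy; the block data below is an arbitrary example, not from the question:

```python
import numpy as np
from scipy.linalg import block_diag

def jordan_block(lam, k):
    """k x k Jordan block: lam on the diagonal, 1 on the superdiagonal."""
    return lam * np.eye(k) + np.eye(k, k=1)

# Arbitrary example: eigenvalue 2 with blocks of sizes 2 and 1,
# eigenvalue 3 with one block of size 2.
blocks = [(2.0, 2), (2.0, 1), (3.0, 2)]
B = block_diag(*[jordan_block(lam, k) for lam, k in blocks])
n = B.shape[0]

# Dimension predicted by the formula above.
predicted = sum(min(ni, nj)
                for li, ni in blocks
                for lj, nj in blocks
                if li == lj)

# Nullity of A -> AB - BA, i.e. n^2 minus the rank of B^T (x) I - I (x) B.
K = np.kron(B.T, np.eye(n)) - np.kron(np.eye(n), B)
nullity = n * n - np.linalg.matrix_rank(K)

print(predicted, nullity)  # 7 7
```

Here the formula gives $\min$-sums $2+1+1+1 = 5$ for the eigenvalue $2$ and $2$ for the eigenvalue $3$, so dimension $7$, matching the numerically computed nullity.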

Notes and proof (outline):

For $k\in\mathbb N$ let $I_k$ denote the $k\times k$ identity matrix and let $S_k$ denote the $k\times k$ shift matrix.

  1. Notice that the operator $C_B(A) = AB - BA$ can be written as matrix vector product using the Kronecker product and the vectorization operator $\operatorname{vec}$: $$ \operatorname{vec} C_B(A) = ((B^T \otimes I) - (I \otimes B)) \operatorname{vec}(A). $$ So, for an explicitly given $B$ you can compute the rank easier.

  2. Also notice that for $B = QJQ^{-1}$, where $Q$ is non-singular, we have $$ \ker C_B = Q\ker C_J Q^{-1}. $$ So we may assume Jordan normal form w.l.o.g.

  3. Let $A\in\mathbb R^{m\times n}$. In case of $m \le n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} 0 & \bar A \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_m, S_m^2, \dotsc, S_m^{m-1} \}. $$ Similarly, in case of $m \ge n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} \bar A \\ 0 \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$

  4. For $B = \lambda I_n + S_n\in\mathbb R^{n\times n}$, where $\lambda\in\mathbb R$, we have $$ \ker C_B = \ker C_S = \operatorname{span}\{ I_n, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$

  5. Let $A = (A_{i,j})_{1\le i,j \le p}$ be a block matrix and $B = \operatorname{diag}(B_1,\dotsc, B_p)$ a block diagonal matrix w.r.t. the same partition and $B_i = \lambda_i I_{n_i} + S_{n_i}$. Then, we have $C_B(A) = 0$ if and only if $$ 0 = A_{i,j} B_j - B_i A_{i,j} = (\lambda_j - \lambda_i) A_{i,j} + (A_{i,j} S_{n_j} - S_{n_i} A_{i,j}) $$ for $1 \le i \le j \le p$. That is, if $A_{i,j}\ne 0$, then $-(\lambda_j - \lambda_i)$ is an eigenvalue of the operator $X\mapsto C_{S_{n_j}, S_{n_i}}(X) := XS_{n_j} - S_{n_i} X$. However, that operator has only the eigenvalue $0$.
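The identity in note 1 can be checked numerically. A sketch with NumPy; note that $\operatorname{vec}$ is column-major stacking, i.e. `order='F'`:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Column-stacking vectorization operator vec.
vec = lambda M: M.flatten(order='F')

# vec C_B(A) on the left, the Kronecker-product form on the right.
lhs = vec(A @ B - B @ A)
rhs = (np.kron(B.T, np.eye(n)) - np.kron(np.eye(n), B)) @ vec(A)

print(np.allclose(lhs, rhs))  # True
```

This follows from $\operatorname{vec}(AXB) = (B^T \otimes A)\operatorname{vec}(X)$ applied to $AB = IAB$ and $BA = BAI$.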

user251257

There is a theorem by Frobenius:

Let $A\in {\rm M}(n,F)$ with $F$ a field, and let $d_1(\lambda),\ldots,d_s(\lambda)$ be the invariant factors $\neq 1$ of $\lambda I-A$; let $N_j=\deg d_j(\lambda)$. Then the dimension of the vector space of matrices that commute with $A$ is $$N=\sum_{j=1}^s (2s-2j+1)N_j.$$
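Two quick checks of this formula, with the commutant dimension computed numerically as the nullity of the commutator map (a sketch; the matrices are illustrative examples). For $B=\operatorname{diag}(1,2,3,4,5)$ the only invariant factor $\neq 1$ is the characteristic polynomial, so $s=1$, $N_1=5$, and $N=(2\cdot 1-2\cdot 1+1)\cdot 5=5$. For $B=I_5$ there are $s=5$ invariant factors $\lambda-1$, each of degree $1$, so $N=\sum_{j=1}^5(2\cdot 5-2j+1)=9+7+5+3+1=25$.

```python
import numpy as np

def commutant_dim(B):
    """dim of {A : AB = BA}, via the nullity of B^T (x) I - I (x) B."""
    n = B.shape[0]
    K = np.kron(B.T, np.eye(n)) - np.kron(np.eye(n), B)
    return n * n - np.linalg.matrix_rank(K)

print(commutant_dim(np.diag([1.0, 2, 3, 4, 5])))  # 5: one invariant factor, degree 5
print(commutant_dim(np.eye(5)))                   # 25: every matrix commutes with I
```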

Pedro

Alternative approach. The space you are looking at is the kernel of the commutator map at $B$ $$\mathrm{Mat}_{5×5}(ℝ) → \mathrm{Mat}_{5×5}(ℝ),~A ↦ AB - BA,$$ which is $ℝ$-linear. You can easily compute the rank of this map by throwing in a basis of $\mathrm{Mat}_{5×5}(ℝ)$ and computing the dimension of the space generated by the images. Then you can use the rank–nullity theorem. So it comes down to computing the dimension of the linear hull of some system of matrices.
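A minimal sketch of this computation in NumPy, using $B=\operatorname{diag}(1,2,3,4,5)$ as a stand-in example (the question's $B$ is not given): apply the map to each standard basis matrix $E_{ij}$, stack the vectorized images, and read off the rank.

```python
import numpy as np

B = np.diag([1.0, 2, 3, 4, 5])  # stand-in for the question's matrix B
n = B.shape[0]

# Image of each standard basis matrix E_ij under A -> AB - BA, one per row.
images = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        images.append((E @ B - B @ E).ravel())

rank = np.linalg.matrix_rank(np.array(images))
print(n * n - rank)  # 5: by rank-nullity, only the diagonal matrices commute
```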

k.stm
  • Hi @k.stm, I'm a little confused -- do you mean that I should compute the null space of AB-BA, so try to solve (AB-BA)x=0 for all x? If so, how does that help, when we are really trying to find all matrices A such that AB = BA? Thanks, – User001 Oct 12 '15 at 07:30
  • @LebronJames No, forget for a second that the matrices in $\mathrm{Mat}_{5×5}(ℝ)$ represent linear maps themselves. Just write $V = \mathrm{Mat}_{5×5}(ℝ)$ and view it only as a vector space. You are interested in the kernel of the endomorphism $V → V,~A ↦ AB - BA$ (since the elements in it are exactly the matrices $A$ which fulfill $AB - BA = 0$, that is $AB = BA$). Compute the rank of this endomorphism and use the rank–nullity theorem to get to the dimension of its kernel. – k.stm Oct 12 '15 at 07:44
  • Hmm... @k.stm, just curious, is this utilizing something outside / more advanced than linear algebra? Is it abstract algebra stuff? Or, would you say this is pretty standard technique? I have never heard of an endomorphism. I will definitely look it up now though - probably start with Wikipedia. Thanks, – User001 Oct 12 '15 at 07:51
  • @LebronJames It is standard linear algebra, even the word “endomorphism” is used a lot in linear algebra as well. Here it is just a fancy word for a linear map of a vector space to itself (so $V → V$ rather than $V → W$ for some different vector space $W$). – k.stm Oct 12 '15 at 08:28
  • Is there a simple answer for the general case? Even for a symmetric $B$ there is a stark difference between say $B = \operatorname{Id}$ and $B=\operatorname{diag}(1,2,3,4,5)$. – user251257 Oct 12 '15 at 14:14
  • Ok, got it -- thanks for your answer and additional comments @k.stm :-) – User001 Oct 13 '15 at 00:52
  • How would you compute the dimension of the image space? I find it not that simple. – Bach Jul 25 '19 at 06:48
  • @Bach As told in my answer, you throw in the standard basis for the $2×2$-matrices. Computing the dimension for the linear hull of some given vectors is a standard task in linear algebra. – k.stm Jul 25 '19 at 07:32
  • @k.stm Why $2\times 2$? Is that a typo? But I have tried to compute the image of basis and it turns out to be very complicated... Do you have any idea how to simplify the computation? Otherwise, it may be the same difficulty as to directly compute the null space. – Bach Jul 25 '19 at 07:57
  • @Bach Yeah, I meant $5×5$. If $B$ is diagonalizable or admits any other nice normal form, you may use this instead of $B$, as the kernel of the commutator stays the same after conjugation. That might be nicer to compute. It depends on your matrix $B$, really. – k.stm Jul 25 '19 at 09:56
  • @k.stm I see. I have the same idea as yours(to apply the rank-nullity theorem), but it highly depends on the matrix $B$ +1. – Bach Jul 25 '19 at 10:51