Let $A$ be a fixed $n\times n$ matrix over a field $F$. We can look at the subspace $$W=\{X\in M_{n,n}(F); AX=XA=0\}$$ consisting of all matrices that satisfy both $AX=0$ and $XA=0$.
Looking at these equations, we see that every column $\vec c$ of $X$ has to satisfy $A\vec c=\vec 0$. (Let us say we are working with column vectors.) Similarly, every row $\vec r^T$ of $X$ has to satisfy $\vec r^T A=\vec 0^T$. This tells us that the possible columns/rows of the matrix $X$ must lie in a subspace of dimension $n-\operatorname{rank}A$ (the right/left null space of $A$).
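As a quick sanity check, here is a small NumPy sketch (the example matrix and the tolerance are my own arbitrary choices, not part of the question) verifying that a matrix whose columns lie in the right null space and whose rows lie in the left null space satisfies both conditions:

```python
import numpy as np

# An arbitrary singular example matrix (rank 2, so n - rank A = 1).
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 1., 1.]])

U, s, Vt = np.linalg.svd(A)
small = s < 1e-10
C = Vt[small].T      # columns span the right null space {c : A c = 0}
R = U[:, small]      # columns span the left null space  {r : r^T A = 0}

# Build X with columns in span(C) and rows in span(R^T); both products vanish.
X = C @ np.random.randn(C.shape[1], R.shape[1]) @ R.T
print(np.allclose(A @ X, 0), np.allclose(X @ A, 0))  # True True
```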
In some cases it is possible to find $W$, or at least $\dim W$, almost immediately.
- Obviously, if $A$ is invertible, then $W=\{0\}$ and $\dim W=0$.
- Another trivial case is when $A=0$, which gives us $W=M_{n,n}$ and $\dim W=n^2$.
- A slightly less trivial but still simple case is when $\operatorname{rank} A=n-1$. In this case the conditions on rows/columns give us one-dimensional spaces, so there are non-zero vectors $\vec r$, $\vec c$ such that each row has to be a multiple of $\vec r^T$ and each column has to be a multiple of $\vec c$. Up to a scalar multiple, there is only one way to get such a matrix, and we find that $W$ is generated by the matrix $\vec c\vec r^T$ and $\dim W=1$. (A numerical sketch of this case follows the list.)
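Here is the promised sketch of the rank $n-1$ case (again NumPy, with an arbitrary test matrix of my own choosing): the null spaces are one-dimensional, and the outer product of the two null vectors satisfies both conditions.

```python
import numpy as np

# An arbitrary 3x3 matrix of rank n - 1 = 2, for illustration only.
A = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [1., 1., 0.]])

U, s, Vt = np.linalg.svd(A)
c = Vt[-1]          # right null vector: A c = 0
r = U[:, -1]        # left null vector:  r^T A = 0
X = np.outer(c, r)  # the candidate generator c r^T of W
print(np.allclose(A @ X, 0), np.allclose(X @ A, 0))  # True True
```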
The general case seems to be a bit more complicated. If we denote $k=n-\operatorname{rank}A$, we can use the same argument to see that there are $k$ linearly independent vectors $\vec c_1,\dots,\vec c_k$ such that the columns have to be linear combinations of these vectors. Similarly, each row can be chosen only from the span of the linearly independent vectors $\vec r_1,\dots,\vec r_k$. (This is again just a direct consequence of $A\vec c=\vec 0$ and $\vec r^TA=\vec 0^T$.)
Using these vectors we can form the $k^2$ matrices $$A_{ij}=\vec c_i \vec r_j^T$$ for $i,j\in\{1,2,\dots,k\}$. Unless I missed something, showing that these matrices are linearly independent is not too difficult. So we should get $$\dim W \ge k^2 = (n-\operatorname{rank}A)^2.$$ It is not obvious to me whether these matrices actually generate $W$. (And perhaps something can be said about the dimension of $W$ without exhibiting a basis.)
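The linear independence claim is easy to test numerically: flatten the $k^2$ matrices $A_{ij}$ into vectors and compute the rank of the resulting $k^2\times n^2$ array. A NumPy sketch (the rank-$2$ test matrix is an arbitrary assumption of mine):

```python
import numpy as np

# Arbitrary 4x4 example of rank 2, so k = n - rank A = 2 and k^2 = 4.
A = np.zeros((4, 4))
A[0, 0] = A[1, 1] = 1.0

U, s, Vt = np.linalg.svd(A)
k = int(np.sum(s < 1e-10))
C = Vt[-k:].T   # columns c_1, ..., c_k span the right null space
R = U[:, -k:]   # columns r_1, ..., r_k span the left null space

# Flatten each A_ij = c_i r_j^T; the stack having full rank k^2 means
# the A_ij are linearly independent, so dim W >= k^2.
rows = [np.outer(C[:, i], R[:, j]).ravel() for i in range(k) for j in range(k)]
print(np.linalg.matrix_rank(np.array(rows)))  # 4
```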
You may notice that in the three trivial examples above (with $k=0,1,n$) we got the equality $\dim W=(n-\operatorname{rank}A)^2$.
Another possible way to look at this problem is to use the linear map $$f\colon M_{n,n} \to M_{n,n}\oplus M_{n,n},\qquad X\mapsto(AX,XA).$$ Then $W=\operatorname{Ker} f$, so we are basically asking for the dimension of the kernel of this map. By the rank-nullity theorem, to find $\dim W$ it would be sufficient to find $\dim\operatorname{Im} f$. However, this does not seem to be easier than the original formulation of the problem.
It is also possible to see this as a system of $2n^2$ linear equations ($n^2$ from $AX=0$ and $n^2$ from $XA=0$) in the $n^2$ unknowns $x_{11}, x_{12}, \dots, x_{nn}$. If we try to use this line of thinking, the difficult part seems to be determining how many of those equations are linearly independent.
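Both viewpoints can be made concrete by vectorization: using the standard identities $\operatorname{vec}(AX)=(I\otimes A)\operatorname{vec}(X)$ and $\operatorname{vec}(XA)=(A^T\otimes I)\operatorname{vec}(X)$, the space $W$ becomes the null space of a $2n^2\times n^2$ matrix, whose nullity can be computed directly. A NumPy sketch (the random rank-$2$ test matrix is an arbitrary choice), which in this test agrees with the conjectured $(n-\operatorname{rank}A)^2$:

```python
import numpy as np

n, rank = 5, 2
rng = np.random.default_rng(0)
# A random n x n matrix of the prescribed rank (an arbitrary test case).
A = rng.standard_normal((n, rank)) @ rng.standard_normal((rank, n))

I = np.eye(n)
# Stack the vectorized conditions (I kron A) vec X = 0 and (A^T kron I) vec X = 0.
M = np.vstack([np.kron(I, A), np.kron(A.T, I)])   # 2n^2 x n^2 system

dim_W = n**2 - np.linalg.matrix_rank(M)           # nullity of M = dim W
print(dim_W, (n - rank)**2)                       # 9 9
```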
**Question:** What can be said about the dimension of the subspace $W$? Is it equal to $(n-\operatorname{rank}A)^2$? Is it determined by the rank of $A$ alone? If not, what are the best possible bounds we can get if we know only the rank of $A$ and have no further information about $A$?
The motivation for this question was an exercise which asked for the dimensions of the spaces $W_1$, $W_2$, $W_1\cap W_2$ and $W_1+W_2$, where the spaces $W_1$ and $W_2$ were determined by the conditions $AX=0$ and $XA=0$, respectively. Since the matrix $A$ was given, in this exercise it was possible to find a basis of $W_1\cap W_2$ explicitly. (And the exercise was probably intended just to make students accustomed to basic computations such as finding a basis, using Grassmann's formula, etc.) Still, I was wondering how much we can say just from knowing the rank of $A$, without going through all the computations.