I'm doing an exercise where I have a similar statement concerning linear transformations. More specifically: $\forall \sigma \in \mathcal{L} (V) \, (\tau \sigma = \sigma \tau) \implies \tau = a \iota$ for some scalar $a$, where $\iota$ is the identity operator. Since every linear transformation is represented, with respect to a basis, by a matrix, I thought I could instead prove the statement in the title: a square matrix which commutes with every other square matrix of the same size must be a scalar multiple of $I$. I constructed some specific matrices to assist me in the proof, so I want to check whether the proof is actually valid.
The proof starts below:
Let $M_c (l, k) \in \mathcal{M}_n (F)$, where $n \geq 2$, be the matrix with $m_{l, l} = c$, $m_{l, k} = -c$, and $m_{i, j} = 0$ for every other index, where $l \neq k$ and $c \neq 0$. Summarizing: the $l$th entry of the main diagonal has value $c$, one other entry in the same row (column $k$) has value $-c$, and all remaining entries are $0$.
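To illustrate the construction, here is $M_c (1, 3)$ for $n = 3$ (my own concrete choice of indices):
\begin{equation} M_c (1, 3) = \begin{pmatrix} c & 0 & -c \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} \; . \end{equation}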
We intend to prove that any matrix $A \in \mathcal{M}_n (F)$ which commutes with $M_c (l, k)$ for every $1 \leq l, k \leq n$ with $l \neq k$ is a diagonal matrix. For the sake of readability, the matrix will be written only as $M$. Since both commute, for any such $l$ and $k$ we have: \begin{equation} \begin{split} [A M]_{l, l} = \sum^n_{i = 1} a_{l, i} m_{i, l} = a_{l, l} m_{l, l} = c a_{l, l} \\ = [M A]_{l, l} = \sum^n_{i = 1} m_{l, i} a_{i, l} = m_{l, l} a_{l, l} + m_{l, k} a_{k, l} = c a_{l, l} + (- c a_{k, l}) \; . \end{split} \end{equation} Comparing both sides gives $- c a_{k, l} = 0$. Since $c \neq 0$, the same holds for its additive inverse, so $a_{k, l} = 0$. As $k$ and $l$ were arbitrary with $k \neq l$, we can run this argument for every such pair of indices between $1$ and $n$, so $a_{i, j} = 0$ whenever $i \neq j$. Therefore $A$ must be a diagonal matrix, with all entries outside the main diagonal equal to $0$.
Now, we pick some matrix $P \in \mathcal{M}_n (F)$ with $p_{l, k} \neq 0$ whenever $l \neq k$ (the all-ones matrix works). Since $A$ commutes with $P$, and $A$ is diagonal by the previous step, for $l \neq k$: \begin{equation} \begin{split} [A P]_{l, k} = \sum^n_{i = 1} a_{l, i} p_{i, k} = a_{l, l} p_{l, k} \\ = [P A]_{l, k} = \sum^n_{i = 1} p_{l, i} a_{i, k} = a_{k, k} p_{l, k} \; . \end{split} \end{equation} Since $p_{l, k} \neq 0$, this implies $a_{l, l} = a_{k, k}$, and since $l$ and $k$ are arbitrary, all diagonal entries are equal. Because $A$ is also diagonal, we conclude that $A$ must be a scalar multiple of the identity matrix.
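As a quick numerical sanity check of the construction (not part of the proof itself — the helper `M` below is just my own encoding of $M_c (l, k)$ with 0-based indices), one can verify that a generic non-diagonal matrix fails to commute with some $M_c (l, k)$, while a scalar multiple of $I$ commutes with all of them:

```python
import numpy as np

def M(c, l, k, n):
    # The matrix M_c(l, k): entry (l, l) is c, entry (l, k) is -c, rest 0.
    # Indices here are 0-based, unlike the 1-based indices in the proof.
    out = np.zeros((n, n))
    out[l, l] = c
    out[l, k] = -c
    return out

n, c = 3, 2.0

# A generic non-diagonal matrix fails to commute with some M_c(l, k)...
A = np.arange(1, n * n + 1, dtype=float).reshape(n, n)
print(any(not np.allclose(A @ M(c, l, k, n), M(c, l, k, n) @ A)
          for l in range(n) for k in range(n) if l != k))  # True

# ...while a scalar multiple of I commutes with every one of them.
S = 5.0 * np.eye(n)
print(all(np.allclose(S @ M(c, l, k, n), M(c, l, k, n) @ S)
          for l in range(n) for k in range(n) if l != k))  # True
```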
I just wanted to be sure that I didn't make any mistakes and that these conclusions are enough to prove the statement. Is the proof correct?
Thanks in advance!