0

Show that the scalar matrix $\lambda E$, with $E$ being the identity matrix and $\lambda \in K$ a constant, is the only matrix $A\in K^{n\times n}$ that commutes with all invertible matrices.

I am not sure how to tackle this question. I would appreciate it if you could help me. Thanks!

Junjiro
  • 187
  • What have you written down, explicitly? – Ted Shifrin Jan 15 '21 at 21:48
  • Thanks! I saw that question before posting this. My problem is that I think this is a different scenario, as it is about invertible matrices: I suppose I have to show uniqueness and existence of this claim. This is a comment in reference to another comment (that was deleted) and that suggested https://math.stackexchange.com/questions/1120202/matrices-that-commute-with-all-matrices – Junjiro Jan 15 '21 at 21:49
  • I don't know what question you’re talking about. I'm asking you to write down some mathematical statements and think about how the invertibility condition can be used. – Ted Shifrin Jan 15 '21 at 21:51
  • Sorry. At first I thought I could start by using a $2\times 2$ matrix, and say: if I assume an invertible matrix $B$, I could have the equation $B^{-1} A B = A$. $B^{-1}$ is easy to calculate, so it is straightforward. My problem is that I have to show the more general case of an invertible $n\times n$ matrix, but calculating $B^{-1}$ in this case is not as easy (so I think there should be a better approach). I also have to show that this answer is unique, so I think another approach is needed, and I don't know what this approach should be. – Junjiro Jan 15 '21 at 22:00
  • @Junjiro Take another approach. You have the freedom to pick any invertible matrix $A$ and use the fact that it commutes with your special matrix B (for which you want to prove that $B=\lambda E$) – peter.petrov Jan 15 '21 at 22:06
  • Do you know the change of basis formula? It's very hard to try to guess what tools you have available when you don't post more substance. – Ted Shifrin Jan 15 '21 at 22:20
  • At this point I only know the definition of a matrix in relation to two bases and the definition of the multiplication of matrices (${C_2}D{C_1}=(d_{ij})\in K^{m\times n}$, ${B_3}(XY){B_1}={B_3}X{B_2} \cdot {B_2}Y{B_1}$). I don't really know the change of basis formula. – Junjiro Jan 15 '21 at 22:34
  • Btw, I think this is a nice question. The statement to prove is somewhat stronger than the one discussed here: https://math.stackexchange.com/questions/1120202/matrices-that-commute-with-all-matrices That's simply because here (in this question), we know that our matrix commutes only with the invertible matrices and not with all matrices. – peter.petrov Jan 15 '21 at 22:58
  • If $A$ commutes with every invertible matrix and $P$ is invertible, then $PAP^{-1}=A$. Choose invertible matrices $P$ and look at the result of $PAP^{-1}$ and what it implies about $A$: first consider $P=I$, except with one diagonal element replaced by $k\ne0$. Repeat that for all diagonal elements and conclude that $A$ must be diagonal. Then consider $P=I$ except for the $(i,j)$ element, which is $k$ ($i\ne j$), and use that to prove that all diagonal elements must be equal. Done. – Jean-Claude Arbaut Jan 15 '21 at 23:24
  • I took the time and wrote a detailed solution to your problem. You can check it out, see if it helps. https://fxfyfz.blogspot.com/2021/01/if-matrix-commutes-with-all-invertible.html – peter.petrov Jan 16 '21 at 13:14
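The conjugation recipe sketched in the comments (conjugate by $I$ with one diagonal entry rescaled, then by an elementary matrix) can be checked symbolically. A minimal sketch of my own with sympy, for $n=3$ (the variable names are invented for illustration):

```python
# Verify symbolically (n = 3) that conjugating by the two kinds of P
# from the comment forces A to be a scalar matrix.
import sympy as sp

k = sp.Symbol("k", nonzero=True)
A = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f"a{i}{j}"))

# Step 1: P = I with the (0,0) entry replaced by k != 0.
P = sp.diag(k, 1, 1)
C = sp.expand(P * A * P.inv() - A)
# Row 0 picks up a factor k, column 0 a factor 1/k, so requiring
# P*A*P^{-1} = A forces the off-diagonal entries in row/column 0 to vanish.
assert sp.simplify(C[0, 1] - (k - 1) * A[0, 1]) == 0
assert sp.simplify(C[1, 0] - (1 / k - 1) * A[1, 0]) == 0

# Step 2: A is now diagonal; conjugate by P = I + k * e_{01}.
D = sp.diag(*[sp.Symbol(f"d{i}") for i in range(3)])
E = sp.eye(3)
E[0, 1] = k
diff = sp.expand(E * D * E.inv() - D)
# The only nonzero entry is (0,1), equal to k*(d1 - d0), so d0 = d1.
assert sp.simplify(diff[0, 1] - k * (D[1, 1] - D[0, 0])) == 0
```

Repeating step 1 for each diagonal position and step 2 for each pair $(i,j)$ gives the full argument.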

3 Answers

3

Assume the square matrix $A=(a_{ij})$ of size $n$ commutes with all invertible square matrices of the same size $n$.

Take this particular matrix $B = {(b_{ij})}$ where $b_{ij} = i$ when $i=j$ and $b_{ij} = 0$ if $i \ne j$.

$B$ is obviously invertible.

Use the fact that $B$ commutes with your matrix $A$, so we have $AB=BA$. Writing this out in detail easily gives $a_{ij} = 0$ when $i\ne j$. So all elements of $A$ that are not on the main diagonal are zero. Now we just need to prove that all elements along the main diagonal of $A$ are equal...
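As a quick sanity check (my own addition, not part of the answer), the off-diagonal vanishing can be verified symbolically with sympy, here for $n = 3$:

```python
# Check that commuting with B = diag(1, 2, ..., n) kills the
# off-diagonal entries of A (shown symbolically for n = 3).
import sympy as sp

n = 3
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"a{i}{j}"))
B = sp.diag(*range(1, n + 1))

C = sp.expand(A * B - B * A)
# Entry (i, j) of AB - BA is (j - i) * a_ij (0-based indices),
# so AB = BA forces a_ij = 0 whenever i != j.
for i in range(n):
    for j in range(n):
        assert sp.simplify(C[i, j] - (j - i) * A[i, j]) == 0
```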

I am thinking about this part myself right now. It should be some simple trick too; I just haven't figured it out yet...

EDIT:

Oh, here it is, the solution for the final part... Now multiply the matrix $A$ by a Vandermonde matrix $V$ (you can pick the alphas in $V$ pretty much as you wish, as long as they are all distinct). A Vandermonde matrix is invertible (as we know from its determinant) precisely when the alphas in $V$ are pairwise distinct.

OK... Now use the fact that $A$ commutes with $V$ and compare the first columns of the two products $VA$ and $AV$. Comparing these first columns, you easily get that the elements along the main diagonal of $A$ are all equal, and this completes the proof. Just write out $VA$ and $AV$ (e.g. for $n=3$) and you will notice the pattern and why the desired statement follows.

https://en.wikipedia.org/wiki/Vandermonde_matrix
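The column comparison can also be checked symbolically; a small sketch of my own (assuming $A$ is already known to be diagonal) with sympy for $n = 3$:

```python
# Compare the first columns of A*V and V*A for a symbolic Vandermonde V,
# assuming A = diag(d1, d2, d3) is already known to be diagonal.
import sympy as sp

d1, d2, d3 = sp.symbols("d1 d2 d3")
A = sp.diag(d1, d2, d3)
x1, x2, x3 = sp.symbols("x1 x2 x3")
# V[i, j] = x_{i+1}**j, so the first column of V is all ones.
V = sp.Matrix(3, 3, lambda i, j: [x1, x2, x3][i] ** j)

left = (A * V).col(0)   # entries d1, d2, d3
right = (V * A).col(0)  # entries d1, d1, d1
# A*V = V*A therefore forces d2 = d1 and d3 = d1.
assert list(left) == [d1, d2, d3]
assert list(right) == [d1, d1, d1]
```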

peter.petrov
  • 12,568
  • Sorry for not answering, I am trying to understand your answer. Is A invertible? For a $2\times2$ let $A=((1,0),(0,2))$, then $A^{-1}=((2,0),(0,1))$ and $A A^{-1} = 2E \neq E$. Sorry if I am not correct. I am also reading about the Vandermonde matrix. – Junjiro Jan 15 '21 at 23:04
  • I just changed the notation to match yours. You call $A$ the matrix that we know commutes with all invertible matrices, so I swapped the roles of $A$ and $B$ in my answer: $B$ is the matrix I constructed. $B$ is invertible since its determinant is $n!$, which is non-zero (a matrix is invertible if and only if its determinant is non-zero). You didn't construct the matrix $B^{-1}$ properly; that's why you get $2E$ there. The inverse matrix is actually $((1,0),(0,0.5))$. – peter.petrov Jan 15 '21 at 23:10
  • Although it's very elegant, this proof doesn't work in fields of positive characteristic for $n$ greater than or equal to the field's characteristic. – Otomeram Aug 20 '23 at 04:18
3

I assume your matrices are over $\mathbb{R}$, although this approach can be modified using algebraic geometry to work over general fields.

There is a nice two-step approach: first show that a matrix that commutes with all invertible matrices in fact commutes with all matrices, and then show that the only matrices that do that are scalar multiples of the identity.

Now for step 1 there is a really nice surprising trick: use topology!

The key is that the set of invertible matrices is dense in the set of all matrices, meaning that for every matrix $B$, no matter how restrictive your notion of 'tiny' is, you can always find a tiny matrix $C$ such that $B + C$ is invertible. This is intuitively clear once you realize the same is true for the determinants of $B$, $C$ and $B + C$.
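A quick numerical illustration of this density claim (a sketch of my own, not from the answer): perturbing a singular matrix by a tiny multiple of the identity already makes it invertible.

```python
# A rank-1 (hence singular) matrix becomes invertible after adding
# eps * I: here det(B + eps*I) = eps * (5 + eps), nonzero for tiny eps > 0.
import numpy as np

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # det(B) = 0
eps = 1e-9
det = np.linalg.det(B + eps * np.eye(2))
# Adding eps*I shifts both eigenvalues (0 and 5) of B by eps.
```

In general $B + \varepsilon I$ is invertible for every $\varepsilon$ that avoids the finitely many eigenvalues of $-B$, so such tiny perturbations always exist.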

Now, given your matrix $A$ that commutes with all invertible matrices, the map that sends a matrix $B$ to the matrix $AB - BA$ is continuous and equals the zero matrix on a dense subset (the invertible $B$, that is), so it follows that it equals $0$ everywhere.

Step 2 doesn't involve any surprising trick like this, so I leave that to you.

Vincent
  • 10,614
  • I am sure the OP will appreciate the topology approach, and this nice simple statement here in particular: "the set of invertible matrices is dense in the set of all matrices" :) – peter.petrov Jan 15 '21 at 23:19
1

For any nonzero vector $u$, if $u$ and $Au$ are linearly independent, there exists some invertible matrix $B$ such that $Bu=u$ and $B(Au)=Au-u$ (extend $\{u, Au\}$ to a basis and define $B$ on that basis). But then $0=(AB-BA)u=u$, which is a contradiction. Hence $u$ and $Au$ must be linearly dependent, i.e. $Au=c_uu$ for some scalar $c_u$.

This scalar $c_u$ actually does not depend on $u$. For, given any two nonzero vectors $u$ and $v$, if we pick an invertible matrix $B$ such that $Bu=v$, we obtain $0=(AB-BA)u=(c_v-c_u)v$, i.e. $c_u=c_v$. Calling this common value $c$, we get $Au=cu$ for all nonzero vectors $u$. Hence $A=cI$.
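For the step "pick an invertible $B$ with $Bu=v$", here is one concrete way to build such a $B$ (a numerical sketch of my own; the helper name `basis_through` is invented): extend each vector to a basis and map one basis to the other.

```python
# Build an invertible B with B @ u = v for nonzero u, v, by extending
# each vector to a basis and mapping the first basis to the second.
import numpy as np

def basis_through(w):
    """Return an invertible matrix whose first column is w (w != 0)."""
    n = len(w)
    M = np.column_stack([w, np.eye(n)])
    # Greedily keep the first n linearly independent columns of [w | I].
    cols, rank = [], 0
    for c in M.T:
        trial = np.column_stack(cols + [c]) if cols else c.reshape(-1, 1)
        if np.linalg.matrix_rank(trial) > rank:
            cols.append(c)
            rank += 1
        if rank == n:
            break
    return np.column_stack(cols)

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
Pu, Pv = basis_through(u), basis_through(v)
B = Pv @ np.linalg.inv(Pu)       # sends u (= Pu e1) to v (= Pv e1)
```

Since `Pu` maps the first standard basis vector to $u$ and `Pv` maps it to $v$, the product $P_v P_u^{-1}$ is invertible and sends $u$ to $v$.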

user1551
  • 139,064