
Suppose $V$ is a real vector space of dimension $n\gt1$. What are all the linear transformations $T:V\to V$ such that the matrix for $T$ is independent of basis?

This would mean that for any change of basis matrix $P$ we would have $$T=PTP^{-1}\iff TP=PT.$$ Any invertible matrix can be seen as a change of basis matrix (I think!) and vice versa, so such a matrix would need to commute with all invertible matrices.
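To convince myself of this commuting condition numerically, here is a quick numpy sketch (the matrices below are arbitrary examples I made up): a non-scalar matrix fails to commute with some invertible matrix, while a scalar multiple of the identity commutes with every one.

```python
import numpy as np

# Arbitrary example matrices: a non-scalar T and a scalar multiple of I.
T_nonscalar = np.array([[1.0, 2.0],
                        [0.0, 1.0]])
T_scalar = 3.0 * np.eye(2)

# An invertible matrix, viewed as a change of basis: swap the two basis vectors.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# T = P T P^{-1}  iff  T P = P T, so we test commutation directly.
nonscalar_commutes = np.allclose(T_nonscalar @ P, P @ T_nonscalar)
scalar_commutes = np.allclose(T_scalar @ P, P @ T_scalar)
```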

From here though I cannot see how I would be able to deduce $T$'s form. Any hints would be great!

jcneek

1 Answer


$T$ must be a multiple of the identity. The easiest way that comes to my mind right now is this.

Step 1: $T$ must be diagonalizable. Indeed, both $T$ and $T^t$ can be written in Jordan form, so $T$ is similar both to an upper triangular matrix and to a lower triangular one. If the matrix of $T$ is to be the same in every basis, these two matrices must coincide, hence they must be diagonal. Thus $T$ is diagonalizable (and, in fact, its matrix is diagonal in every basis, since it is the same under all changes of basis).
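A small numerical illustration of Step 1 (a numpy sketch; the $2\times 2$ Jordan block is an arbitrary example): reversing the order of the basis is itself a change of basis, and it turns an upper triangular matrix into a lower triangular one, so a basis-independent matrix must be both, i.e. diagonal.

```python
import numpy as np

# An upper triangular matrix: a 2x2 Jordan block, as an arbitrary example.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Reversing the basis order is a change of basis; its matrix is a permutation
# (which happens to be its own inverse here).
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

J_reversed = P @ J @ np.linalg.inv(P)

# J_reversed is lower triangular, so J != J_reversed unless J is diagonal.
is_lower = np.allclose(J_reversed, np.tril(J_reversed))
changed = not np.allclose(J, J_reversed)
```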

Step 2: all of $T$'s eigenvalues must be equal. Indeed, there are always changes of basis that permute the diagonal entries arbitrarily. So if the matrix has to remain the same under every change of basis, $T$ has just one eigenvalue, and hence it is a multiple of the identity.
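Step 2 can also be checked numerically (a numpy sketch; the diagonal matrix is an arbitrary example): conjugating by a permutation matrix permutes the diagonal entries, so a diagonal matrix with distinct entries cannot be basis-independent.

```python
import numpy as np

# A diagonal matrix with distinct eigenvalues, as an arbitrary example.
D = np.diag([1.0, 2.0, 3.0])

# Permutation matrix swapping the first two basis vectors.
P = np.eye(3)[[1, 0, 2]]

# Conjugating by P permutes the diagonal entries of D.
D_permuted = P @ D @ np.linalg.inv(P)
```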

Lorenzo Pompili
  • Thank you for your answer. In my course I have not covered Jordan form. I have a hint that says 'Show that the invertible square matrices of dimension $n$ span the square matrices of dimension $n$.' Is there a way to approach this without using Jordan Form? – jcneek Jul 05 '21 at 14:07
  • 1
    I am sure there are elementary ways to do it. The hint probably means this: first prove what the hint says; this implies that, if $T$ satisfies the hypothesis above, then it commutes with all matrices by linearity. Then, prove that the only matrices that commute with every matrix are multiples of the identity, see https://math.stackexchange.com/questions/27808/a-linear-operator-commuting-with-all-such-operators-is-a-scalar-multiple-of-the – Lorenzo Pompili Jul 05 '21 at 14:24
  • In fact $TP=PT$ if $P$ is the sum of two invertible matrices. Hence $TP=PT$ for every $P$ because every matrix is the sum of two invertible matrices: $P=(\frac12P+cI)+(\frac12P-cI)$, and $\frac12P\pm cI$ is invertible if $|c|>\frac12||P||$. – David C. Ullrich Jul 06 '21 at 10:48
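The decomposition in the last comment can be verified numerically (a numpy sketch; the singular matrix `P` and the particular choice of `c` are arbitrary examples): $\frac12P\pm cI$ are both invertible once $|c|>\frac12\|P\|$, and they sum to $P$.

```python
import numpy as np

# A singular matrix, as an arbitrary example: it is still a sum of two
# invertible matrices, P = (P/2 + c*I) + (P/2 - c*I).
P = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Any c with |c| > ||P||/2 works (here ||.|| is the spectral norm).
c = np.linalg.norm(P, 2) / 2 + 1.0
A = P / 2 + c * np.eye(2)
B = P / 2 - c * np.eye(2)

both_invertible = abs(np.linalg.det(A)) > 1e-9 and abs(np.linalg.det(B)) > 1e-9
sums_to_P = np.allclose(A + B, P)
```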