
Let $V$ be a finite-dimensional inner product space over $\mathbb{F}$. Denote by ${\rm End}\left(V\right)$ the space of linear transformations $V\to V$. For $A\subseteq{\rm End}\left(V\right)$, denote $$C\left(A\right)=\left\{ X\in{\rm End}\left(V\right)\mid X\circ S=S\circ X\quad\forall S\in A\right\} $$

For $S\in{\rm End}\left(V\right)$, denote $C\left(S\right)=C\left(\left\{S\right\} \right)$ and denote

$${\rm Pol}\left(S\right)=\left\{ p\left(S\right):p\in\mathbb{F}\left[t\right]\right\} $$

Show that if $S$ is diagonalizable then ${\rm Pol}\left(S\right)=C\left(C\left(S\right)\right)$.

I know that if $S$ has distinct eigenvalues then any transformation in $C(S)$ must be a polynomial in $S$, but apart from that I'm not really sure how to proceed.

One side I was able to do:

• $(\subseteq )$

Let $p\in\mathbb{F}\left[t\right]$, say $p=\sum_{i=0}^{n}a_{i}t^{i}$. Then $p\left(S\right)\in{\rm Pol}\left(S\right)$ and we need to prove $p\left(S\right)\in C\left(C\left(S\right)\right)$, that is, that $p\left(S\right)\circ T=T\circ p\left(S\right)$ for all $T\in C\left(S\right)$. Let $T\in C\left(S\right)$. By definition $T\circ S=S\circ T$, and hence by induction $T\circ S^{i}=S^{i}\circ T$ for all $i\geq 0$. Then $$p\left(S\right)\circ T=\left(\sum_{i=0}^{n}a_{i}S^{i}\right)\circ T=\sum_{i=0}^{n}a_{i}S^{i}\circ T=\sum_{i=0}^{n}a_{i}T\circ S^{i}=T\circ\sum_{i=0}^{n}a_{i}S^{i}=T\circ p\left(S\right)$$ as required.

A bit stuck about the other side.

Nescio

1 Answer


Sketch of proof: With an appropriate choice of basis, we can take $S$ to be a diagonal matrix. That is, $$ S = \pmatrix{\lambda_1 I_{m_1} \\ & \lambda_2 I_{m_2}\\ &&\ddots\\ &&& \lambda_k I_{m_k} } $$ Here, the $\lambda_i$ are the (distinct!) eigenvalues of $S$, $I_m$ is the $m \times m$ identity, and $m_{i}$ is the multiplicity of the $i$th eigenvalue.

First, find that $C(S)$ is the set of matrices of the form $$ \pmatrix{A_1\\&A_2\\&&\ddots \\&&&A_k} $$ where $A_i$ is an arbitrary $m_i \times m_i$ matrix.

Next, find that $C(C(S))$ is the set of matrices of the form $$ \pmatrix{\mu_1 I_{m_1} \\ & \mu_2 I_{m_2}\\ &&\ddots\\ &&& \mu_k I_{m_k} } $$ where $\mu_i \in \Bbb F$ are arbitrary. It should be clear from this point that ${\rm Pol}(S) \subseteq C(C(S))$ (though for you, this is unnecessary). For the reverse inclusion, it suffices to note that for any $\mu_1,\mu_2,\dots,\mu_k$ we can select an interpolating polynomial $p$ satisfying $$ p(\lambda_i) = \mu_i \qquad i=1,\dots,k $$ so that $p(S)$ is exactly the matrix above.

Ben Grossmann
    The key to going through the proof this way is a certain degree of comfort with block-matrix multiplication. – Ben Grossmann Dec 28 '15 at 19:03
    I keep putting discussion of the general case (not quite the style of this question) at http://math.stackexchange.com/questions/92480/given-a-matrix-is-there-always-another-matrix-which-commutes-with-it/92832#92832 – Will Jagy Dec 28 '15 at 20:20
  • @WillJagy very nice. Feel free to add this answer to the collection if you find it sufficiently relevant. – Ben Grossmann Dec 28 '15 at 20:26
  • That makes sense. I had trouble using the fact that $S$ is diagonalizable, but this makes it doable for me. – Nescio Dec 29 '15 at 08:44