
Let $V=M(n,\mathbb C)$. For a subset $S \subseteq V$, let $C(S):=\{X \in V \mid XB=BX \ \text{for all } B \in S \}$.

How can one prove that for every $A\in V$, we have $C(C (\{A\})) \subseteq \{ p(A) \mid p(t) \in \mathbb C[t] \}$?

My thoughts (motivated by Omnomnomnom's comment): the reverse inclusion holds, since every polynomial in $A$ commutes with everything that commutes with $A$. Since both sets are vector subspaces of $V$, it is enough to show their dimensions are equal. Now the dimension of the polynomial set is the degree $d$ of the minimal polynomial of $A$, since $I, A, \dots, A^{d-1}$ form a basis of it. So it is enough to show that $\dim C(C(\{A\})) \le d$.
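As a quick numerical sanity check of this dimension count (a NumPy sketch; the test matrix `A` and the rank tolerance are my own choices, not part of the question):

```python
import numpy as np

# Check: dim C(C({A})) should equal the degree of the minimal polynomial of A.
n = 4
A = np.array([[2., 1., 0., 0.],
              [0., 2., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 5.]])

def commutant_basis(mats, n):
    """Basis of {X : XM = MX for all M in mats}, as vectors in C^(n^2)."""
    # Row-major vec: vec(MX - XM) = (M kron I - I kron M^T) vec(X).
    I = np.eye(n)
    K = np.vstack([np.kron(M, I) - np.kron(I, M.T) for M in mats])
    _, s, Vt = np.linalg.svd(K)
    rank = int(np.sum(s > 1e-9))
    return Vt[rank:]                       # rows span the null space of K

def minpoly_degree(A, n):
    """Smallest d such that I, A, ..., A^d are linearly dependent."""
    rows = [np.eye(n).ravel()]
    d = 0
    while True:
        d += 1
        rows.append(np.linalg.matrix_power(A, d).ravel())
        if np.linalg.matrix_rank(np.vstack(rows)) < len(rows):
            return d

CA = commutant_basis([A], n)                              # basis of C(A)
CCA = commutant_basis([v.reshape(n, n) for v in CA], n)   # basis of C(C(A))
print(len(CA), len(CCA), minpoly_degree(A, n))            # 6 3 3
```

For this `A` (Jordan blocks $J_2(2)$, $J_1(2)$, $J_1(5)$) the minimal polynomial is $(t-2)^2(t-5)$, and indeed $\dim C(C(\{A\})) = 3$ while $\dim C(\{A\}) = 6$.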


3 Answers


Let's say that an $R$-module $W$ has the double commutant property if $C_W(C_W(R))$ equals the image of $R$ in $End(W)$.

I'll give a proof via Jordan decomposition. One consequence of Jordan decomposition is: if we view $V$ as a $\mathbb{C}[x]$-module, then there exists a direct sum decomposition $$ V = V_1 \oplus \cdots \oplus V_n$$ into cyclic $\mathbb{C}[x]$-modules, i.e. for all $i$ there exists $v_i \in V_i$ such that $V_i = \mathbb{C}[x]v_i$.
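For concreteness, here is a small sympy illustration (the matrix is my own toy example, not part of the argument); the last basis vector of each Jordan block $J_k(\lambda)$ is a cyclic vector for the summand $\mathbb{C}[x]/(x-\lambda)^k$:

```python
from sympy import Matrix

# A has a single eigenvalue 2 with one Jordan block of size 2, so as a
# C[x]-module, C^2 is cyclic: C^2 = C[x]/(x-2)^2.
A = Matrix([[3, 1],
            [-1, 1]])
P, J = A.jordan_form()     # A = P*J*P**(-1)
print(J)                   # Matrix([[2, 1], [0, 2]])
```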

Lemma 1: If $W$ is a cyclic $R$-module and $R$ is a commutative ring, then $C_W(R)$ is the image of $R$ in $End(W)$.

Proof: say $\varphi: W \to W$ intertwines $R$. If $v$ is the cyclic vector, then $\varphi(v) = rv$ for some $r \in R$, since $W = Rv$. For any $w = sv$ we get $\varphi(w) = \varphi(sv) = s\varphi(v) = srv = rsv = rw$, so $\varphi$ is left-multiplication by $r$. Conversely, since $R$ is commutative, left-multiplication by any element of $R$ commutes with the $R$-action, giving the reverse inclusion.

Lemma 2: If $W_1$ and $W_2$ are $R$-modules, then for $V = W_1 \oplus W_2$, $$C_V(C_V(R)) \subseteq C_{W_1}(C_{W_1}(R)) \oplus C_{W_2}(C_{W_2}(R)).$$

Proof: Suppose that $T: V \to V$ lies in $C_V(C_V(R))$. We can write $T$ uniquely as $T = T_{11} + T_{12} + T_{21} + T_{22}$ where $T_{ij} \in Hom(W_j,W_i)$. We aim to show that $T_{12} = 0$, $T_{21} = 0$, and that $T_{11}$ and $T_{22}$ lie in their respective double commutants.

By definition of the direct sum, the structure maps $$ \pi_1: W_1 \oplus W_2 \to W_1 \to W_1 \oplus W_2 \qquad \pi_2: W_1 \oplus W_2 \to W_2 \to W_1 \oplus W_2$$ (projection followed by inclusion) lie in $C_V(R)$. Then $\pi_1 T = T\pi_1$ since $T$ lies in the double commutant. Expanding in the direct sum $Hom(V,V) = \oplus_{ij} Hom(W_j,W_i)$ gives $T_{11} + T_{12} = T_{11} + T_{21}$; since $T_{12}$ and $T_{21}$ lie in different summands, $T_{12} = 0$ and $T_{21} = 0$.

Observe also that $C_{W_i}(R)$, viewed inside $\oplus_{ij} Hom(W_j,W_i)$ by extending each map by zero on the other summand, is contained in $C_V(R)$. If $\varphi_i \in C_{W_i}(R)$, then as $T$ is in the double commutant of $V$, $\varphi_iT_{ii} = T_{ii}\varphi_i$. Hence $T_{ii}$ is in the double commutant of $W_i$, as desired.

Now we can prove the desired result. Jordan decomposition gives more than a decomposition of $V$ into cyclic $\mathbb{C}[x]$-modules: we may in fact decompose $V$ into Jordan blocks, i.e. modules of the form $\mathbb{C}[x]/(x-\lambda)^n$. We will use these blocks to refine the inclusion in Lemma 2.

Lemma 3: If $V$ is a direct sum of Jordan blocks with the same eigenvalue, then $V$ has the double commutant property.

Proof: By replacing $x$ with $x -\lambda$, we may assume that the Jordan blocks have eigenvalue $0$, i.e. that $x$ acts nilpotently. Say that $V = \oplus_i V_i$ where $V_i = \mathbb{C}[x]/x^{n_i}$, and assume without loss of generality that $n_i \geq n_{i+1}$ for all $i$. Suppose that $T \in C_{V}(C_{V}(\mathbb{C}[x]))$; by Lemma 2 (applied repeatedly), we may write $T = \oplus T_{ii}$ for $T_{ii} \in C_{V_i}(C_{V_i}(\mathbb{C}[x]))$. Now since $n_1 \geq n_i$ for all $i$, we have $\mathbb{C}[x]$-module surjections $$ s_{i1}: \mathbb{C}[x]/x^{n_1} \to \mathbb{C}[x]/x^{n_i},$$ which may be viewed as maps $s_{i1}: V \to V$ by composing with the structure maps for the direct sum. Then $s_{i1} \in C_V(\mathbb{C}[x])$, so we have $s_{i1}T = Ts_{i1}$ for all $i$. Hence, $s_{i1}T_{11} = T_{ii}s_{i1}$. By Lemma 1, $C_{V_1}(C_{V_1}(\mathbb{C}[x])) = \mathbb{C}[x]/x^{n_1}$. So if $T_{11} = p(x)$, then $$ T_{ii}s_{i1} = s_{i1}T_{11} = s_{i1}p(x) = p(x)s_{i1}.$$ Since $s_{i1}$ is surjective, we conclude $T_{ii} = p(x)$ as well (as an endomorphism of $V_i$). As this holds for all $i$, $T = p(x)$ (as an endomorphism of $V$).
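To make the surjections $s_{i1}$ concrete, here is a NumPy check (block sizes $3$ and $2$ are my own choice): in the basis $1, x, \dots, x^{n-1}$ of $\mathbb{C}[x]/x^n$, multiplication by $x$ is a shift matrix, and $s_{21}$ is truncation.

```python
import numpy as np

# x acts on C[x]/x^n by x * x^k = x^(k+1) (and x^(n-1) -> 0): a shift matrix.
def shift(n):
    N = np.zeros((n, n))
    for k in range(n - 1):
        N[k + 1, k] = 1.0          # basis vector x^k maps to x^(k+1)
    return N

N3, N2 = shift(3), shift(2)

# s_21 : C[x]/x^3 -> C[x]/x^2 is reduction mod x^2 (so x^2 -> 0).
s21 = np.array([[1., 0., 0.],
                [0., 1., 0.]])

# s_21 is a C[x]-module map: it intertwines the two x-actions.
assert np.allclose(s21 @ N3, N2 @ s21)
```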

Lemma 4: If $W_1$ and $W_2$ are $\mathbb{C}[x]$-modules with the double commutant property and the minimal polynomials $r_1$ and $r_2$ of $x$ on $W_1$ and $W_2$ are coprime, then $W_1 \oplus W_2$ has the double commutant property.

Proof: Let $V = W_1 \oplus W_2$. By Lemma 2 and the double commutant property of $W_1$ and $W_2$, if $\pi_i: W_1 \oplus W_2 \to W_i \to W_1 \oplus W_2$ are the structure maps, then $$ C_V(C_V(\mathbb{C}[x])) \subseteq \mathbb{C}[x]\pi_1 \oplus \mathbb{C}[x]\pi_2.$$ It suffices to show that any transformation $V \to V$ of the form $p_1(x) \pi_1 + p_2(x) \pi_2$ is in the image of $\mathbb{C}[x]$ in $End(V)$, and for that it suffices to show that $\pi_1$ and $\pi_2$ are in the image of $\mathbb{C}[x]$.

Since $r_1$ and $r_2$ are coprime, there exist polynomials $f_1, f_2$ such that $$f_1r_1 + f_2r_2 = 1.$$ Then $(f_1r_1)(x)= 1 - (f_2r_2)(x)$ acts as zero on $W_1$ and as the identity on $W_2$, so it acts as $\pi_2$ on $V$. This shows that $\pi_2$, and thus $\pi_1 = 1 - \pi_2$, are in the image of $\mathbb{C}[x]$, as desired.
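A worked instance of this Bézout argument (my own toy example, using sympy's `gcdex`): take $W_1 = J_2(2)$ and $W_2 = J_1(5)$, so $r_1 = (x-2)^2$ and $r_2 = x-5$.

```python
from sympy import symbols, gcdex, Poly, Matrix, eye, zeros

x = symbols('x')
r1 = (x - 2)**2                    # minimal polynomial of x on W_1
r2 = x - 5                         # minimal polynomial of x on W_2
f1, f2, g = gcdex(r1, r2, x)       # f1*r1 + f2*r2 = g = 1

A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 5]])

# Evaluate (f1*r1) at A by Horner's rule; it should equal pi_2 = diag(0,0,1).
p = Poly(f1 * r1, x)
pi2 = zeros(3, 3)
for c in p.all_coeffs():           # coefficients, highest degree first
    pi2 = pi2 * A + c * eye(3)
print(pi2)                         # Matrix([[0, 0, 0], [0, 0, 0], [0, 0, 1]])
```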

Theorem: Any finite-dimensional $\mathbb{C}[x]$-module $V$ has the double commutant property.

Proof: By Jordan decomposition, we may decompose $V$ into summands $V_1\oplus\cdots \oplus V_n$ where $x$ acts with a single eigenvalue on each summand. By Lemma 3, each $V_i$ has the double commutant property. Then induct via Lemma 4 to show that $V_1 \oplus \cdots \oplus V_k$ has the double commutant property for all $k$, as the minimal polynomials of $x$ on $V_1 \oplus \cdots \oplus V_{k-1}$ and on $V_k$ have no roots in common.

  • What is your definition of $C_W (S)$ ? And what is the image of $R$ in $End (W)$ ? Are you identifying $R$ inside $End (W)$ as $r \in R$ corresponding to the map $f(w)=rw$ ? – uno Aug 23 '19 at 23:15
  • $C_W(S)$ is all of the abelian group endomorphisms of $W$ commuting with $S$. When $S$ contains $\mathbb C$, then all of these will be linear transformations, reducing to the situation in your question. And indeed, the module structure of $R$ on $W$ is exactly the assignment $R \to End(W)$ which you gave (although it is not generally injective). – Joshua Mundinger Aug 23 '19 at 23:22
  • What do you mean by "abelian group endomorphism" ? And what do you mean when you say an endomorphism commutes with $R$ ? Where does your $C_W(R)$ land in ? – uno Aug 24 '19 at 14:08
  • Well $W$ is a module, so it’s also an abelian group. So $C_W(R) \subset End(W)$, where I mean additive endomorphisms. The endomorphism $\varphi$ commutes with $R$ if for all $r$ in $R$, $\varphi(rx) = r\varphi(x)$, i.e. that $\varphi$ is an $R$-module homomorphism. – Joshua Mundinger Aug 24 '19 at 14:12
  • Okay, but then how do you define $C_W (C_W(R))$ ? – uno Aug 24 '19 at 14:28
  • So $C_W(C_W(R)) = \{ \varphi: W \to W \mid \varphi \circ \psi = \psi \circ \varphi \ \forall \psi \in C_W(R)\}$. This is the same definition (where by $\varphi: W \to W$ I require $\varphi$ be additive). – Joshua Mundinger Aug 24 '19 at 14:29
  • If $W$ is a $\mathbb{C}$-vector space and $\mathbb{C} \subseteq S$, then $C_W(S)\supseteq \mathbb{C}$, so we can think about everything in terms of linear transformations instead, if you like. – Joshua Mundinger Aug 24 '19 at 14:31

It is sufficient to prove this result for any matrix similar to $A$, and so we can suppose that $A$ is in Jordan canonical form. Furthermore, if $\lambda$ is an eigenvalue of $A$, then $C(A)=C(A-\lambda I)$, so we can replace $A$ by $A-\lambda I$ to arrange that $A$ has a zero eigenvalue.

If $A$ consists of a single block, then direct calculation shows that $C(A)$ is itself equal to $\{p(A) \mid p(t) \in \mathbb C[t] \}$, and hence the required result holds for $C(C(A))$. So we can suppose that there are at least two blocks.
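A quick NumPy version of this direct calculation (the block size $4$ is my own choice): for a single nilpotent Jordan block, $\dim C(A) = n$, which matches $\dim \{p(A)\}$, the degree of the minimal polynomial $t^n$.

```python
import numpy as np

n = 4
J = np.eye(n, k=1)                     # single nilpotent Jordan block

# Solve XJ = JX as a linear system in vec(X) (row-major convention).
I = np.eye(n)
K = np.kron(J, I) - np.kron(I, J.T)
_, s, _ = np.linalg.svd(K)
print(n * n - int(np.sum(s > 1e-9)))   # dim C(J) = 4 = n
```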

Case 1. If at least two blocks have different eigenvalues

We can suppose $A=\begin{pmatrix}U&0\\0&V\end{pmatrix}$, where all the blocks corresponding to one particular eigenvalue are in $U$.

Let $M=\begin{pmatrix}I&0\\0&0\end{pmatrix}$, where $I$ is the identity matrix of the same size as $U$. Then $M$ is in $C(A)$, and $C(M)$ consists of matrices of the form $\begin{pmatrix}R&0\\0&S\end{pmatrix}$ where $R$ has the same size as $I$.

A matrix $N$ in $C(C(A))$ is in $C(M)$ and so has this form; then, by induction on the size of the matrix, $N=\begin{pmatrix}f(U)&0\\0&g(V)\end{pmatrix}$ for some polynomials $f$ and $g$.

The characteristic polynomials $p_U$ and $p_V$ of $U$ and $V$ are coprime, so there are polynomials $u$ and $v$ such that $up_U+vp_V=1$. The matrix $N$ is then $h(A)$, where $$h=f+(g-f)up_U:$$ on the $U$ block $p_U(U)=0$ gives $h(U)=f(U)$, while on the $V$ block $u(V)p_U(V)=I$ gives $h(V)=g(V)$.
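As a sanity check of this patching formula (my own small example, again with Bézout coefficients from sympy): take $U = J_2(0)$, $V = (5)$, and, say, $f = t$, $g = t+1$.

```python
from sympy import symbols, gcdex, Poly, Matrix, eye, zeros

t = symbols('t')
pU = t**2                          # characteristic polynomial of U = J_2(0)
pV = t - 5                         # characteristic polynomial of V = (5)
u, v, one = gcdex(pU, pV, t)       # u*pU + v*pV = 1

f, g = t, t + 1
h = Poly(f + (g - f) * u * pU, t)  # the formula from the answer

A = Matrix([[0, 1, 0],
            [0, 0, 0],
            [0, 0, 5]])

hA = zeros(3, 3)
for c in h.all_coeffs():           # Horner evaluation of h at A
    hA = hA * A + c * eye(3)
print(hA)                          # block diagonal: f(U) = U and g(5) = 6
```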

Case 2. If all blocks have eigenvalue $0$

Each block of $A$ now has $1$s on the superdiagonal and $0$s elsewhere. We shall first consider the case where $A$ has just two blocks $U$ and $V$, where we can suppose that $\dim(V)$ is no greater than $\dim(U)$. As in Case 1, any matrix in $C(C(A))$ has the form $\begin{pmatrix}f(U)&0\\0&g(V)\end{pmatrix}$ for some polynomials $f$ and $g$. Let $h=g-f$; then the matrix

$T=\begin{pmatrix}0&0\\0&h(V)\end{pmatrix}$ is also in $C(C(A))$. Let $I$ be an identity matrix of the same dimension as $V$, and let $I^*$ be the matrix with $\dim(U)-\dim(V)$ rows of $0$s added underneath $I$. Then $\begin{pmatrix}0&{I^*}\\0&0\end{pmatrix}$ is in $C(A)$. For $T$ to commute with this matrix we require $I^*h(V)=0$, and therefore $h(V)=0$. Then the matrix in $C(C(A))$ is $f(A)$.
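To see concretely that this matrix lies in $C(A)$, here is a NumPy check (block sizes $3$ and $2$ are my own choice):

```python
import numpy as np

U = np.eye(3, k=1)                     # nilpotent Jordan block of size 3
V = np.eye(2, k=1)                     # nilpotent Jordan block of size 2
A = np.block([[U, np.zeros((3, 2))],
              [np.zeros((2, 3)), V]])

Istar = np.vstack([np.eye(2),          # identity of size dim(V), with
                   np.zeros((1, 2))])  # dim(U)-dim(V) zero rows underneath

X = np.block([[np.zeros((3, 3)), Istar],
              [np.zeros((2, 3)), np.zeros((2, 2))]])

assert np.allclose(A @ X, X @ A)       # X is in C(A)
```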

For any number of blocks, this argument can be used with the block of largest dimension and any other block to complete the proof.


If $A$ is a linear map from a finite-dimensional $F$-vector space $V$ to itself, then we get an $R:=F[x]$-module structure on $V$, where $x$ acts by $A$.

Let $p(x)$ be the minimal polynomial of $A$. It is well known from the cyclic decomposition theorem that there is a vector $v\in V$ whose minimal annihilating polynomial is $p(x)$, and that if $W$ is the cyclic $R$-submodule generated by $v$, then there is an $R$-submodule $W'$ such that $V=W\oplus W'$. Let $e:V\rightarrow V$ be the idempotent associated to the projection onto $W$. Let $B$ be in $C(C(A))$; then $B$ commutes with $e\in C(A)=End_R(V)$, and hence $B(W)=B(eW)=e(B(W))\subseteq W$.

This implies that $Bv=f(A)v$ for some $f\in R$. We claim that $Bu=f(A)u$ for any $u\in V$. Define $e':V\rightarrow V$ to be the $R$-endomorphism that is zero on $W'$ and on $W$ is defined by sending $g(A)v$ to $g(A)u$ for any $g\in R$. This is well-defined, since if $g(A)v=h(A)v$ then $g-h$ is a multiple of the minimal polynomial $p$, and hence $g(A)u=h(A)u$. Now $e'Bv=e'f(A)v=f(A)e'v=f(A)u$, and on the other hand $e'Bv=Be'v=Bu$. So we have proved that $B=f(A)$ is a polynomial in $A$.
