
Let $V$ be a finite-dimensional vector space, $V \ne \{ 0 \}$, and let $A\in L(V)$ be an operator which commutes with every operator in $L(V)$. Show that there is a scalar $\lambda$ such that $A = \lambda I$.

Here $L(V)$ denotes the set of all linear mappings (linear operators) from $V$ to $V$.

If $B\in L(V)$ is arbitrary then, if I understood the assignment correctly, this means $AB = BA$, i.e. $AB - BA = 0$.

I believe a matrix representation of the linear operators would be useful here, but I don't see how to proceed.

EDIT (Hint by Jared)

I tried to examine it with $5\times 5$ matrices: example 1, example 2.

For example, for:
$\begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} \\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35} \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} \\ a_{51} & a_{52} & a_{53} & a_{54} & a_{55}\end{bmatrix} \begin{bmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0\end{bmatrix} = \begin{bmatrix} 0 & 0 & 0 & a_{12} & 0 \\ 0 & 0 & 0 & a_{22} & 0 \\ 0 & 0 & 0 & a_{32} & 0 \\ 0 & 0 & 0 & a_{42} & 0 \\ 0 & 0 & 0 & a_{52} & 0\end{bmatrix}$

$\begin{bmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0\end{bmatrix} \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} \\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35} \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} \\ a_{51} & a_{52} & a_{53} & a_{54} & a_{55}\end{bmatrix} = \begin{bmatrix} 0 & 0 & 0 & 0 & 0 \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0\end{bmatrix}$

which means $a_{22}=a_{44}$. For a $1$ in the $ij$th and $ji$th positions, I get two equations (actually four, but two of them are equal to the other two).
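The $5\times 5$ computation above can be machine-checked. A minimal Python sketch (helper names like `matmul` are ad hoc, and indices in the code are 0-based):

```python
# Verify the 5x5 products above: A*E24 copies column 2 of A into column 4,
# while E24*A copies row 4 of A into row 2 (1-based positions as in the post).
n = 5

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A with distinct entries a_{ij} = 10*i + j (1-based), so it is visible
# which entries of A survive the products.
A = [[10 * (i + 1) + (j + 1) for j in range(n)] for i in range(n)]

# E24: a single 1 in row 2, column 4 (1-based), i.e. index (1, 3) here.
E24 = [[1 if (i, j) == (1, 3) else 0 for j in range(n)] for i in range(n)]

AB = matmul(A, E24)
BA = matmul(E24, A)

# A*E24 has column 2 of A sitting in column 4; E24*A has row 4 of A in row 2.
assert [row[3] for row in AB] == [A[i][1] for i in range(n)]
assert BA[1] == A[3]

# Equality of the (2,4) entries would force a_{22} = a_{44}; this generic A
# has a_{22} = 22 != 44 = a_{44}, so it fails to commute with E24.
assert AB[1][3] == A[1][1] and BA[1][3] == A[3][3]
assert AB != BA
```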

But I still don't understand what I can conclude from that. I reviewed older homework assignments and found the same problem, but with a hint: "Show that $A$ has at least one eigenvalue and consider the corresponding eigenspace. Before that, show that every eigenspace of an operator $T \in L(V)$ is invariant under every operator $S \in L(V)$ which commutes with $T$."

EDIT 2

For $B=E_{ij}$ ($i \ne j$) I conclude that $AB=BA \implies a_{ii}=a_{jj}$, $a_{ki}=0$ for $k \ne i$, and $a_{jl}=0$ for $l \ne j$; letting $(i,j)$ range over all pairs gives $a_{ij} = 0$ whenever $i \ne j$.
(In my example, $i$ is $2$ and $j$ is $4$, so it turned out that $a_{ii}=a_{22}=a_{44}=a_{jj}$.)

For $B = E_{ij} + E_{ji}$, $AB = BA \implies a_{ii} = a_{jj}$, $a_{ij} = a_{ji}$, and $a_{kl} = 0$ for every off-diagonal entry in row $i$, row $j$, column $i$ or column $j$ other than $a_{ij}$ and $a_{ji}$.
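The equations that each $B = E_{ij}$ forces on the entries of $A$ can also be collected symbolically. A small Python sketch (ad hoc code, 0-based indices; an entry $a_{kl}$ is represented by the index pair `(k, l)`):

```python
# For B = E_{ij}, A*B has column i of A in column j, and B*A has row j of A
# in row i.  Equating entries yields:
#   (i,j) entry:             a_{ii} = a_{jj}
#   (k,j) entries, k != i:   a_{ki} = 0
#   (i,l) entries, l != j:   a_{jl} = 0
from itertools import product

n = 4

def constraints(i, j):
    """Equations forced on A's entries by A E_{ij} = E_{ij} A (i != j)."""
    eqs, zeros = set(), set()
    for k, l in product(range(n), repeat=2):
        ab = (k, i) if l == j else None   # (A E_{ij})_{kl} = a_{ki} when l = j
        ba = (j, l) if k == i else None   # (E_{ij} A)_{kl} = a_{jl} when k = i
        if ab is not None and ba is not None:
            eqs.add(tuple(sorted((ab, ba))))
        elif ab is not None:
            zeros.add(ab)
        elif ba is not None:
            zeros.add(ba)
    return eqs, zeros

eqs, zeros = set(), set()
for i, j in product(range(n), repeat=2):
    if i != j:
        e, z = constraints(i, j)
        eqs |= e
        zeros |= z

# Every off-diagonal entry is forced to 0, and all diagonal entries are equal:
assert zeros == {(k, l) for k, l in product(range(n), repeat=2) if k != l}
assert ((0, 0), (1, 1)) in eqs  # a_{11} = a_{22}, and so on
```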

But we can also write an arbitrary $B$ in terms of the $E_{ij}$:
$B = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}$

$AB = A\left(\sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}\right) = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}(AE_{ij})$ by distributivity over matrix addition and compatibility of scalar multiplication, and likewise

$BA = \left(\sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}\right)A = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}(E_{ij}A)$

$AB = BA \implies \beta_{ii}a_{ii} = \beta_{jj}a_{jj}, \forall i,j\in \{1, ... , n\} \space \& \space \beta_{ij}a_{ij} = 0 \space , i \ne j$

Which would give us a system of $n^2$ linear equations:
$\beta_{11}a_{11} = \beta_{11}a_{11}$
$\beta_{11}a_{11} = \beta_{22}a_{22}$
...
$\beta_{11}a_{11} = \beta_{nn}a_{nn}$

$\beta_{22}a_{22} = \beta_{11}a_{11}$
$\beta_{22}a_{22} = \beta_{22}a_{22}$
...
$\beta_{22}a_{22} = \beta_{nn}a_{nn}$

...

$\beta_{nn}a_{nn} = \beta_{11}a_{11}$
$\beta_{nn}a_{nn} = \beta_{22}a_{22}$
...
$\beta_{nn}a_{nn} = \beta_{nn}a_{nn}$

$\implies \beta_{11}a_{11} = \beta_{22}a_{22} = ... = \beta_{nn}a_{nn} = \beta a$ $\implies A = \lambda I$ for $\lambda = \beta a$
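As a numerical sanity check of this conclusion (an ad hoc Python sketch, not part of the proof): a scalar matrix $\lambda I$ commutes with every $B$, while a fixed non-scalar matrix is caught by some $E_{ij}$.

```python
import random

n = 4

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def E(i, j):
    """Matrix unit with a 1 in position (i, j), 0-based."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(n)] for r in range(n)]

lam = 7
scalar = [[lam if i == j else 0 for j in range(n)] for i in range(n)]

random.seed(0)
for _ in range(20):  # lambda*I commutes with random integer matrices
    B = [[random.randint(-9, 9) for _ in range(n)] for _ in range(n)]
    assert matmul(scalar, B) == matmul(B, scalar)

# A non-scalar matrix (here: diagonal with unequal entries) fails to commute
# with some matrix unit E_{ij}:
A = [[i + 1 if i == j else 0 for j in range(n)] for i in range(n)]
fails = [(i, j) for i in range(n) for j in range(n)
         if matmul(A, E(i, j)) != matmul(E(i, j), A)]
assert fails
```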

AltairAC
    Choose any basis for $V$, and let $E_{ij}$ be the matrix with a $1$ in the $ij$th position, and $0$ elsewhere. See what the commutation relation implies about $A$ when you apply it to matrices $B=E_{ij}$ and $B=E_{ij}+E_{ji}$. – Jared May 15 '13 at 22:02
  • Hi Jared, thank you for your answer. I edited my post and wanted to know whether your hint was intended as help toward the other hint I found in the older homework assignment, or whether you have a different proof in mind? – AltairAC May 17 '13 at 22:23
  • 1
    Not only have you concluded that $a_{22}=a_{44}$, but also that $a_{12}=a_{32}=\ldots=a_{45}=0$. In general, you've shown $a_{ij}=0$ if $i\ne j$ and $a_{11}=a_{22}=\ldots=a_{nn}$. This proof is different than the one to which your hint is leading you. – Jared May 17 '13 at 22:28
  • I thought about it and the result can be seen in EDIT 2. For some reason, I doubt that my reasoning is correct but I may be one step further. (?) – AltairAC May 18 '13 at 16:36
  • Thanks everybody, the given link contains excellent answers so I don't have any intentions to edit this question any further! – AltairAC May 18 '13 at 20:14

2 Answers


Let $E_{i,j}$ be the matrix with $1$ in the $i,j$ position and $0$ everywhere else.
Use that $AE_{1,i}=E_{1,i}A$ to show that $a_{1,1}=a_{i,i}$, that $a_{j,1}=0$ for $j=2,3,\ldots,n$, and that $a_{i,j}=0$ for $j\neq i$. (The analogous relations $AE_{i,1}=E_{i,1}A$ take care of the remaining first-row entries $a_{1,j}$, $j\neq 1$.)
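For $n=2$ this can be brute-forced: among small integer matrices, the only ones commuting with both $E_{1,2}$ and $E_{2,1}$ are the scalar matrices. A quick Python sketch (entries range over $-2,\ldots,2$; helper names are ad hoc):

```python
from itertools import product

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

E12 = [[0, 1], [0, 0]]
E21 = [[0, 0], [1, 0]]

# Collect all 2x2 integer matrices with entries in -2..2 that commute with
# both matrix units:
commuting = []
for a, b, c, d in product(range(-2, 3), repeat=4):
    A = [[a, b], [c, d]]
    if matmul2(A, E12) == matmul2(E12, A) and matmul2(A, E21) == matmul2(E21, A):
        commuting.append(A)

# Exactly the scalar matrices lambda*I with lambda in -2..2 survive:
assert commuting == [[[lam, 0], [0, lam]] for lam in range(-2, 3)]
```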

P..

Note that $Tx=ax\implies T(Sx) = S(Tx) = a (Sx)$, and hence $Sx$ is in the $a$-eigenspace of $T$ along with $x$.

Hence if $T$ has one eigenvalue $a$, then by choosing an $S$ that maps a corresponding eigenvector to an arbitrary nonzero $y$, we see that $y$ has the same eigenvalue. Hence $T\propto I$.

Now you just need $T$ to have one eigenvalue. There are many ways to get this. For example, choose $S$ with $n$ distinct eigenvalues. Then $T$ must map each eigenvector of $S$ to a multiple of itself, and hence has not just one but $n$ independent eigenvectors.
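This last step can be illustrated concretely: for $S = \operatorname{diag}(1,2,\ldots,n)$, comparing $(TS)_{ij} = t_{ij}s_j$ with $(ST)_{ij} = s_i t_{ij}$ forces $t_{ij} = 0$ for $i \ne j$ since the $s_i$ are distinct. A small Python sketch checking both directions on examples (ad hoc code, not part of the answer):

```python
n = 4
# S = diag(1, 2, ..., n) has n distinct eigenvalues.
S = [[i + 1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A generic T (with nonzero off-diagonal entries) does not commute with S ...
T = [[10 * i + j + 1 for j in range(n)] for i in range(n)]
assert matmul(T, S) != matmul(S, T)

# ... but any diagonal T does, since then (TS)_{ij} = (ST)_{ij} = t_{ii} s_i
# on the diagonal and 0 elsewhere:
D = [[3 * i + 1 if i == j else 0 for j in range(n)] for i in range(n)]
assert matmul(D, S) == matmul(S, D)
```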

This result is essentially Schur's Lemma.

not all wrong
  • Since Jared posted only comments and you gave a very detailed answer for the second way to prove it, I decided to choose your answer as the accepted answer, thanks! – AltairAC May 18 '13 at 20:18