Let $V$ be a finite-dimensional vector space, $V \ne \{ 0 \}$, and let $A\in L(V)$ be an operator which commutes with every operator in $L(V)$. Show that there is a scalar $\lambda$ such that $A = \lambda I$.
$L(V)$ denotes the set of all linear mappings (linear operators) from $V$ to $V$.
If $B\in L(V)$ is arbitrary and I understood the assignment correctly, this means $AB = BA$, i.e. $AB - BA = 0$.
I believe a matrix representation of the operators would be useful here, but I don't know how to proceed.
EDIT (Hint by Jared)
I tried to work it out on $5\times 5$ matrices: example 1, example 2.
For example, for:
$\begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15}
\\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25}
\\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35}
\\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45}
\\ a_{51} & a_{52} & a_{53} & a_{54} & a_{55}\end{bmatrix}
\begin{bmatrix} 0 & 0 & 0 & 0 & 0
\\ 0 & 0 & 0 & 1 & 0
\\ 0 & 0 & 0 & 0 & 0
\\ 0 & 0 & 0 & 0 & 0
\\ 0 & 0 & 0 & 0 & 0\end{bmatrix} =
\begin{bmatrix} 0 & 0 & 0 & a_{12} & 0
\\ 0 & 0 & 0 & a_{22} & 0
\\ 0 & 0 & 0 & a_{32} & 0
\\ 0 & 0 & 0 & a_{42} & 0
\\ 0 & 0 & 0 & a_{52} & 0\end{bmatrix}$
$\begin{bmatrix} 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0\end{bmatrix} \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} \\ a_{21} & a_{22} & a_{23} & a_{24} & a_{25} \\ a_{31} & a_{32} & a_{33} & a_{34} & a_{35} \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} \\ a_{51} & a_{52} & a_{53} & a_{54} & a_{55}\end{bmatrix} = \begin{bmatrix} 0 & 0 & 0 & 0 & 0 \\ a_{41} & a_{42} & a_{43} & a_{44} & a_{45} \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0\end{bmatrix}$
which means $a_{22}=a_{44}$ (equating the remaining entries also forces the other entries of column $2$ and of row $4$ to be $0$). For a $1$ in the $ij$-th and the $ji$-th position I get $2$ equations (actually $4$, but $2$ of them are equal to the other $2$).
But I still don't understand what I can conclude from that. I reviewed older homework assignments and found the same problem, but with a hint: "Show that $A$ has at least one eigenvalue and consider the corresponding eigenspace. Before that, show that every eigenspace of an operator $T \in L(V)$ is invariant under every operator $S \in L(V)$ which commutes with $T$."
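If I read the first part of that hint correctly, the invariance claim should follow from a one-line computation (a sketch, assuming $ST = TS$ and $Tv = \lambda v$):
$$T(Sv) = S(Tv) = S(\lambda v) = \lambda\,(Sv),$$
so $Sv$ again lies in $\ker(T - \lambda I)$, i.e. the eigenspace of $T$ for $\lambda$ is invariant under $S$.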
EDIT 2
For $B=E_{ij}$ (with $i \ne j$) I conclude that $AB=BA \implies a_{ii}=a_{jj}$, and that the off-diagonal entries of column $i$ and of row $j$ of $A$ are $0$; letting $i, j$ run over all pairs gives $a_{ij} = 0$ whenever $i \ne j$.
(In my example $i = 2$ and $j = 4$, so it turned out that $a_{ii}=a_{22}=a_{44}=a_{jj}$.)
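The general computation behind this, with $\delta$ the Kronecker delta (a sketch; $E_{ij}$ is the matrix with a single $1$ in position $(i,j)$):
$$(AE_{ij})_{kl} = \sum_{m} a_{km}(E_{ij})_{ml} = a_{ki}\,\delta_{jl}, \qquad (E_{ij}A)_{kl} = \sum_{m} (E_{ij})_{km}\,a_{ml} = \delta_{ik}\,a_{jl}.$$
Equating these for all $k, l$: taking $k=i$, $l=j$ gives $a_{ii}=a_{jj}$; taking $k\ne i$, $l=j$ gives $a_{ki}=0$; taking $k=i$, $l\ne j$ gives $a_{jl}=0$.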
For $B = E_{ij} + E_{ji}$, $AB = BA \implies a_{ii} = a_{jj}$, $a_{ij} = a_{ji}$, and $a_{kl} = 0$ for every off-diagonal position $(k,l)$ lying in row $i$, row $j$, column $i$ or column $j$, other than $(i,j)$ and $(j,i)$.
But we can write $B$ in a different way:
$B = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}$
$AB = A\Big(\sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}\Big) = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}(AE_{ij})$ (by distributivity over matrix addition and compatibility with scalar multiplication)
$BA = \Big(\sum_{i=1}^n\sum_{j=1}^n\beta_{ij}E_{ij}\Big)A = \sum_{i=1}^n\sum_{j=1}^n\beta_{ij}(E_{ij}A)$
$AB = BA$ must hold for every choice of the scalars $\beta_{ij}$, so in particular for each $B = E_{ij}$, and by the computation above this forces $a_{ii} = a_{jj}$ for all $i,j\in \{1, ... , n\}$ and $a_{ij} = 0$ for $i \ne j$.
This gives us a system of equations:
$a_{11} = a_{11}$
$a_{11} = a_{22}$
...
$a_{11} = a_{nn}$
$a_{22} = a_{11}$
$a_{22} = a_{22}$
...
$a_{22} = a_{nn}$
...
$a_{nn} = a_{11}$
$a_{nn} = a_{22}$
...
$a_{nn} = a_{nn}$
$\implies a_{11} = a_{22} = ... = a_{nn} = \lambda$ $\implies A = \lambda I$ for $\lambda = a_{11}$
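Not part of the proof, but as a quick numerical sanity check of this conclusion I tried the following (a sketch in Python/numpy; the size $n=5$ and the particular values are arbitrary):

```python
import numpy as np

n = 5
rng = np.random.default_rng(0)

# A scalar matrix A = lambda * I commutes with an arbitrary B.
lam = 3.0
A = lam * np.eye(n)
B = rng.standard_normal((n, n))
print(np.allclose(A @ B, B @ A))        # True

# A matrix that is diagonal but not scalar already fails to
# commute with the E_{24} from the example above (a_22 != a_44).
A2 = np.diag(np.arange(1.0, n + 1))
E24 = np.zeros((n, n))
E24[1, 3] = 1.0                         # 0-based index for position (2,4)
print(np.allclose(A2 @ E24, E24 @ A2))  # False
```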