Let $A$ and $B$ be $n\times n$ matrices with complex entries such that $AB - BA$ is a linear combination of $A$ and $B$.
I'd like to prove that there exists a non-zero vector $v$ that is an eigenvector of both $A$ and $B$.
It is assumed throughout that the given matrices $A$ and $B$ are linearly independent in the $\mathbb C$-vector space $M_n(\mathbb C)$, thus excluding the unremarkable case in which one matrix is a scalar multiple of the other.
Next, observe that the case of a vanishing commutator is classical: if $AB-BA=0$, then $A$ and $B$ commute, and commuting complex matrices always share an eigenvector. Hence consider $$0\:\neq\:[A,B]\:=\:AB-BA\:=\:\alpha A+\beta B$$ where at least one of the scalars $\alpha,\beta$ is not zero, say $\alpha\neq0$ (otherwise swap the roles of $A$ and $B$). Then set $$B'=\frac1\alpha B\;\text{ and }\; A'= A+\beta B'\quad\text{to obtain }\; \big[\,A',B'\big]\:=\:A'\,.$$ Thus in the sequel we may start w.l.o.g. from $\,\mathbf{AB-BA=A\qquad\qquad (\star)}$
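As a quick numerical sanity check of this normalization, here is a sketch in Python/numpy; the concrete $2\times2$ pair below (with $\alpha=2$, $\beta=3$) is my own illustrative choice, not part of the argument:

```python
import numpy as np

# Illustrative pair with AB - BA = alpha*A + beta*B, alpha = 2, beta = 3.
alpha, beta = 2.0, 3.0
A = np.array([[1.5, 1.0],
              [0.0, -1.5]])
B = np.array([[-1.0, 0.0],
              [0.0, 1.0]])

assert np.allclose(A @ B - B @ A, alpha * A + beta * B)

# Normalize as in the text: B' = B/alpha, A' = A + beta*B'.
Bp = B / alpha
Ap = A + beta * Bp
assert np.allclose(Ap @ Bp - Bp @ Ap, Ap)   # the relation (*): [A', B'] = A'
```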
I decided to spill some more MathJax ink here and now, giving examples and a bit of Lie algebra background as well. Anyone unwilling to go through all this may simply jump ahead to The shared eigenvector!
Examples
of matrices satisfying the relation $(\star)$:
If $\,n=2$, then $\,A=\begin{pmatrix}0&1\\ 0&0\end{pmatrix}\:$ and $\;B=\frac 12\begin{pmatrix} -1&0\\ 0&1\end{pmatrix}\;$ do the job. $\:\begin{pmatrix}1\\ 0\end{pmatrix}$ is a joint eigenvector.
For $n=3$ one may consider $$A\:=\:\begin{pmatrix}0&a_2&a_3\\ 0&0&0\\ 0&0&0\end{pmatrix}\;,\quad B\:=\:\begin{pmatrix}b&0&0\\ 0&b+1&0\\ 0&0&b+1\end{pmatrix}$$ where $a_2,a_3,b\in\mathbb C$, and $a_2,a_3$ are not both zero.
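Both the relation $(\star)$ and the existence of a joint eigenvector for this $n=3$ family can be checked numerically; the particular values of $a_2,a_3,b$ below are arbitrary illustrative choices of mine:

```python
import numpy as np

a2, a3, b = 2.0, -1.0, 0.5   # arbitrary, with (a2, a3) != (0, 0)
A = np.array([[0, a2, a3],
              [0, 0,  0],
              [0, 0,  0]])
B = np.diag([b, b + 1, b + 1])

assert np.allclose(A @ B - B @ A, A)   # the relation (*)

# e1 is a joint eigenvector: A e1 = 0 and B e1 = b e1.
e1 = np.array([1.0, 0.0, 0.0])
assert np.allclose(A @ e1, 0 * e1)
assert np.allclose(B @ e1, b * e1)
```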
It is no accident that $A$ is represented by a nilpotent matrix in these examples, since this cannot be avoided:
For each $n\in\mathbb N$ one has
$$\big[\,A^n,B\big]\:=\:n\,A^n,$$
the proof is a straightforward induction on $n$ using the derivation property $\,[AA^{n-1},B] = A[A^{n-1},B] + [A,B]\,A^{n-1}$. Now let $\|\cdot\|$ be a submultiplicative matrix norm; then
$$n\|A^n\|\:=\:\|A^nB-BA^n\|\;\leq\;2\|B\| \|A^n\|\quad\forall\,n$$
which implies $A^n=0\,$ at the latest when $\,n>2\|B\|$. Thus $A$ is nilpotent.
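The identity $[A^n,B]=n\,A^n$ and the resulting nilpotency can be illustrated with a larger pair satisfying $(\star)$; the $4\times4$ shift-plus-diagonal pair below is my own example:

```python
import numpy as np

# A 4x4 pair satisfying (*): A is the upper shift matrix, B = diag(0,1,2,3).
n = 4
A = np.diag(np.ones(n - 1), k=1)
B = np.diag(np.arange(n, dtype=float))
assert np.allclose(A @ B - B @ A, A)

# [A^k, B] = k * A^k for every k; A^n = 0 confirms nilpotency.
Ak = np.eye(n)
for k in range(1, n + 1):
    Ak = Ak @ A                               # Ak = A^k
    assert np.allclose(Ak @ B - B @ Ak, k * Ak)
assert np.allclose(Ak, 0)                     # A^4 = 0
```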
The shared eigenvector
for $B$ and $A$ does exist: Pick an eigenvector $w\neq 0$ for an eigenvalue $\mu$ of $B$ such that $\,\mu-1\,$ is not an eigenvalue of $B$; such a $\mu$ exists because $B$ has only finitely many eigenvalues. Insert $w$ into $(\star)$:
$$BAw\:=\:ABw - Aw\:=\:(\mu-1)Aw\quad\Longrightarrow\; Aw\:=\:0\,,$$
Indeed, if $Aw$ were nonzero, it would be an eigenvector of $B$ for the eigenvalue $\mu-1$, contradicting the choice of $\mu$. Thus $Aw=0$, and $w$ is also an eigenvector of $A$, for the eigenvalue zero.
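The eigenvector argument translates directly into a small search procedure; the helper below is a sketch of my own under the assumption $(\star)$, tested on an illustrative shift/diagonal pair:

```python
import numpy as np

def shared_eigenvector(A, B, tol=1e-8):
    """Assuming [A, B] = A, return w with A w = 0 and B w = mu w:
    pick an eigenvalue mu of B such that mu - 1 is not an eigenvalue of B."""
    mus, vecs = np.linalg.eig(B)
    for i, mu in enumerate(mus):
        if np.all(np.abs(mus - (mu - 1)) > tol):   # mu - 1 not in spec(B)
            return vecs[:, i]
    raise ValueError("no suitable eigenvalue; is [A, B] = A satisfied?")

# Illustrative 4x4 pair satisfying (*): upper shift matrix and diag(0,1,2,3).
A = np.diag(np.ones(3), k=1)
B = np.diag(np.arange(4.0))
w = shared_eigenvector(A, B)
assert np.allclose(A @ w, 0)          # eigenvector of A, eigenvalue 0
mu = w.conj() @ B @ w                 # Rayleigh quotient; eig returns unit vectors
assert np.allclose(B @ w, mu * w)     # eigenvector of B
```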
Lie algebra background
$A$ and $B$ span a complex Lie algebra with Lie bracket given by $(\star)$. Up to (Lie algebra) isomorphism it is the unique two-dimensional nonabelian complex Lie algebra; it is solvable, but not nilpotent.
Lie's theorem guarantees, in any finite-dimensional complex representation of a solvable Lie algebra, the existence of a shared eigenvector for all elements. One consequence is that the matrices of such a representation may be simultaneously triangularized.
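For a pair satisfying $(\star)$, the shared-eigenvector argument can be applied recursively to triangularize both matrices simultaneously, by deflation as in the proof of Schur's theorem. The following is a sketch of my own (function names and the test pair are illustrative, not canonical):

```python
import numpy as np

def common_eigvec(A, B, tol=1e-8):
    # Assuming [A, B] = A: an eigenvector w of B whose eigenvalue mu has
    # mu - 1 outside spec(B) satisfies A w = 0 (the argument above).
    mus, vecs = np.linalg.eig(B)
    for i, mu in enumerate(mus):
        if np.all(np.abs(mus - (mu - 1)) > tol):
            return vecs[:, i]

def triangularize(A, B):
    """Unitary Q such that Q* A Q and Q* B Q are both upper triangular,
    assuming [A, B] = A (recursive deflation on a common eigenvector)."""
    n = A.shape[0]
    if n == 1:
        return np.eye(1, dtype=complex)
    w = common_eigvec(A, B)
    # Unitary whose first column spans w (QR of [w | I]).
    Q1, _ = np.linalg.qr(np.column_stack([w, np.eye(n)]).astype(complex))
    At = Q1.conj().T @ A @ Q1
    Bt = Q1.conj().T @ B @ Q1
    Q2 = triangularize(At[1:, 1:], Bt[1:, 1:])   # sub-pair again satisfies (*)
    Qs = np.eye(n, dtype=complex)
    Qs[1:, 1:] = Q2
    return Q1 @ Qs

# Test pair satisfying (*): shift/diagonal, hidden by a random similarity.
rng = np.random.default_rng(0)
S = rng.normal(size=(4, 4))
A0 = np.diag(np.ones(3), k=1)
B0 = np.diag(np.arange(4.0))
A = S @ A0 @ np.linalg.inv(S)
B = S @ B0 @ np.linalg.inv(S)
assert np.allclose(A @ B - B @ A, A)             # (*) survives conjugation

Q = triangularize(A.astype(complex), B.astype(complex))
for M in (Q.conj().T @ A @ Q, Q.conj().T @ B @ Q):
    assert np.allclose(np.tril(M, -1), 0, atol=1e-6)   # both upper triangular
```

The deflation is justified by the block structure: after conjugating by $Q_1$, the first column of $Q_1^*AQ_1$ is zero (since $Aw=0$) and that of $Q_1^*BQ_1$ is $\mu e_1$, so the lower-right blocks again satisfy $(\star)$.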