
Let $A$ and $B$ be $n\times n$ matrices with complex entries such that $AB - BA$ is a linear combination of $A$ and $B$.

I'd like to prove that there exists a non-zero vector $v$ that is an eigenvector of both $A$ and $B$.

  • Anything you tried to solve this problem? – Hetebrij Nov 29 '15 at 22:17
  • So, I know that eigenvectors are the solutions such that the determinant is zero when you know the eigenvalues to each. I assume that, since the matrices are distinct, their eigenvalues will have to be different for that matching eigenvector. Otherwise, I am very lost about why the linear combination of A and B is important. – Mark Frazier Nov 29 '15 at 22:27
  • What is given means $AB-BA=\lambda A+\mu B$. Use that – Denis Düsseldorf Nov 29 '15 at 22:33
  • The linear combination of $A$ and $B$ is important because in general we do not know if $A$ and $B$ are diagonalizable w.r.t. the same basis. However, in the more general case that $AB-BA=0$, we do know that $A$ and $B$ are diagonalizable w.r.t. the same basis. – Hetebrij Nov 29 '15 at 23:41
  • In fact $A,B$ are simultaneously triangularizable. If you know Lie theory, then it is very easy. Otherwise do as follows. 1. Reduce the problem to the case $AB-BA=A$. 2. Calculate, by a recurrence reasoning, $A^kB-BA^k$. 3. Show that $A$ is nilpotent. 4. Consider $\ker(A)$. –  Dec 03 '15 at 15:13

1 Answer


It is assumed throughout that

The given matrices $A$ and $B$ are linearly independent in the $\mathbb C$-vector space $M_n(\mathbb C)$,
thus excluding the unremarkable case where one matrix is a scalar multiple of the other (there every eigenvector of one matrix trivially is an eigenvector of both).

Next, assume that the commutator of $A$ and $B$ does not vanish (for commuting matrices the existence of a common eigenvector is a standard fact), hence consider $$0\:\neq\:[A,B]\:=\:AB-BA\:=\:\alpha A+\beta B\,,$$ where at least one of the scalars $\alpha,\beta$ is non-zero. We may assume $\alpha\neq0$, swapping $A$ and $B$ (which only flips the sign of the commutator) if necessary. Then set $$B'=\frac1\alpha B\;\text{ and }\; A'= A+\beta B'\quad\text{to obtain }\; \big[\,A',B'\big]\:=\:A'\,.$$ Thus in the sequel we may start w.l.o.g. from $\,\mathbf{AB-BA=A}\,.\qquad\qquad (\star)$
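The reduction can be sanity-checked numerically. Below is a minimal plain-Python sketch; the concrete $2\times 2$ matrices and the scalars $\alpha=2$, $\beta=4$ are illustrative choices of mine, not part of the argument.

```python
# Sanity check of the reduction step, in plain Python (no libraries).
# alpha = 2, beta = 4 and the 2x2 matrices are illustrative choices.

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def smul(c, X):
    return [[c * x for x in row] for row in X]

def comm(X, Y):  # the commutator [X, Y] = XY - YX
    return add(mul(X, Y), smul(-1.0, mul(Y, X)))

alpha, beta = 2.0, 4.0
# A pair satisfying [A1, B1] = A1 ...
A1 = [[0.0, 1.0], [0.0, 0.0]]
B1 = [[-0.5, 0.0], [0.0, 0.5]]
# ... is used to build A, B with [A, B] = alpha*A + beta*B:
A = add(A1, smul(-beta, B1))
B = smul(alpha, B1)
assert comm(A, B) == add(smul(alpha, A), smul(beta, B))

# The substitution B' = B/alpha, A' = A + beta*B' restores [A', B'] = A'.
Bp = smul(1.0 / alpha, B)
Ap = add(A, smul(beta, Bp))
assert comm(Ap, Bp) == Ap
```

The scalars are powers of two, so all floating-point arithmetic in the check is exact.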

I decided to spill some more MathJax ink here, giving examples and a bit of Lie algebra background as well. Anyone not willing to go through all this may simply jump to The shared eigenvector below!

Examples
of how to satisfy the relation $(\star)$ above:

  • If $\,n=2$, then $\,A=\begin{pmatrix}0&1\\ 0&0\end{pmatrix}\:$ and $\;B=\frac 12\begin{pmatrix} -1&0\\ 0&1\end{pmatrix}\;$ do the job. $\:\begin{pmatrix}1\\ 0\end{pmatrix}$ is a joint eigenvector.

  • For $n=3$ one may consider $$A\:=\:\begin{pmatrix}0&a_2&a_3\\ 0&0&0\\ 0&0&0\end{pmatrix}\;,\quad B\:=\:\begin{pmatrix}b&0&0\\ 0&b+1&0\\ 0&0&b+1\end{pmatrix}$$ where $a_2,a_3,b\in\mathbb C$, and $a_2,a_3$ are not both zero.
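Both examples can be verified in a few lines of plain Python; the concrete values $a_2=1$, $a_3=2$, $b=\tfrac12$ in the $n=3$ case are illustrative choices of mine.

```python
# Plain-Python verification of both examples against AB - BA = A.
# a2 = 1, a3 = 2, b = 0.5 are illustrative choices for the n = 3 case.

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def comm(X, Y):  # [X, Y] = XY - YX
    P, Q = mul(X, Y), mul(Y, X)
    return [[p - q for p, q in zip(rp, rq)] for rp, rq in zip(P, Q)]

# n = 2 example; (1, 0)^T is a joint eigenvector.
A2 = [[0.0, 1.0], [0.0, 0.0]]
B2 = [[-0.5, 0.0], [0.0, 0.5]]
assert comm(A2, B2) == A2

# n = 3 example.
a2, a3, b = 1.0, 2.0, 0.5
A3 = [[0.0, a2, a3], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
B3 = [[b, 0.0, 0.0], [0.0, b + 1, 0.0], [0.0, 0.0, b + 1]]
assert comm(A3, B3) == A3

# e1 = (1, 0, 0)^T is a joint eigenvector: A3 e1 = 0 and B3 e1 = b e1.
e1 = [1.0, 0.0, 0.0]
assert [sum(r * v for r, v in zip(row, e1)) for row in A3] == [0.0, 0.0, 0.0]
assert [sum(r * v for r, v in zip(row, e1)) for row in B3] == [b, 0.0, 0.0]
```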

It is not by accident that $A$ is represented by a nilpotent matrix since this cannot be avoided:
For each $n\in\mathbb N$ one has $$\big[\,A^n,B\big]\:=\:n\,A^n;$$ the proof by induction, using the derivation property $\,[AA^{n-1},B] = A[A^{n-1},B] + [A,B]\,A^{n-1}$, is straightforward. Let $\|\cdot\|$ be a submultiplicative matrix norm; then $$n\,\|A^n\|\:=\:\|A^nB-BA^n\|\;\leq\;2\,\|B\|\,\|A^n\|\quad\forall\,n\,,$$ which forces $A^n=0$ as soon as $n>2\|B\|$. Thus $A$ is nilpotent.
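The relation $[A^k,B]=k\,A^k$ and the nilpotency can be watched in action on a small example of my own making: $A=E_{12}+E_{23}$, $B=\operatorname{diag}(b,\,b+1,\,b+2)$ with $b=\tfrac12$ satisfies $AB-BA=A$, and here $A^3=0$.

```python
# Plain-Python illustration of [A^k, B] = k A^k and of the nilpotency of A.
# The pair A = E12 + E23, B = diag(b, b+1, b+2), b = 0.5 is an illustrative
# example of mine satisfying AB - BA = A; its nilpotency index is 3.

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def comm(X, Y):  # [X, Y] = XY - YX
    P, Q = mul(X, Y), mul(Y, X)
    return [[p - q for p, q in zip(rp, rq)] for rp, rq in zip(P, Q)]

def smul(c, X):
    return [[c * x for x in row] for row in X]

b = 0.5
A = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 0.0]]
B = [[b, 0.0, 0.0],
     [0.0, b + 1, 0.0],
     [0.0, 0.0, b + 2]]
assert comm(A, B) == A                  # the relation (star): AB - BA = A

Asq = mul(A, A)
Acube = mul(Asq, A)
assert comm(Asq, B) == smul(2.0, Asq)   # [A^2, B] = 2 A^2
Z = [[0.0, 0.0, 0.0] for _ in range(3)]
assert Asq != Z and Acube == Z          # A is nilpotent of index 3
```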

The shared eigenvector
for $B$ and $A$ does exist: Pick an eigenvector $w\neq 0$ to an eigenvalue $\mu$ of $B$ such that $\,\mu-1\,$ is not an eigenvalue of $B$; such a $\mu$ exists because $B$ has only finitely many eigenvalues (take e.g. one of minimal real part). Inserting $w$ into $(\star)$ yields $$BAw\:=\:ABw - Aw\:=\:(\mu-1)Aw\,,$$ so a non-zero $Aw$ would be an eigenvector of $B$ to the eigenvalue $\mu-1$, which is impossible. Hence $Aw=0$, and $w$ is also an eigenvector of $A$, to the eigenvalue zero.
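The selection of $\mu$ can be made concrete with the same illustrative pair as above (my own example, $A=E_{12}+E_{23}$, $B=\operatorname{diag}(\tfrac12,\tfrac32,\tfrac52)$, which satisfies $AB-BA=A$): only $\mu=\tfrac12$ has $\mu-1$ outside the spectrum, and its eigenvector is indeed killed by $A$.

```python
# Plain-Python illustration of the choice of mu in the eigenvector argument,
# using the illustrative pair A = E12 + E23, B = diag(0.5, 1.5, 2.5).

def matvec(X, v):
    return [sum(x * y for x, y in zip(row, v)) for row in X]

A = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 0.0]]
eigs = [0.5, 1.5, 2.5]                       # spectrum of the diagonal B
B = [[eigs[i] if i == j else 0.0 for j in range(3)] for i in range(3)]

# Pick an eigenvalue mu with mu - 1 not in the spectrum; one always exists,
# e.g. an eigenvalue of minimal real part.
mu = min(eigs)
assert mu - 1 not in eigs

w = [1.0, 0.0, 0.0]                          # eigenvector of B for mu = 0.5
assert matvec(B, w) == [mu * x for x in w]   # B w = mu w
assert matvec(A, w) == [0.0, 0.0, 0.0]       # A w = 0: w is a joint eigenvector
```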

Lie algebra background
$A$ and $B$ span a complex Lie algebra with Lie bracket given by $(\star)$. Up to (Lie algebra) isomorphism it is the unique 2-dimensional non-abelian complex Lie algebra; it is solvable, but not nilpotent.
Lie's theorem guarantees, in any finite-dimensional complex matrix representation, the existence of a shared eigenvector for all elements of a solvable Lie algebra. One consequence is that all matrices of such a representation may simultaneously be brought into triangular form.


Hanno
  • 6,302