5

I'm trying to prove the following:

If $S\colon V\to V$ and $T\colon V\to V$ are unitary linear transformations on a unitary (finite-dimensional complex inner product) space $V$ with $\dim V=n$, such that $ST=TS$, then they have a joint eigenvector basis (i.e., there is a basis of $V$ composed of eigenvectors of both $S$ and $T$, not necessarily with the same eigenvalue for each).

Can anyone help me out? I've tried rephrasing the 'matrix equivalent' of the theorem, but I didn't get much further.

Thanks!

  • Hint: Since $S$ and $T$ commute, show that each eigenspace of $S$ is $T$-invariant (that is, if $Sv = \lambda v$, we have $S(Tv) = \lambda (Tv)$). – Geoff Robinson Aug 12 '11 at 22:49
  • This can be strengthened to the conclusion that the basis is orthonormal. @iroiroaru: Are you able and willing to use the fact that a single unitary transformation has an eigenvector (orthonormal) basis? – Jonas Meyer Aug 12 '11 at 22:52
  • Hi Geoff, I've actually realized that but I wasn't able to see how it "helps me out"... I guess I just have no idea how to start building the actual basis. Jonas: yes, certainly, we've covered it in class. – iroiroaru Aug 12 '11 at 22:52
  • @iroiroaru: It's difficult for me to go much further without telling you the whole answer. As Jonas says, you really need to use the fact that a single unitary linear transformation has an orthonormal basis of eigenvectors. – Geoff Robinson Aug 12 '11 at 23:06
  • BTW there is a really beautiful abstract proof of a somewhat more general statement here: http://planetmath.org/encyclopedia/CommutingMatrices.html – John M Aug 13 '11 at 11:21

2 Answers

5

Based on what you've already covered in class, there is an orthonormal basis with respect to which $S$ has matrix

$$A= \begin{pmatrix} \lambda_1 I_{k_1} & 0 & \cdots & 0 \\ 0 & \lambda_2 I_{k_2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_m I_{k_m} \end{pmatrix},$$

where $k_i$ is the dimension of the eigenspace for the eigenvalue $\lambda_i$ of $S$. If $B$ is the matrix of $T$ with respect to this basis, then because $AB=BA$ you have

$$B= \begin{pmatrix} B_{1} & 0 & \cdots & 0 \\ 0 & B_{2} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & B_{m} \end{pmatrix},$$

where $B_i$ is a $k_i$-by-$k_i$ matrix (e.g., see here); the off-diagonal blocks vanish because $AB=BA$ forces $B$ to map each eigenspace of $A$ into itself. Since each $B_i$ is unitary, each can be unitarily diagonalized. Note that doing so leaves $A$ unchanged.
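As a concrete illustration, here is a minimal NumPy sketch of this procedure (the function name `common_eigenbasis`, the test matrices, and the tolerances are purely illustrative, not from the answer): diagonalize $S$ first, then diagonalize the block of $T$ sitting on each eigenspace of $S$.

```python
import numpy as np

def common_eigenbasis(S, T, tol=1e-8):
    """Columns of the result are simultaneous eigenvectors of S and T.

    Assumes S and T are unitary and commute; names and tolerances are
    illustrative only.
    """
    n = S.shape[0]
    s_vals, s_vecs = np.linalg.eig(S)        # S is normal, hence diagonalizable
    cols = []
    remaining = list(range(n))
    while remaining:
        # Indices whose eigenvalue matches the first remaining one:
        # their eigenvectors span one eigenspace V_i of S.
        lam = s_vals[remaining[0]]
        group = [j for j in remaining if abs(s_vals[j] - lam) < tol]
        remaining = [j for j in remaining if j not in group]
        # Orthonormal basis Q (columns) of the eigenspace V_i.
        Q, _ = np.linalg.qr(s_vecs[:, group])
        # Matrix of T restricted to V_i; it is unitary because V_i is
        # T-invariant, so it can itself be diagonalized.
        B = Q.conj().T @ T @ Q
        _, W = np.linalg.eig(B)
        # Columns of Q @ W are eigenvectors of T lying inside V_i,
        # hence also eigenvectors of S.
        cols.append(Q @ W)
    return np.hstack(cols)

# Small check: build commuting unitaries from a shared eigenbasis X.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
S = X @ np.diag(np.exp(2j * np.pi * np.array([0.1, 0.1, 0.3, 0.7]))) @ X.conj().T
T = X @ np.diag(np.exp(2j * np.pi * np.array([0.2, 0.5, 0.5, 0.9]))) @ X.conj().T
U = common_eigenbasis(S, T)
for k in range(4):
    v = U[:, k]
    for M in (S, T):
        mu = (v.conj() @ M @ v) / (v.conj() @ v)   # Rayleigh quotient = eigenvalue
        assert np.linalg.norm(M @ v - mu * v) < 1e-6
```

The asserts at the end just check numerically that each column of the returned matrix is an eigenvector of both $S$ and $T$.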

Jonas Meyer
3

Jonas's answer was excellent and helped me a lot, but today I thought of a different approach, and it'd be nice if you fellows could tell me whether it works:

Let $V_1, V_2, \ldots, V_k$ be the eigenspaces pertaining to the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$ of $S$. Since $S$ and $T$ commute, each $V_i$ is $T$-invariant, i.e. $T(V_i)\subseteq V_i$, which means the restriction of $T$ to the eigenspace $V_i$, $T_i\colon V_i\to V_i$, is also a unitary transformation. Consequently $T_i$ has an orthonormal eigenvector basis of $V_i$. Because $S$ is unitary, eigenspaces belonging to distinct eigenvalues are orthogonal to one another, so the union of these bases is an orthonormal basis of $V$ consisting of eigenvectors of $T$; and since each of its vectors lies in some $V_i$, it is an eigenvector basis of $S$ as well.
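To spell out the orthogonality claim used above (with the inner product taken linear in the first argument): if $u\in V_i$ and $w\in V_j$ with $i\neq j$, then, since $S$ is unitary,

$$\langle u, w\rangle = \langle Su, Sw\rangle = \langle \lambda_i u, \lambda_j w\rangle = \lambda_i\overline{\lambda_j}\,\langle u, w\rangle,$$

and $\lambda_i\overline{\lambda_j}\neq 1$ because $|\lambda_j|=1$ and $\lambda_i\neq\lambda_j$, so $\langle u, w\rangle = 0$.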

Is this proof valid-looking? Thanks!

  • It seems to me that what you say is correct, and is the same as what Jonas says. What's the difference that you see? – Pierre-Yves Gaillard Aug 13 '11 at 11:40
  • I suppose to a more experienced person it would seem like the same argument, though I can't entirely see it (hey, this /is/ mathematics for all levels ;). I suppose the main difference is that this is the 'linear transform language' equivalent to Jonas's idea, which I personally found a bit less complex than the matrix proof. I was just posting it to see if I have it right. – iroiroaru Aug 13 '11 at 11:56
  • To make a judgement, I should read carefully your question, Jonas’s answer (with the links), and your answer. I haven’t really done that, but what you say strikes me as highly sensible. In particular I couldn’t agree more with what you said about linear transforms vs matrices. I'll say: you're on the right track! (I voted for your answer and question.) – Pierre-Yves Gaillard Aug 13 '11 at 12:01
  • Thank you for your feedback! – iroiroaru Aug 13 '11 at 12:04
  • @iroiroaru: This is the sort of proof I was trying to point you towards in my earlier hint, and is correct: succinctly, $T$ has an orthonormal basis of eigenvectors on each eigenspace of $S$. Put all these together, and you get an orthonormal basis which consists of vectors which are simultaneously eigenvectors for $T$ and for $S$. – Geoff Robinson Aug 13 '11 at 14:08