4

I am required to prove that if $S$ and $T$ are linear operators on a vector space $V$, then $ST$ and $TS$ have the same eigenvalues. Could you please provide some hints to get me going without revealing the complete solution?

In addition, it would be helpful if you did not refer to characteristic polynomials or determinants in your answer.

  • 3
    Assuming we are dealing with finite-dimensional vector spaces, invertible linear operators form a dense set. If $S$ is invertible then $ST$ and $TS$ are conjugate via $TS=S^{-1}(ST)S$, so they share the same spectrum (see the numerical sketch after these comments). – Jack D'Aurizio Feb 25 '18 at 18:14
  • @JackD'Aurizio that is a beautiful argument – Andres Mejia Feb 25 '18 at 18:20
  • 1
    @Jack: nice argument, but I don't think it works if $V$ is over a finite field and $S$ and $T$ aren't invertible. – Peter Shor Feb 25 '18 at 18:22
  • @PeterShor: the point is that by density you may always assume that at least one operator between $S$ and $T$ is invertible. Of course you have a point about finite fields. – Jack D'Aurizio Feb 25 '18 at 18:25
  • 2
    If $ST\vec v =\lambda \vec v$ then $\lambda T\vec v = T(\lambda \vec v) = T(ST\vec v) = TS(T\vec v)$ – lulu Feb 25 '18 at 18:26
  • @lulu Almost. There's actually a gap there - see the answer I'm about to post. – David C. Ullrich Feb 25 '18 at 18:34
  • @DavidC.Ullrich Yeah, it felt a bit too easy. I think I have to handle $\lambda = 0 $ separately. But of course the case $\lambda = 0$ follows at once from the determinant. – lulu Feb 25 '18 at 18:35
  • 1
    @JackD'Aurizio Even over a field where "density" makes sense, the spectrum of the limit need not be the limit of the spectra. Consider $V=\Bbb R^2$, $S_n=\begin{bmatrix}1&1/n\\-1/n&1\end{bmatrix}$. – David C. Ullrich Feb 25 '18 at 18:59
  • @DavidC.Ullrich: what is the issue? $\{1+\frac{i}{n},1-\frac{i}{n}\}$ converges to $\{1,1\}$. – Jack D'Aurizio Feb 25 '18 at 19:47
  • @JackD'Aurizio $S_n$, as an operator on $\Bbb R^2$, has no eigenvalues. – David C. Ullrich Feb 25 '18 at 20:07
  • @DavidC.Ullrich: there might be some differences about nomenclature here, but aren't eigenvalues elements of the algebraic closure of the underlying field, by definition? – Jack D'Aurizio Feb 25 '18 at 20:13
  • @JackD'Aurizio That's the first time I've seen the word defined that way. The standard definition is that if $T:V\to V$ then an eigenvector is a vector $x\in V$ such that etc. Yes, when people are studying real matrices they often implicitly regard them as complex matrices, but that doesn't change the definition. – David C. Ullrich Feb 25 '18 at 20:22
  • @JackD'Aurizio Another issue: Yes, for complex matrices the spectrum of the limit is the limit of the spectra. Do you know an elementary proof of that fact? Whether you regard the example I gave as valid or not, it does show, I think, that the proof can't be entirely trivial. (The only proof that springs to my mind uses Rouché's theorem. Never mind that the OP said he wanted to avoid determinants and the characteristic polynomial...) – David C. Ullrich Feb 25 '18 at 20:25
  • @DavidC.Ullrich: hmm, you're right. The statement that the roots of a polynomial are continuous functions of its coefficients can be proved through Rouché's theorem, the Newton–Girard formulas, or the implicit function theorem, but these all involve a determinant (as a Jacobian or as a by-product of Cramer's rule) at some point. – Jack D'Aurizio Feb 25 '18 at 20:36
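As a quick numerical check of the conjugation argument in the comments above, here is a sketch in numpy (the random matrices are arbitrary choices; a generic random $S$ is invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
S = rng.standard_normal((n, n))  # a generic random S is invertible
T = rng.standard_normal((n, n))

# If S is invertible, TS = S^{-1}(ST)S, so ST and TS are similar.
conjugated = np.linalg.solve(S, S @ T) @ S
print(np.allclose(conjugated, T @ S))  # True

# Similar operators have the same spectrum.
ev_ST = np.sort_complex(np.linalg.eigvals(S @ T))
ev_TS = np.sort_complex(np.linalg.eigvals(T @ S))
print(np.allclose(ev_ST, ev_TS))  # True
```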

2 Answers

9

Say $\lambda$ is an eigenvalue of $ST$; there exists $x\ne0$ such that $$STx=\lambda x.$$

If you let $y=Tx$ then it follows that $$TSy=\lambda y.$$

No, that's not a proof, because $TSy=\lambda y$ does not show that $\lambda$ is an eigenvalue of $TS$. Exercise, which you should do before reading on: why not?

Why not: an eigenvector must be nonzero, so we need to know that $y=Tx\ne0$ before we can conclude that $\lambda$ is an eigenvalue of $TS$.
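For a concrete instance of the gap, take $V=\Bbb R^2$ with $$S=\begin{bmatrix}0&1\\0&0\end{bmatrix},\qquad T=\begin{bmatrix}0&0\\1&0\end{bmatrix}.$$ Then $STe_2=0$, so $x=e_2$ is an eigenvector of $ST$ with $\lambda=0$, and yet $y=Tx=0$, which is not an eigenvector of anything.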

The actual proof splits into two cases.

First assume $\lambda\ne0$. Then the argument above is fine: $STx=\lambda x\ne0$, hence $y=Tx\ne0$ (if $Tx$ were $0$ then $STx$ would be $0$ too).

Now assume $0$ is an eigenvalue of $ST$. This says precisely that $ST$ is not invertible. Hence $S$ and $T$ cannot both be invertible, hence (at least in the finite-dimensional case) $TS$ is not invertible, so $0$ is an eigenvalue of $TS$.

(If $TS$ is invertible then $T$ must be surjective and $S$ must be injective; hence in the finite-dimensional case they are both invertible.)
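Both cases can be watched in action with a small numpy sketch (the matrices here are arbitrary; zeroing a column of $T$ just forces it, and hence $ST$ and $TS$, to be singular):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
S = rng.standard_normal((n, n))
T = rng.standard_normal((n, n))
T[:, 0] = 0.0  # T is now singular, so 0 is an eigenvalue of ST and of TS

vals, vecs = np.linalg.eig(S @ T)
for lam, x in zip(vals, vecs.T):
    if not np.isclose(lam, 0.0):
        y = T @ x  # nonzero, since S(Tx) = lam*x != 0
        print(np.allclose(T @ S @ y, lam * y))  # True: TSy = lam*y

# The lam = 0 case: TS is singular too, so 0 is an eigenvalue of TS as well.
print(np.isclose(np.linalg.eigvals(T @ S), 0.0).any())  # True
```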

Note We need to assume $V$ has finite dimension or the result is false. Let $V$ be the space of all one-sided sequences $v=(v_1,\dots)$; let $Sv=(v_2,v_3,\dots)$ and $Tv=(0,v_1,v_2,\dots)$. Then $ST$ is the identity but $TS$ has $0$ for an eigenvalue.
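This counterexample is easy to check mechanically. Here is a sketch encoding one-sided sequences as functions of the index:

```python
# A sequence (v_1, v_2, ...) is modeled as a function from index 0, 1, 2, ... to values.
def S(v):  # left shift: (v_1, v_2, ...) -> (v_2, v_3, ...)
    return lambda i: v(i + 1)

def T(v):  # right shift: (v_1, v_2, ...) -> (0, v_1, v_2, ...)
    return lambda i: 0 if i == 0 else v(i - 1)

v = lambda i: i + 1  # the sequence (1, 2, 3, ...)

# ST is the identity:
print([S(T(v))(i) for i in range(5)])  # [1, 2, 3, 4, 5]

# TS is not injective: it kills e_1 = (1, 0, 0, ...), so 0 is an eigenvalue.
e1 = lambda i: 1 if i == 0 else 0
print([T(S(e1))(i) for i in range(5)])  # [0, 0, 0, 0, 0]
```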

  • So $TS$ is not invertible and therefore $0$ is an eigenvalue, but what is preventing the case $TS\,0 = 0\cdot 0$? How does an eigenvalue of $0$ guarantee that $y$ is not $0$? – ngc1300 Jun 30 '19 at 00:52
  • @pmac Do you know the definition of the word "eigenvalue"? – David C. Ullrich Jun 30 '19 at 01:06
  • Yes: $Tv = av$ with $v \ne 0$. I am not seeing your point, because you are saying assume $0$ is an eigenvalue of $ST$. OK, so then we have $STx = 0$, but how does this assure us that $Tx$ isn't $0$? Sorry, I am just not seeing it. – ngc1300 Jun 30 '19 at 01:17
  • @pmac I can't figure out what page you're on. Because I don't see anything about $ST=0$ above; asking how I know $Tx\ne0$ would make sense if I were saying $0$ is an eigenvalue of $S$, but I never claimed that. – David C. Ullrich Jun 30 '19 at 13:29
  • @pmac Seriously - I can't figure out what assertion I made you're objecting to. What's the first thing I actually said in that paragraph that you don't see? That paragraph: "Now assume $0$ is an eigenvalue of $ST$. This says precisely that $ST$ is not invertible. Hence $S$ and $T$ cannot both be invertible, hence (at least in the finite-dimensional case) $TS$ is not invertible, so $0$ is an eigenvalue of $TS$." – David C. Ullrich Jun 30 '19 at 13:38
  • Another way to divide the problem: If $Tx \ne 0$, then $\lambda$ is an eigenvalue. Otherwise, $Tx = 0$, so you know specifically that $T$ is not invertible. – Mike Jan 03 '21 at 04:52
0

A proof for matrices is given in this answer.

Since the two block-triangular factors on the left below both have determinant $\lambda^m$, and the middle factor is the same in both products, the determinants of the two right-hand sides are equal:
$$ \begin{bmatrix}I_n&-A\\0&\lambda I_m\end{bmatrix} \begin{bmatrix}\lambda I_n&A\\B&I_m\end{bmatrix} =\begin{bmatrix}\lambda I_n-AB&0\\\lambda B&\lambda I_m\end{bmatrix} $$
and
$$ \begin{bmatrix}I_n&0\\-B&\lambda I_m\end{bmatrix} \begin{bmatrix}\lambda I_n&A\\B&I_m\end{bmatrix} =\begin{bmatrix}\lambda I_n&A\\0&\lambda I_m-BA\end{bmatrix} $$
This says that
$$ \lambda^m\det(\lambda I_n-AB)=\lambda^n\det(\lambda I_m-BA) $$
Thus, for square matrices ($m=n$), $AB$ and $BA$ have the same characteristic polynomial, so their eigenvalues are identical.
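As a numeric spot check of the last identity (a sketch with numpy; the dimensions and the value of $\lambda$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 5
A = rng.standard_normal((n, m))
B = rng.standard_normal((m, n))
lam = 1.7  # any scalar works

lhs = lam**m * np.linalg.det(lam * np.eye(n) - A @ B)
rhs = lam**n * np.linalg.det(lam * np.eye(m) - B @ A)
print(np.isclose(lhs, rhs))  # True
```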

robjohn