
Let $\varphi,\psi$ be endomorphisms of a finite-dimensional vector space $V$ over a field $K$ such that $\varphi\circ\psi=\psi\circ\varphi$. Assume $V$ has a basis of eigenvectors of $\psi$ and a basis of eigenvectors of $\varphi$. Prove that $V$ has a basis consisting of vectors that are eigenvectors of both $\varphi$ and $\psi$.

My work so far:

Let $\lambda_1,\ldots,\lambda_n$ be the distinct eigenvalues of $\varphi$. For each $\lambda_i$, let $c_{i,1},\ldots,c_{i,m_i}$ be a basis of the eigenspace of $\varphi$ corresponding to $\lambda_i$. By assumption, $\varphi$ is diagonalizable, so we can write $$ V\cong V_1\oplus\cdots\oplus V_n $$ where each $V_i=\text{eig}(\varphi,\lambda_i)$. Using the fact that $\varphi$ and $\psi$ commute, it is easy to show that for all $v_i\in V_i$ we have $\varphi(\psi(v_i))=\lambda_i\psi(v_i)$, so $\psi(v_i)\in V_i$. Apparently, I need this fact in order to show that since $\psi$ is also diagonalizable by assumption, we can find a basis $\{d_{i,1},\ldots,d_{i,m_i}\}$ for each $V_i$ consisting of eigenvectors of $\psi$. From here it follows that each $d_{i,j}$ is an eigenvector of $\varphi$, since $\varphi(d_{i,j})=\lambda_id_{i,j}$ ($\varphi$ acts as multiplication by $\lambda_i$ on $V_i$).

My question: I know that $\psi$ being diagonalizable implies I can decompose $V$ into eigenspaces corresponding to $\psi$, but how do I know I can find a basis of eigenvectors of $\psi$ for each $V_i$? Like I said before, I believe this comes from the fact that $\psi(V_i)\subset V_i$, but I am not sure. Any ideas?
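As a sanity check (not a proof), the invariance claim $\psi(V_i)\subseteq V_i$ can be verified numerically. Below is a minimal NumPy sketch with hypothetical commuting matrices `phi` and `psi`, built to share an orthonormal eigenvector matrix `Q` so that they commute by construction:

```python
import numpy as np

# Hypothetical example: phi and psi share an orthonormal eigenvector
# matrix Q, so they are both diagonalizable and commute by construction.
Q, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(4, 4)))
phi = Q @ np.diag([2.0, 2.0, 3.0, 5.0]) @ Q.T   # lambda = 2 has multiplicity 2
psi = Q @ np.diag([1.0, 4.0, 4.0, 7.0]) @ Q.T

assert np.allclose(phi @ psi, psi @ phi)        # phi and psi commute

# V_1 = eig(phi, 2) is spanned by the first two columns of Q.
V1 = Q[:, :2]

# Invariance check: for v in V_1, psi(v) lands back in V_1,
# i.e. phi(psi(v)) = 2 * psi(v).
for v in V1.T:
    w = psi @ v
    assert np.allclose(phi @ w, 2.0 * w)
```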

morrowmh

2 Answers


Let's suppose that $\phi,\psi$ commute and prove that there exists a common basis of eigenvectors.

Let $\lambda_{1},\cdots,\lambda_{k}$ be the distinct eigenvalues of $\phi$ and $E(\lambda_{i},\phi)$ the corresponding eigenspaces. Since $\phi \circ \psi = \psi \circ \phi$, we get that $\psi(E(\lambda_{i},\phi)) \subseteq E(\lambda_{i},\phi)$ for $i = 1,\cdots,k$. In fact, for all $x \in E(\lambda_{i},\phi)$ we have

$$\phi(\psi(x)) = \psi(\phi(x)) = \psi(\lambda_{i}x) = \lambda_{i}\psi(x)$$

So $\psi(x) \in E(\lambda_{i},\phi)$.

Since $\phi$ is diagonalizable, we have

$$V = E(\lambda_{1},\phi) \oplus \cdots \oplus E(\lambda_{k},\phi)$$

Let $w$ be an eigenvector of $\psi$ with eigenvalue $\mu$. Thanks to the decomposition we can write $w$ in a unique way as $w = x_{1}+\cdots +x_{k}$ with $x_{i} \in E(\lambda_{i},\phi)$, so

$$\mu w = \mu x_{1}+\cdots +\mu x_{k}$$

But $\mu w = \psi(w)$ so we have also

$$\psi(w) = \psi(x_{1})+\cdots +\psi(x_{k})$$

with $\psi(x_{i}) \in E(\lambda_{i},\phi)$ thanks to the invariance discussed above. Since the decomposition of the space into eigenspaces is unique (the sum is direct), it must follow that

$$\psi(x_{1}) = \mu x_{1},\ \ldots,\ \psi(x_{k}) = \mu x_{k}$$

Moreover, the $x_{i}$ are not all equal to the zero vector (since $w$ is an eigenvector). Each nonzero $x_{i}$ is a common eigenvector of $\phi$ and $\psi$, since $x_{i} \in E(\lambda_{i},\phi)$ and $\psi(x_{i}) = \mu x_{i}$.

If you apply this argument to each vector of a basis $w_{1},\cdots,w_{n}$ of eigenvectors of $\psi$ given by hypothesis, you get a set of common eigenvectors of $\phi$ and $\psi$ that spans $V$, because each $w_{i}$ is a linear combination of them; from this spanning set you can simply extract a basis.
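To illustrate the component argument numerically, here is a small NumPy sketch (the matrices are hypothetical examples built to commute by sharing an eigenvector matrix `Q`): an eigenvector $w$ of $\psi$ that is not an eigenvector of $\phi$ splits into components $x_i$ along the $\phi$-eigenspaces, and each nonzero component is a common eigenvector.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
phi = Q @ np.diag([2.0, 5.0, 7.0]) @ Q.T   # distinct eigenvalues
psi = Q @ np.diag([3.0, 3.0, 4.0]) @ Q.T   # mu = 3 has a 2-dim eigenspace
assert np.allclose(phi @ psi, psi @ phi)

# w is an eigenvector of psi (mu = 3) but NOT an eigenvector of phi.
w = Q[:, 0] + Q[:, 1]
assert np.allclose(psi @ w, 3.0 * w)

# Decompose w along the phi-eigenspaces (orthogonal projections onto
# the columns of Q, which are orthonormal).
x1 = Q[:, 0] * (Q[:, 0] @ w)   # component in E(2, phi)
x2 = Q[:, 1] * (Q[:, 1] @ w)   # component in E(5, phi)
assert np.allclose(x1 + x2, w)

# Each component is a common eigenvector: psi(x_i) = mu * x_i
# and phi(x_i) = lambda_i * x_i.
assert np.allclose(psi @ x1, 3.0 * x1) and np.allclose(phi @ x1, 2.0 * x1)
assert np.allclose(psi @ x2, 3.0 * x2) and np.allclose(phi @ x2, 5.0 * x2)
```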

jacopoburelli
0


This is actually really simple; my friend and I just figured it out. Since $\psi$ is diagonalizable and each $V_i$ is $\psi$-invariant, it is a known theorem that the restriction of $\psi$ to each $V_i$ is also diagonalizable. The result follows. For the proof of this theorem, see here.
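This construction can be sketched numerically. The following is a minimal NumPy example with hypothetical symmetric commuting matrices (using `eigh` since they are symmetric): diagonalize $\varphi$, restrict $\psi$ to each eigenspace $V_i$, diagonalize the restriction, and collect the resulting vectors into a common eigenbasis.

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
phi = Q @ np.diag([2.0, 2.0, 5.0, 5.0]) @ Q.T
psi = Q @ np.diag([1.0, 3.0, 3.0, 6.0]) @ Q.T
assert np.allclose(phi @ psi, psi @ phi)

common_basis = []
evals, evecs = np.linalg.eigh(phi)
for lam in np.unique(np.round(evals, 8)):
    B = evecs[:, np.abs(evals - lam) < 1e-8]   # orthonormal basis of V_i
    M = B.T @ psi @ B                          # psi restricted to V_i
    _, C = np.linalg.eigh(M)                   # diagonalize the restriction
    common_basis.extend((B @ C).T)             # push eigenvectors back to V

# Every vector in the assembled basis is an eigenvector of BOTH maps
# (the Rayleigh quotient v @ A @ v recovers the eigenvalue, |v| = 1).
for v in common_basis:
    assert np.allclose(phi @ v, (v @ phi @ v) * v)
    assert np.allclose(psi @ v, (v @ psi @ v) * v)
```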

morrowmh