6

Let $T: V \rightarrow V$ be a linear operator.

I need to demonstrate that if all nonzero vectors of $V$ are eigenvectors of $T$, then there is one specific $\lambda \in K$ such that $T(v) = \lambda v$, for all $v \in V$.

I understand that if all nonzero vectors of $V$ are eigenvectors of $T$, then $T$ must be a scaling transformation: it just stretches or shrinks vectors without changing their directions.

So the statement says that if this happens, then there is a single $\lambda$ such that $T(v) = \lambda v$. In other words, such a transformation scales all vectors by the same scalar $\lambda$.

Applying the transformation to our standard basis vectors, we have: $$ T(e_1) = \lambda_1 e_1 \\ T(e_2) = \lambda_2 e_2 \\ \vdots \\ T(e_n) = \lambda_n e_n $$

I understand I need to prove that $\lambda_1 = \lambda_2 = \dots = \lambda_n$, but I can't see how!

EDIT

$$ v = c_1e_1 + c_2e_2 + \dots + c_ne_n \\ T(v) = \mu v = \lambda_1c_1e_1 + \lambda_2c_2e_2 + \dots + \lambda_nc_ne_n $$

Since each coordinate $c_i$ of $v$ gets multiplied by $\lambda_i$, all of the $\lambda_i$ must equal $\mu$. I'm not sure how to make this rigorous. Is the idea correct?

EDIT 2 Expanding the left-hand side from the first edit, we have: $$ \mu v = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n \\ \mu(c_1e_1 + \dots + c_ne_n) = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n \\ \mu c_1e_1 + \dots + \mu c_ne_n = \lambda_1c_1e_1 + \dots + \lambda_nc_ne_n $$

And since the $e_i$ are linearly independent, $\mu = \lambda_1 = \lambda_2 = \dots = \lambda_n$. Is this proof correct?
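(A caveat worth noting: comparing coefficients only gives $\mu c_i = \lambda_i c_i$ for each $i$, and this forces $\mu = \lambda_i$ only when $c_i \neq 0$. So the cleanest choice is a $v$ with all coordinates nonzero, say all equal to $1$:) $$\mu c_i = \lambda_i c_i \quad (1 \le i \le n), \qquad \text{so with } c_1 = \dots = c_n = 1, \quad \mu = \lambda_1 = \dots = \lambda_n.$$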

  • Hint: consider $e_1+e_2$. – Chris Eagle May 17 '12 at 18:23
  • I believe I've come to something. If ALL vectors of $V$ are eigenvectors, then $T(v)$ is also an eigenvector, since $T$ maps $V \rightarrow V$. Then it can be written as $\mu v$, where $\mu$ is another eigenvalue. Then $\mu = \lambda_1 = \lambda_2 = \dots = \lambda_n$. Is it correct? – João Daniel May 17 '12 at 18:28
  • Your edit shows that $\mu c_i = \lambda_i c_i$, $\forall i$, since the $e_i$ are linearly independent. – copper.hat May 17 '12 at 18:40
  • But $\mu$ is a scalar multiplying $v$, and the $c_i$ are the coordinates of $v$ in the standard basis, i.e., the $c_ie_i$ are the components of $v$. If $\mu$ multiplies $v$, why does $c_i$ come into the equation for $\mu$? – João Daniel May 17 '12 at 18:44
  • You have now asked THREE different questions in the same post. What happens to the answers to your FIRST question, posted while this was the only one? – Did May 17 '12 at 19:13
  • I'm honestly uncertain whether I should ask a new question or edit this one. But to answer your question: I'm evaluating answers based on my first question. The later edits were responses to the discussion. – João Daniel May 17 '12 at 19:17
  • @JoãoDaniel: Your edit has the correct conclusion. In the basis $e_1,...e_n$, the coordinates of $v$ are $(c_1,...,c_n)$ and the coordinates of $Tv = \mu v$ are $(\mu c_1,...,\mu c_n) = (\lambda_1 c_1,...,\lambda_n c_n)$. It follows that $\mu c_i = \lambda_i c_i$, choosing $c_i=1$ gives the required result. – copper.hat May 17 '12 at 19:31

3 Answers

13

Since all $v\in V$ are eigenvectors, we can choose $e_i$, the $i$th unit vector. Then by assumption we have $T e_i = \lambda_i e_i$ for some $\lambda_i$. It follows that the matrix of $T$ in this basis is diagonal, with entries $\lambda_1,\dots,\lambda_n$ on the diagonal.

Now choose $v=e_1+\dots+e_n$. Again by assumption we have $Tv=\lambda v$ for some $\lambda$, so $$T v = T(e_1+\dots+e_n) = \lambda_1 e_1 +\dots + \lambda_n e_n = \lambda (e_1+\dots+e_n).$$ Since the $e_i$ are linearly independent, it follows that $\lambda = \lambda_1 = \dots = \lambda_n$. Hence $Tx = \lambda x$ for all $x$.

copper.hat
  • When you write $v = e_1 + \dots + e_n$, you're considering it only for unit vectors. Shouldn't you consider $v = c_1e_1 + \dots + c_ne_n$ with $c_i \in K$, in order to extend it to all vectors of $V$? – João Daniel May 17 '12 at 18:51
  • What needs to be extended? If you know $T$ on a basis, you know $T$ everywhere, by linearity. First I show $T$ is diagonal, since $T(\sum x_i e_i) = \sum \lambda_i x_i e_i$, then I show that all diagonal elements are equal. – copper.hat May 17 '12 at 19:02
  • Sorry, I got confused mixing it with my try of demonstration. I'm still trying to assimilate your answer ;). – João Daniel May 17 '12 at 19:11
  • This is a very nice proof. +1 – Michael Joyce May 17 '12 at 19:43
  • @copper.hat $v$ is just the specific vector $v = e_1+\dots+e_n$; shouldn't you show that it works for all $u \in V$? – Avishay28 Sep 25 '17 at 07:46
  • @Avishay28: No, the point of the $e_1 + \cdots + e_n$ trick is to show that $\lambda = \lambda_1 = \cdots = \lambda_n$. Any set of linearly independent vectors would work. – copper.hat Sep 25 '17 at 13:44
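To spell out the linearity step discussed in these comments (a restatement using the notation of the answer, not part of the original): once $Te_i = \lambda e_i$ holds for every $i$, any $x = \sum_i x_i e_i$ satisfies $$Tx = T\Big(\sum_{i=1}^n x_i e_i\Big) = \sum_{i=1}^n x_i\, T(e_i) = \sum_{i=1}^n \lambda x_i e_i = \lambda x,$$ so nothing further needs to be checked for general vectors.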
3

Assume for a contradiction that $v,w$ are eigenvectors for $\lambda\neq\mu$, respectively (in particular they are nonzero), and that $T(v+w)=\nu(v+w)$ for some $\nu\in K$. Since any nonzero scalar multiple of an eigenvector is an eigenvector for the same eigenvalue, $v$ and $w$ cannot be linearly dependent. Then by this linear independence, $\nu v+\nu w=T(v+w)=T(v)+T(w)=\lambda v+\mu w$ implies $(\nu,\nu)=(\lambda,\mu)$, which contradicts $\lambda\neq\mu$.

So if all nonzero vectors are eigenvectors, then all of them must be so for the same eigenvalue$~\lambda$, and one has $T=\lambda I$. (Pedantically, if $\dim V=0$ there are no eigenvectors at all, and one is free to choose$~\lambda$; "specific" in the question is not justified in this case.)
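Displayed in one line (my own restatement of the computation above, using the same symbols), the key step is $$\nu v + \nu w = T(v+w) = T(v) + T(w) = \lambda v + \mu w \;\Longrightarrow\; (\nu-\lambda)v + (\nu-\mu)w = 0,$$ and linear independence of $v,w$ forces $\nu = \lambda$ and $\nu = \mu$, hence $\lambda = \mu$.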

1

Hint: Assume that $Tu=\lambda u$ and $Tv=\mu v$ for some nonzero vectors $u$ and $v$ and some $\lambda$ and $\mu$.

  • Show that $\{u,v\}$ is a linearly independent family.
  • Show that $\{u+v,au+bv\}$ is a linearly independent family, for every $a\ne b$.
  • Show that $\{T(u+v),u+v\}$ is not linearly independent.
  • Conclude that $\lambda=\mu$.
Did
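For completeness, here is one way the hints can be assembled (my own sketch, not part of the original answer): if $\lambda \neq \mu$, the first hint gives that $\{u,v\}$ is linearly independent, so $u+v \neq 0$ and by hypothesis $T(u+v) = \nu(u+v)$ for some $\nu$. But also $$T(u+v) = Tu + Tv = \lambda u + \mu v,$$ so $\{u+v,\, \lambda u + \mu v\}$ is linearly dependent, contradicting the second hint with $a = \lambda$, $b = \mu$. Hence $\lambda = \mu$.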