2

Okay, so I know that a basis consists of linearly independent vectors. But when you're talking about multiple bases for multiple eigenspaces of a matrix, I want to know whether the vectors from all of these bases, taken together, are linearly independent. For example, if a basis for an eigenspace of a matrix A is B = {v1, v2}, and a basis for another eigenspace of A is B' = {w1}, are v1, v2, and w1 necessarily linearly independent?

I'm trying to understand a few concepts regarding eigenspaces in my book, and it seems like if this is true, a few things I'm confused about would make sense. So I'm just wondering whether this is, in fact, the case.

dagny
  • 641
  • Yes, they are. Can you see why eigenvectors with different eigenvalues cannot be scalar multiples of each other? Try to build on this. – Qudit Mar 02 '17 at 06:51
  • But what if a basis for one eigenspace is {v}, and a basis for another eigenspace is {w}? Couldn't a basis for a third eigenspace be {v+w}? – dagny Mar 02 '17 at 06:58
  • 1
    Write down a linear combination and apply $A$ to it. What happens to the coefficients? – Hagen von Eitzen Mar 02 '17 at 06:59
  • I don't know? I mean...wouldn't something different happen for every linear combination and every A? Can you tell me what conclusion I'm supposed to draw? – dagny Mar 02 '17 at 07:09
  • The most convenient formulation of this result is that any collection of eigenspaces (of the same linear operator) for distinct eigenvalues always form a direct sum of subspaces. It then follows that taking the union of any choice of bases in each of the subspaces gives a basis of their sum, therefore a linearly independent set. – Marc van Leeuwen Mar 02 '17 at 07:40
  • Well, I thought I would accept one of the answers since I figured out the question, and they all helped me understand...but I guess I'll unaccept it? – dagny Mar 02 '17 at 08:15

2 Answers

2

Yes, they are linearly independent.

Let $B_1,\ldots,B_n$, with $B_k=\{v_{k1},\ldots,v_{k|B_k|}\}$, be bases of distinct eigenspaces $V_1,\ldots,V_n$ of $A$, and let $\lambda_1,\ldots,\lambda_n$ be the corresponding eigenvalues. Obviously $\lambda_i=\lambda_k\iff i=k$, or else the eigenspaces would not be distinct.

Now assume the vectors in $B_1\cup\dots\cup B_n$ were linearly dependent. Then you'd have numbers $\alpha_{kl}$, not all zero, such that $$\sum_{k=1}^n\sum_{l=1}^{|B_k|}\alpha_{kl}v_{kl}=0\,. \tag{1}$$ By simply leaving out any eigenspaces all of whose coefficients are zero, we may assume that for each $k$ at least one $\alpha_{kl}$ is nonzero. To simplify notation, we define $$w_k = \sum_{l=1}^{|B_k|}\alpha_{kl}v_{kl}\,.\tag{2}$$ With this, equation $(1)$ simplifies to $$\sum_{k=1}^n w_k = 0\,.$$ Obviously $w_k\in V_k$. Applying $A$, we therefore get $$A\sum_{k=1}^n w_k = \sum_{k=1}^n \lambda_k w_k=0\,. \tag{3}$$

Now assume $n\ge2$ (the case of a single eigenspace is handled at the end) and consider two cases for $\lambda_n$. The first case is $\lambda_n=0$. Then the term $k=n$ in equation $(3)$ vanishes and can therefore be omitted, while for $k<n$ each coefficient $\alpha_{kl}$ is merely multiplied by $\lambda_k$, which is nonzero since $\lambda_k\ne\lambda_n=0$. That is, in this case $B_1\cup\dots\cup B_{n-1}$ is already linearly dependent.

Now consider $\lambda_n\ne 0$. Then we can divide $(3)$ by $\lambda_n$ and subtract the relation $\sum_{k=1}^n w_k=0$ to obtain $$\sum_{k=1}^n\left(\frac{\lambda_k}{\lambda_n}-1\right)w_k = 0\,.$$ Here the coefficient of $w_n$ vanishes, while for $k<n$ each $\alpha_{kl}$ is multiplied by the factor $\frac{\lambda_k}{\lambda_n}-1$, which is nonzero because $\lambda_k\ne\lambda_n$. So again $B_1\cup\dots\cup B_{n-1}$ is linearly dependent.
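
To see the elimination at work in the smallest case (a worked instance of the step above, nothing new assumed): take $n=2$ and $\lambda_2\ne0$. The displayed equation becomes $$\left(\frac{\lambda_1}{\lambda_2}-1\right)w_1 = 0\,,$$ so $w_1=0$ since $\lambda_1\ne\lambda_2$, and then $w_1+w_2=0$ forces $w_2=0$ as well; each $w_k=0$ contradicts the linear independence of the basis $B_k$, because some $\alpha_{kl}$ in each block is nonzero.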

So we have shown: if the basis vectors of a set of at least two eigenspaces are linearly dependent, then we can remove one of the eigenspaces and still have a linearly dependent set, with every remaining eigenspace still contributing a nonzero coefficient. Repeating this step, we eventually arrive at a single eigenspace whose basis would have to be linearly dependent. But a basis cannot be linearly dependent. Therefore our original assumption must have been wrong, and $B_1\cup\dots\cup B_n$ is indeed linearly independent.
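
For intuition, here is a quick numerical sanity check of the statement (not part of the proof; a sketch using numpy, with a made-up $3\times3$ matrix whose eigenvalue $2$ has a two-dimensional eigenspace and whose eigenvalue $5$ has a one-dimensional one):

    import numpy as np

    # Made-up example: A = P D P^{-1}, so the columns of P are
    # eigenvectors of A, with the eigenvalues on the diagonal of D.
    P = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])  # invertible (det = 2)
    D = np.diag([2.0, 2.0, 5.0])
    A = P @ D @ np.linalg.inv(P)

    # B = {v1, v2} is a basis of the eigenspace for eigenvalue 2,
    # B' = {w1} is a basis of the eigenspace for eigenvalue 5.
    v1, v2, w1 = P[:, 0], P[:, 1], P[:, 2]
    assert np.allclose(A @ v1, 2 * v1)
    assert np.allclose(A @ v2, 2 * v2)
    assert np.allclose(A @ w1, 5 * w1)

    # Stack the three vectors as columns; full column rank (3) means
    # v1, v2, w1 taken together are linearly independent.
    print(np.linalg.matrix_rank(np.column_stack([v1, v2, w1])))  # prints 3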

celtschk
  • 43,384
1

Hint: If $B$ and $B'$ are bases for subspaces $W$ and $W'$ respectively, then $B \cup B'$ is a linearly independent set if and only if $W \cap W' = \{0\}$.


Edit: to be clear, you should prove the above claim first.

But once we know the claim is true, it remains to check that two eigenspaces [for different eigenvalues] have intersection $\{0\}$.

If this were not true, then there would exist a vector $v \ne 0$ that is both a $\lambda$-eigenvector and a $\lambda'$-eigenvector where $\lambda \ne \lambda'$. Is this possible?
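
(Spelling out the computation behind the hint, in case it helps: if $Av=\lambda v$ and $Av=\lambda' v$, then $(\lambda-\lambda')v = Av - Av = 0$, and since $\lambda\ne\lambda'$ this forces $v=0$, contradicting $v\ne0$. So the intersection is indeed $\{0\}$.)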


As pointed out in the comments, this argument only handles two eigenspaces at a time (and an induction on the number of eigenspaces would require more than the two-eigenspace case alone). The linked duplicate question provides more detail. Apologies for my mistake.

angryavian
  • 89,882
  • I think you're implying that eigenspaces can't intersect anywhere other than zero? So my new question is why is that the case? – dagny Mar 02 '17 at 07:12
  • 1
    @dagny: If a vector $v\ne0$ is in two different eigenspaces, what would that imply for $Av$? – celtschk Mar 02 '17 at 07:20
  • 1
    The argument given here (checking linear independence by computing the intersection of the subspaces) only applies to two eigenspaces at a time. It cannot therefore be used to prove that any number of eigenspaces for distinct eigenvalues form a direct sum. – Marc van Leeuwen Mar 02 '17 at 07:42
  • In fact the accepted answer to the question I just marked this one as duplicate of precisely addresses the point of my previous comment. – Marc van Leeuwen Mar 02 '17 at 07:50
  • @MarcvanLeeuwen Thanks for pointing this out. I will ask the OP to un-accept my answer. – angryavian Mar 02 '17 at 07:51
  • @dagny Please un-accept my answer, which is incorrect. – angryavian Mar 02 '17 at 07:51
  • @angryavian Why not fix it instead? Your answer could still be the base case of an inductive proof. – Qudit Mar 02 '17 at 08:10