
I have $n$ lambdas, which are all different real and positive numbers, where: $\lambda_1 < \lambda_2 < \cdots < \lambda_n$.

I then have to show that these functions are linearly independent:

$$e^{\lambda_1 t}, e^{\lambda_2 t}, \ldots, e^{\lambda_n t}$$

So I want to show that the only solution to this equation:

$$c_1 e^{\lambda_1 t} + c_2 e^{\lambda_2 t}+ \ldots +c_n e^{\lambda_n t}=0, \quad \text{for all } t$$

is that all the constants are zero. I am not entirely sure how to do this for $n$ lambdas. I did it earlier for just 3 lambdas by differentiating the function, so I have tried to use the same approach.

I wanted to get $n$ equations in $n$ unknowns, so I differentiated the function $n-1$ times and then evaluated at $t = 0$, which gives:

$$c_1+c_2+ \ldots +c_n = 0$$

$$c_1 \lambda_1+c_2 \lambda_2+ \ldots +c_n \lambda_n = 0$$

$$ \cdots $$

$$c_1 \lambda_1^{n-1}+c_2 \lambda_2^{n-1}+ \ldots +c_n \lambda_n^{n-1} = 0$$

Then I put this into a matrix equation $Ac = 0$, and now I want to show that $\det(A) \neq 0$, so that the only solution is $c = 0$. I'm not quite sure how to do this, or whether there is an easier way to do it?
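The determinant route can be checked concretely. Here is a small sympy sketch; the specific $\lambda$ values are illustrative (any distinct reals work), and the formula being compared against is the classical Vandermonde determinant:

```python
from math import prod

import sympy as sp

# Illustrative distinct rates; any distinct real numbers work.
lams = [1, 2, 3, 5]
n = len(lams)

# Row k holds the t = 0 values of the k-th derivative: (lam_1^k, ..., lam_n^k).
A = sp.Matrix(n, n, lambda k, j: lams[j] ** k)

det = A.det()
# Classical Vandermonde formula: product of (lam_j - lam_i) over i < j.
expected = prod(b - a for i, a in enumerate(lams) for b in lams[i + 1:])
print(det, expected)  # both 48, nonzero exactly because the rates are distinct
```

With symbolic $\lambda_i$ the same determinant factors into $\prod_{i<j}(\lambda_j-\lambda_i)$, which vanishes only if two of the rates coincide.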

some_name
  • The matrix $A$ is the well-known Vandermonde matrix, and its determinant is non-zero since all $\lambda_i$ are distinct. See https://en.wikipedia.org/wiki/Vandermonde_matrix – random123 Sep 25 '15 at 17:48
  • Are you familiar with the "Wronskian"? https://en.m.wikipedia.org/wiki/Wronskian – ClassicStyle Sep 26 '15 at 02:16
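The Wronskian suggested in the comments can be formed directly. A sketch with three symbolic rates (sympy assumed available; for exponentials the Wronskian is the Vandermonde determinant times an exponential factor):

```python
import sympy as sp

t = sp.symbols('t')
l1, l2, l3 = sp.symbols('l1 l2 l3')
fs = [sp.exp(l * t) for l in (l1, l2, l3)]

# Wronskian: row k holds the k-th derivatives of (f1, f2, f3).
W = sp.Matrix(3, 3, lambda k, j: sp.diff(fs[j], t, k))
w = sp.factor(W.det())
print(w)  # exp((l1+l2+l3)*t) times the Vandermonde product of differences
```

Since the exponential factor never vanishes, the Wronskian is identically zero or never zero according to whether two rates coincide.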

4 Answers


Define $f_j \colon \mathbf{R} \to \mathbf{R}$ by $$ f_j(t) = e^{\lambda_j t}.$$ Let $V$ denote the vector space of infinitely-differentiable functions from $\mathbf{R}$ to $\mathbf{R}$. Define $D \colon V \to V$ by $$Df = f'.$$ Note that $$Df_j = \lambda_j f_j$$ for $j = 1, \dots, n$. Thus $f_1, \dots, f_n$ is a list of eigenvectors of $D$ corresponding to distinct eigenvalues. Now use the theorem that a list of eigenvectors corresponding to distinct eigenvalues is linearly independent, completing the proof that $f_1, \dots, f_n$ is a linearly independent list.
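The eigenvector relation $Df_j = \lambda_j f_j$ is easy to verify symbolically. A minimal sympy sketch, with arbitrarily chosen distinct eigenvalues:

```python
import sympy as sp

t = sp.symbols('t')
lams = [sp.Rational(1, 2), 1, 2]  # arbitrary distinct eigenvalues
fs = [sp.exp(l * t) for l in lams]

# Differentiation D sends each f_j to lam_j * f_j, so each f_j is an
# eigenvector of D with eigenvalue lam_j.
for l, f in zip(lams, fs):
    residual = sp.simplify(sp.diff(f, t) - l * f)
    print(residual)  # 0 for every j
```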

Sheldon Axler
  • Oh my god, are you "the" Axler who wrote my Linear Algebra book? I really love your text!! (Sorry if this is a bit inappropriate for this site, but I couldn't help myself!) – Landon Carter Sep 26 '15 at 03:07
  • Many thanks for your comment, Landon, about my book Linear Algebra Done Right. – Sheldon Axler Sep 26 '15 at 05:53
  • In contrast to the OP's question, this proof did not depend on the $\lambda_j$s being ordered or positive. They just had to be distinct. (In fact, the $\lambda_j$s need not even be real, though this proof's notation would need to be adjusted slightly to account for that.) Finally, the theorem to which @SheldonAxler appeals is Theorem 5.6 on p. 79 in the second edition of his book, Linear Algebra Done Right (1997). – wkschwartz Aug 10 '21 at 18:47

Hint: One way to proceed is to show that $\lim_{t \to \infty} |e^{t \lambda_n} / (\sum_{j=1}^{n-1} c_j e^{t \lambda_j})| = \infty$, which holds because $\lambda_n$ is strictly larger than every other $\lambda_j$. If some linear combination of all the $e^{t \lambda_j}$ with non-zero coefficients were $0$, then this ratio would instead be the constant $1/|c_n|$ (divide the relation through by $c_n$), a contradiction.
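A quick numeric illustration of the hint; the rates and coefficients below are made up, and the only point is the growth of the ratio:

```python
import numpy as np

lams = np.array([0.2, 0.5, 1.0])  # lam_n = 1.0 strictly dominates the rest
c = np.array([3.0, -2.0])         # arbitrary nonzero coefficients for j < n

ratios = []
for t in (1.0, 5.0, 10.0, 20.0):
    denom = abs(c @ np.exp(lams[:-1] * t))
    ratios.append(np.exp(lams[-1] * t) / denom)
print(ratios)  # growing without bound as t increases
```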

user2566092

You can also use induction on $n$. If $n=1$ this is trivial, so assume $n>1$ and that the result holds for each integer less than $n$. Define $f(t)=\sum_{j=1}^nc_je^{\lambda_jt}$, and assume $f\equiv0$. Fix some index $i$. Then $$0=f'(t)-\lambda_if(t)=\sum_{j=1}^n(\lambda_j-\lambda_i)c_je^{\lambda_jt}=\sum_{j\neq i}(\lambda_j-\lambda_i)c_je^{\lambda_jt}$$ for all $t\in\mathbb R$. Now this sum has $n-1$ terms, so by the induction hypothesis $(\lambda_j-\lambda_i)c_j=0$ for all $j\neq i$. But $\lambda_j\neq\lambda_i$ if $j\neq i$, so $c_j=0$ for all $j\neq i$. Hence $c_ie^{\lambda_it}=0$, which of course implies $c_i=0$, and the result is proved.

It should be noted that the idea here is the same as Sheldon Axler's, and more or less contains a proof that eigenvectors of distinct eigenvalues are linearly independent.
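The key step, applying $f' - \lambda_i f$ to annihilate the $i$-th term, can be seen in a small sympy sketch (three concrete rates, symbolic coefficients; both are chosen only for illustration):

```python
import sympy as sp

t = sp.symbols('t')
c1, c2, c3 = sp.symbols('c1 c2 c3')

f = c1 * sp.exp(t) + c2 * sp.exp(2 * t) + c3 * sp.exp(3 * t)

# Apply f' - lam_i f with lam_i = 3: the coefficient of exp(3t) becomes
# (3 - 3) * c3 = 0, leaving a combination of only n - 1 exponentials.
g = sp.expand(sp.diff(f, t) - 3 * f)
print(g)  # -2*c1*exp(t) - c2*exp(2*t); the c3 term is gone
```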

Jason

I'm not sure, but this matrix looks like the transpose of the Vandermonde matrix.

fmeyer