15

Basically my question is: how do I check for linear independence between functions?

Let $\mathcal{F}(\mathbb{R},\mathbb{R})$ be the vector space of real-valued functions,

i.e. $\mathcal{F}(\mathbb{R},\mathbb{R})=\left\{ f:\mathbb{R}\rightarrow\mathbb{R}\right\} $

Let 3 functions $f_{1},f_{2},f_{3}$ be given such that

$\forall x\in\mathbb{R}\,\,\,f_{1}(x)=e^{x},\,\,f_{2}(x)=e^{2x},\,\,f_{3}(x)=e^{3x}$

Let $W=\operatorname{sp}(f_{1},f_{2},f_{3})$. What is $\dim(W)$?

How should I approach this question (from a linear algebra perspective)?

I know that $W=\left\{ \alpha e^{x}+\beta e^{2x}+\gamma e^{3x}\mid\alpha,\beta,\gamma\in\mathbb{R}\right\} $,

and to get the dimension I need to find a basis of $W$,

so I need to check whether the following holds:

$\forall x\in\mathbb{R}\,\,\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0\,\Leftrightarrow\,\alpha,\beta,\gamma=0$

However, when $x=0$ I get $\alpha+\beta+\gamma=0$, which has infinitely many solutions.

How should I approach this?

7 Answers

14

You need to check if the functions are independent, as you said.

A way to go about this, which ties it in with things you likely know, is to evaluate the equation at several points, as you did for $x=0$.

You get one condition for $x=0$. You get another condition for $x=1$ and still another one for $x=2$.

Each will allow more than one solution, but they'll only have one common solution, which is what you are after.
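The evaluation argument can be checked numerically; here is a quick Python sketch (not part of the original answer; `det3` is just an inline 3×3 determinant helper):

```python
import math

# Evaluate a*e^x + b*e^(2x) + c*e^(3x) = 0 at x = 0, 1, 2 and collect the
# coefficient matrix of the resulting homogeneous linear system.
xs = [0, 1, 2]
M = [[math.exp(k * x) for k in (1, 2, 3)] for x in xs]

def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

d = det3(M)
print(d)  # nonzero, so the only common solution is a = b = c = 0
```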

quid
  • 42,135
  • $$\begin{array}{cc} x=0 & \alpha+\beta+\gamma=0\\ x=1 & \alpha e+\beta e^{2}+\gamma e^{3}=0\\ x=2 & \alpha e^{2}+\beta e^{4}+\gamma e^{6}=0 \end{array}$$

    $\rightarrow\begin{bmatrix}1 & 1 & 1\\ e & e^{2} & e^{3}\\ e^{2} & e^{4} & e^{6} \end{bmatrix}\rightarrow$ I found this matrix; the determinant is not zero, thus there is only one solution, which means that $\alpha,\beta,\gamma=0$ for $x=0,1,2$. How does that help?

    – Pavel Penshin Apr 27 '16 at 19:54
  • 3
    Note that you need $\alpha, \beta, \gamma$ that work for all $x$ at the same time (they must not depend on $x$). You just showed that for $\alpha, \beta, \gamma$ to work for $x=0,1,2$, the only choice is all $0$. So you are done. – quid Apr 27 '16 at 20:23
12

Write $$\alpha e^x + \beta e^{2x} + \gamma e^{3x} = 0.$$ You can go ahead and cancel a positive factor like $e^x$, so: $$\alpha + \beta e^{x} + \gamma e^{2x} = 0.$$ Suppose you have some solution for this with $\alpha$, $\beta$, $\gamma$ not all zero. Then, as you say, $$ \alpha + \beta + \gamma = 0\qquad \qquad (1)$$ because this must be true at $x = 0$; but it must also be true at $x = \ln n$, which gives $$ \alpha + \beta n + \gamma n^2= 0\qquad \qquad (2)$$ for every $n > 1$. It should be clear that this is unsolvable except when they are all zero, but to press the point I'll continue. Substituting in $(1)$ gives $\alpha = -\beta - \gamma$, which we can plug into $(2)$ to get $$ \beta (n-1) + \gamma (n^2 - 1)= 0,$$ which must be true for all $n > 1$. Now put, say, $n = 2$ and $n = 3$ to get the pair of equations: $$ \beta + 3 \gamma = 0 \qquad 2\beta + 8\gamma = 0. $$ This solves for $\beta = \gamma = 0$.

So your functions are proved to be linearly independent.
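As a sanity check on the last step, the final pair of equations can be inspected with exact rational arithmetic (a sketch, not part of the original answer):

```python
from fractions import Fraction

# The system obtained from n = 2 and n = 3:
#   beta + 3*gamma = 0
#   2*beta + 8*gamma = 0
a11, a12 = Fraction(1), Fraction(3)
a21, a22 = Fraction(2), Fraction(8)

det = a11 * a22 - a12 * a21  # determinant of the 2x2 coefficient matrix
print(det)  # 2 -- nonzero, so the homogeneous system forces beta = gamma = 0
```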

amcalde
  • 4,674
7

Hint:

Let $e^x=y$, so $e^{2x}=y^2$ and $e^{3x}=y^3$; you have:

$\alpha y +\beta y^2+ \gamma y^3=0$

where the $0$ on the RHS (right-hand side) is the zero polynomial.

Now: when is a polynomial the zero polynomial?

In general:

The $0$ on the RHS is the neutral element for addition of functions in the vector space, not simply the number $0$; this means it is the function $f(x)=0\quad \forall x \in \mathbb{R}$.
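To see the hint in action, here is a small sketch (the coefficients are hypothetical, chosen so the polynomial vanishes at $y=1$, mirroring the observation at $x=0$):

```python
# p(y) = a*y + b*y**2 + c*y**3 with (a, b, c) != (0, 0, 0) has at most
# three roots, so it cannot vanish at every y = e^x > 0.
def p(a, b, c, y):
    return a * y + b * y ** 2 + c * y ** 3

a, b, c = 1, -2, 1                              # a + b + c = 0, so p(1) == 0
values = [p(a, b, c, y) for y in (1, 2, 3, 4)]
print(values)  # [0, 2, 12, 36]: zero at y = 1, nonzero elsewhere
```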

Emilio Novati
  • 62,675
  • $$\begin{array}{cc} x=0 & \alpha+\beta+\gamma=0\\ x=1 & \alpha e+\beta e^{2}+\gamma e^{3}=0\\ x=2 & \alpha e^{2}+\beta e^{4}+\gamma e^{6}=0 \end{array}$$

    $\rightarrow\begin{bmatrix}1 & 1 & 1\\ e & e^{2} & e^{3}\\ e^{2} & e^{4} & e^{6} \end{bmatrix}\rightarrow$ I found this matrix; the determinant is not zero, thus there is only one solution, which means that $\alpha,\beta,\gamma=0$ for $x=0,1,2$. How does that help? What is RHS? Googling didn't help :(

    – Pavel Penshin Apr 27 '16 at 19:57
  • The key fact is that in $\alpha f_1+\beta f_2+\gamma f_3=0$ The $0$ is the zero function i.e. a function that is null for all values of $x$ in the domain. Your linear system shows that you can find values for $\alpha, \beta, \gamma$ such that $\alpha f_1+\beta f_2+\gamma f_3=0$ is true for some value of $x$ but not for all the possible values. – Emilio Novati Apr 27 '16 at 20:06
5

You have to prove $$ \forall x\in\mathbb{R}:\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0\Leftrightarrow\alpha,\beta,\gamma=0, $$

but I think the quantifier applies only to the part on the left side of the $\Leftrightarrow$, like this: $$ \left(\forall x\in\mathbb{R}:\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0\right) \Leftrightarrow\,\alpha,\beta,\gamma=0. $$

So for example $\alpha = -1, \beta = 1, \gamma = 0$ satisfies $\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0$ when $x=0$, but it doesn't satisfy the equation for all values of $x$.

If you had to prove $$ \forall x\in\mathbb{R}:\left(\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0 \Leftrightarrow\,\alpha,\beta,\gamma=0\right) $$ then you would be in trouble, because that statement is not true; but that's not how we prove independence of the functions, so you don't need to worry about that.
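The counterexample from the answer can be verified directly (a quick sketch; the sample points are arbitrary):

```python
import math

def comb(a, b, c, x):
    # a*e^x + b*e^(2x) + c*e^(3x)
    return a * math.exp(x) + b * math.exp(2 * x) + c * math.exp(3 * x)

# alpha = -1, beta = 1, gamma = 0 kills the combination at x = 0 ...
print(comb(-1, 1, 0, 0))  # 0.0
# ... but not at x = 1, so it does not satisfy the for-all-x condition.
print(comb(-1, 1, 0, 1))  # e^2 - e, about 4.67
```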

David K
  • 98,388
5

Hint: Use the Wronskian and show that the Wronskian determinant does not vanish.
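For reference, the Wronskian matrix has rows $(f_i, f_i', f_i'')$; since $(e^{kx})'=ke^{kx}$, at $x=0$ it reduces to a Vandermonde matrix in the exponents $1,2,3$. A numeric sketch (not part of the original hint):

```python
# Wronskian matrix of e^x, e^(2x), e^(3x) evaluated at x = 0:
W0 = [[1, 1, 1],   # f1(0),   f2(0),   f3(0)
      [1, 2, 3],   # f1'(0),  f2'(0),  f3'(0)
      [1, 4, 9]]   # f1''(0), f2''(0), f3''(0)

def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(W0))  # 2 -- the Wronskian does not vanish at x = 0
```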

MrYouMath
  • 15,833
5

You need to show the three vectors are linearly independent. In this case I would use this trick, so that you don't need to worry about them being functions or about the equality holding for every value of $x$.

Consider $D: \mathcal{F} \rightarrow \mathcal{F}$, the derivative operator, which is an endomorphism of $\mathcal F$ (i.e. a linear map from $\mathcal{F}$ to itself). The derivatives of the three functions are $$ Df_1=De^x=e^x=f_1 $$ $$ Df_2=De^{2x}=2e^{2x}=2f_2 $$ $$ Df_3=De^{3x}=3e^{3x}=3f_3 $$ So $f_1,f_2,f_3$ are eigenvectors of $D$, with eigenvalues $\lambda_1=1, \lambda_2=2, \lambda_3=3$, respectively. Since $f_1, f_2, f_3$ are eigenvectors of the same endomorphism $D$ with distinct eigenvalues, they are linearly independent, so they form a basis for $W$ and $\text{dim}W=3$.
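The eigenvector relations $Df_k = k f_k$ can be spot-checked numerically with a finite difference (a sketch; the sample point $x=0.7$ and step $h$ are arbitrary choices, not part of the original answer):

```python
import math

def diff_error(k, x=0.7, h=1e-6):
    # |(D f_k)(x) - k * f_k(x)| with D approximated by a central difference
    f = lambda t: math.exp(k * t)
    deriv = (f(x + h) - f(x - h)) / (2 * h)
    return abs(deriv - k * f(x))

for k in (1, 2, 3):
    print(k, diff_error(k))  # each error is tiny: f_k is an eigenvector of D
```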

Rnhmjoj
  • 567
4

If you have $$ \alpha e^{x} + \beta e^{2x} + \gamma e^{3x} \equiv 0, $$ Then you can apply the derivative operator $D$ to obtain \begin{align} 0 & \equiv (D-2)(D-3)\{\alpha e^{x} + \beta e^{2x} + \gamma e^{3x}\} \\ & = (1-2)(1-3)\alpha e^{x}. \end{align} Therefore $\alpha=0$. Then you can apply $(D-1)(D-3)$ in order to conclude that $\beta=0$. Similarly $\gamma =0$. So $\{ e^x,e^{2x},e^{3x} \}$ is a linearly independent set of functions, which means that the dimension of $W$ is $3$.
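Since $D^n e^{kx} = k^n e^{kx}$, applying $(D-2)(D-3)$ scales each exponential term by $(k-2)(k-3)$, which the following sketch checks numerically (the coefficients and sample point are arbitrary, not from the original answer):

```python
import math

def annihilated(a, b, c, x):
    # (D-2)(D-3) applied to a*e^x + b*e^(2x) + c*e^(3x): each term e^(kx)
    # is scaled by (k-2)*(k-3), which kills the k = 2 and k = 3 terms.
    return sum(co * (k - 2) * (k - 3) * math.exp(k * x)
               for k, co in ((1, a), (2, b), (3, c)))

a, b, c, x = 5.0, -7.0, 11.0, 0.3
print(annihilated(a, b, c, x))  # equals 2*a*e^x: only the alpha term survives
print(2 * a * math.exp(x))
```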

Disintegrating By Parts
  • 87,459
  • Thanks for your answer. I did not quite understand your notation for the derivative operator. Also, can this answer be obtained in another way (without the Wronskian as well)? – Pavel Penshin Apr 27 '16 at 21:16
  • @user313448 Have you studied differential equations where they use the annihilator method? That's what I'm using. If not, you can do this with limits. Multiply by $e^{-3x}$ and let $x\rightarrow\infty$ in order to obtain $\gamma =0$. Then you can isolate $\beta=0$ and, finally, you isolate $\alpha=0$ with no limits. – Disintegrating By Parts Apr 27 '16 at 21:34