
The linear independence of the functions $e^{at}$ has been addressed here multiple times. My favorite answer is by Marc van Leeuwen: Proof of linear independence of $e^{at}$. It uses the fact that the $e^{at}$ are eigenfunctions of the differentiation operator. Now suppose we instead have an exponential function of two variables, say $e^{ax + by^2}$. It seems to me that the linear independence of these functions can be proved with the same technique, except now using the partial derivative operator:

The proof goes by induction as in the linked answer, except that here $e^{a_1x + b_1y^2}, e^{a_2x + b_2y^2}, \dots, e^{a_{n-1}x + b_{n-1}y^2}$ are assumed linearly independent. As in Marc's proof, we then suppose that $e^{a_1x + b_1y^2}, e^{a_2x + b_2y^2}, \dots, e^{a_{n}x + b_{n}y^2}$ are in turn dependent and thus have:

$e^{a_{n}x + b_{n}y^2}= c_1e^{a_1x + b_1y^2} + c_2e^{a_2x + b_2y^2} + \dots + c_{n-1}e^{a_{n-1}x + b_{n-1}y^2}$. Applying the operator $\frac{\partial}{\partial x} - a_nI$ gives $0= c_1(a_1-a_n)e^{a_1x + b_1y^2} + c_2(a_2-a_n)e^{a_2x + b_2y^2} + \dots + c_{n-1}(a_{n-1}-a_n)e^{a_{n-1}x + b_{n-1}y^2}$, which would thus require all $c_k$ to be zero, essentially completing the induction proof (other argumentation as per the linked answer).
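A quick symbolic check of this operator step (a minimal sketch using SymPy with placeholder symbols for the case $n=3$; none of these names come from the linked answer) confirms that $\frac{\partial}{\partial x} - a_nI$ annihilates the last exponential and only rescales the others:

```python
import sympy as sp

x, y = sp.symbols('x y')
a1, a2, a3, b1, b2, b3 = sp.symbols('a1 a2 a3 b1 b2 b3')

# The operator d/dx - a3*I (here n = 3, so a3 plays the role of a_n).
op = lambda g: sp.diff(g, x) - a3 * g

for a, b in [(a1, b1), (a2, b2), (a3, b3)]:
    g = sp.exp(a * x + b * y**2)
    # Each exponential is an eigenfunction: op(g) = (a - a3) * g.
    print(sp.simplify(op(g) / g))   # prints a1 - a3, a2 - a3, 0
```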

My question is: why does the $y^2$ term not seem to have any effect on the linear independence? Where does this stem from?

Edit: assume all $a_k$ and all $b_k$ are distinct.

Lulu
  • You have to take into consideration that the linear independence you present is with respect to the variable $x$, judging from the operator you are using. A two-variable function $f(x,y)$ would need a two-dimensional operator. So your assumption is that $e^{ax+bx^2}$ is linear towards $x$, which is true (it's like considering $bx^2$ a constant) – Pookaros May 30 '19 at 16:12
  • I suspected something like this (assume you meant $by^2$). I guess $\frac{\partial^2}{\partial x \partial y}$ is then what you'd suggest – Lulu May 30 '19 at 16:16
  • Also, despite seeing your point on an intuitive level, I fail to see the logic fully, since I don't see which part of the argumentation in my question fails. After all, $a_1f_1 + a_2f_2 + \dots + a_nf_n = 0$ forcing all $a_i$ to be $0$ gives linear independence, no matter what the $f_i$ are (and no matter how many variables, e.g. the second answer here http://mathhelpforum.com/calculus/220785-linearly-independent-multivariable-function.html) – Lulu May 30 '19 at 16:22
  • Note that your functions are products: $e^{a_jx} e^{b_jy^2}$. If a linear combination $\sum c_j e^{a_jx} e^{b_jy^2}$ is zero then by fixing $y$ it follows that a linear combination of the $e^{a_jx}$ is zero. – Martin R May 30 '19 at 16:28
  • Good point. Am I right in saying that it would then suffice to prove it separately for each variable, i.e. using partial differentiation w.r.t. $x$ and $y$ in turn? – Lulu May 30 '19 at 16:42
  • The flaw in your logic is that you are using the operator, which is used "conditionally" in the proof, to prove something completely different. This operator is a mathematical "trick" to annihilate the final exponential function, leaving only the coefficients we need to prove independence. It works only on $e^{a_{n}x}$. Your misconception rests on the fact that you used this operator, which annihilates the final exponential function and again leaves only the coefficients in the one variable, proving independence only with respect to the variable $x$. – Pookaros May 30 '19 at 16:50
  • Great, thanks!! – Lulu May 30 '19 at 17:10
  • The error in your proof comes from claiming that $\sum_{k=1}^{n-1}c_k(a_k-a_n)e^{a_kx+b_ky^2}\equiv 0$ implies all $c_k=0.$ It doesn't; it implies all $c_k(a_k-a_n)=0.$ Since $a_k=a_n$ is a possibility, we can't deduce $c_k=0$ for such a $k.$ – zhw. May 30 '19 at 21:25
  • That's true, but I implicitly assumed all $a_k$ are different from each other, as this was assumed in the question I linked to. I should have stated that explicitly, though. I have edited accordingly. – Lulu May 31 '19 at 06:16

1 Answer


Your proof essentially repeats the proof that the functions $e^{a_j x}$ are linearly independent. The “$y$-terms” have “no effect” because they occur as (non-zero) factors $e^{b_j y^2}$ which are constant with respect to $x$.

What you observed is this: If $g_1, \ldots, g_n : A \to \Bbb R$ and $h_1, \ldots, h_n : B \to \Bbb R$ are functions such that

  • $g_1, \ldots, g_n$ are linearly independent, and
  • there is a $y_0 \in B$ such that $h_j(y_0) \ne 0$ for all $j$,

then the functions $f_j : A \times B \to \Bbb R$, $f_j(x, y) = g_j(x) h_j(y)$, $j=1, \ldots, n$, are linearly independent.

The proof is straightforward: If $c_1, \ldots, c_n \in \Bbb R$ with $$ \sum_{j=1}^n c_j g_j(x) h_j(y) = 0 \text{ for all } (x, y) \in A \times B $$ then in particular $$ \sum_{j=1}^n \bigl( c_j h_j(y_0) \bigr) g_j(x) = 0 \text{ for all } x \in A \,. $$ Since the $g_j$ are linearly independent it follows that $$ c_j h_j(y_0) = 0\text{ for } j = 1, \ldots, n \implies c_j = 0 \text{ for } j = 1, \ldots, n \,. $$

In your case $g_j(x) = e^{a_j x}$ and $h_j(y) = e^{b_j y^2}$.
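As a numerical sanity check of this (a minimal sketch with made-up values for the $a_j$, $b_j$ and random sample points, none of which appear in the question), one can evaluate the functions $f_j(x,y) = e^{a_j x + b_j y^2}$ at a handful of points and verify that the resulting matrix has full column rank:

```python
import numpy as np

# Hypothetical coefficients; all a_j and all b_j distinct, as assumed in the question.
a = np.array([1.0, 2.0, 3.0])
b = np.array([0.5, -1.0, 2.0])
n = len(a)

rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(10, 2))                        # sample points (x_i, y_i)
M = np.exp(np.outer(pts[:, 0], a) + np.outer(pts[:, 1] ** 2, b))  # M[i, j] = e^{a_j x_i + b_j y_i^2}

# Full column rank: no non-trivial linear combination of the columns vanishes
# at all sample points, consistent with linear independence of the functions.
print(np.linalg.matrix_rank(M) == n)   # expected: True
```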

Martin R
  • Ah, amazing! So my proof was actually "correct" then (which I thought it wasn't, based on Pookaros' comments above). I suppose his/her comment would be valid if $x$ and $y$ had a more complex relationship? – Lulu May 30 '19 at 19:40