Assume that I have $N$ Gaussian functions with distinct means $u_i$ and width parameters $\beta_i$. How can I prove that the functions $e^{-\beta_i(x-u_i)^2}$, $1\le i\le N$, are linearly independent?
Since orthogonality implies linear independence, check their inner products. – Math.StackExchange Dec 26 '14 at 12:40
@AranKomatsuzaki Aren't the inner products all positive (as all the $f_i$ are positive)? I'd rather suggest having a look at what a linear dependence implies for the derivatives. – Hagen von Eitzen Dec 26 '14 at 12:43
Oh, that's true, and your suggestion sounds really nice! – Math.StackExchange Dec 26 '14 at 12:47
2 Answers
I expect that the linear independence can be proved via asymptotic arguments along the following lines. Suppose that there exist constants $\lambda_1,\dots,\lambda_N$, not all zero, such that $$f(x)=\sum_{i=1}^N \lambda_i e^{-\beta_i(x-u_i)^2}=0.$$ Let $\beta =\min\{\beta_1,\beta_2,\dots,\beta_N\}$ and $$g(x)=\sum_{\beta_i=\beta} \lambda_i e^{-\beta(x-u_i)^2}.$$
Considering the asymptotics of the function $e^{\beta x^2}f(x)=e^{\beta x^2}(g(x)+o(g(x)))$ as $x\to\infty$, we see that $e^{\beta x^2}g(x)\to 0$. Next, $$e^{\beta x^2}g(x)=\sum_{\beta_i=\beta} \lambda_i e^{-\beta u_i^2} e^{2\beta u_i x}.$$ So, again considering the asymptotics of this exponential sum as $x\to\infty$, we see that all the coefficients $\lambda_i e^{-\beta u_i^2}$ are zero.
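In detail: the means $u_i$ occurring in $g$ are pairwise distinct, so let $u^{*}$ be the largest of them and $\lambda^{*}$ its coefficient. Dividing the identity $e^{\beta x^2}f(x)=0$ by $e^{2\beta u^{*}x}$ gives
$$0=\lambda^{*} e^{-\beta (u^{*})^2}+\sum_{\beta_i=\beta,\; u_i<u^{*}} \lambda_i e^{-\beta u_i^2}\, e^{2\beta(u_i-u^{*})x}+\sum_{\beta_i>\beta} \lambda_i\, e^{-(\beta_i-\beta)x^2+(2\beta_i u_i-2\beta u^{*})x-\beta_i u_i^2}.$$
Every term after the first tends to $0$ as $x\to\infty$, so $\lambda^{*}=0$; repeating with the next-largest mean, and then with the next-smallest $\beta$, shows that all the $\lambda_i$ vanish.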
PS. I expect that the linear independence of the family of exponentials $\{e^{2\beta u_i x}\}\equiv\{e^{\mu_1 x}, e^{\mu_2 x},\dots, e^{\mu_k x}\}$, with pairwise distinct $\mu_j$, can also be proved using the Vandermonde determinant
$$\left|\begin{array}{cccc}
e^{\mu_1 \cdot 0} & e^{\mu_2 \cdot 0} & \dots & e^{\mu_k \cdot 0}\\
e^{\mu_1 \cdot 1} & e^{\mu_2 \cdot 1} & \dots & e^{\mu_k \cdot 1}\\
\vdots & \vdots & & \vdots \\
e^{\mu_1 (k-1)} & e^{\mu_2 (k-1)} & \dots & e^{\mu_k (k-1)}\\
\end{array}\right|
=\prod_{1\le i<j\le k} \left(e^{\mu_j}- e^{\mu_i}\right)\ne 0.$$
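Indeed, if $\sum_{j=1}^k c_j e^{\mu_j x}=0$ for all $x$, then evaluating at $x=0,1,\dots,k-1$ yields a homogeneous linear system whose coefficient matrix is exactly the one above. For pairwise distinct real $\mu_j$ the values $e^{\mu_j}$ are pairwise distinct, so the determinant is non-zero and hence $c_1=\dots=c_k=0$.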

@PedroTamaroff I also thought so, but later I realized that there is no need to kill the lower-order terms; it suffices to show that the coefficient of the term with the highest growth (assumed to be non-zero) must be zero. – Alex Ravsky Dec 26 '14 at 13:48
OK, I've got it. Now I want to prove that the multiquadric radial basis functions $\phi_i(x) = \sqrt{1 + (\varepsilon (x-c_i))^2}$, $1\le i \le N$, with centers $c_1< c_2<\dots< c_N$, are linearly independent. How can I do that? – yang Dec 26 '14 at 13:55
@yang Right now I don't see a simple solution, so you may want to create another question for it. – Alex Ravsky Dec 26 '14 at 14:05
Both the Gaussian and the multiquadric functions are (strictly) positive definite (Wendland's "Scattered Data Approximation", pp. 74, 76). If there were a non-trivial linear combination that vanished identically in $x$, it would in particular vanish at any points $x=x_j$, and you could write a quadratic form $\sum_j\sum_k\alpha_j\bar{\alpha}_k\Phi(x_j-x_k)=0$ with a non-zero coefficient vector, which would contradict the strict positive definiteness of $\Phi$.
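As a quick numerical illustration of this route (a minimal sketch; the sample points and the width $\beta$ below are arbitrary choices, not from the answer), one can check that the Gaussian Gram matrix $\Phi(x_j-x_k)$ at distinct points is positive definite, so the quadratic form above vanishes only for $\alpha=0$:

```python
import numpy as np

beta = 1.0                                 # arbitrary illustrative width
x = np.array([-2.0, -0.5, 0.3, 1.7, 4.0])  # distinct sample points

# Gram matrix G[j, k] = Phi(x_j - x_k) = exp(-beta * (x_j - x_k)^2)
G = np.exp(-beta * (x[:, None] - x[None, :]) ** 2)

# Strict positive definiteness: all eigenvalues are strictly positive,
# so alpha @ G @ alpha > 0 for every non-zero alpha.
print(np.linalg.eigvalsh(G))   # smallest eigenvalue should be > 0
np.linalg.cholesky(G)          # raises LinAlgError unless G is positive definite
```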
