When all the vectors are chosen from $R^n$, it is easy to prove that the Gram matrix is invertible if and only if the set of vectors is linearly independent, as in this question:
Gram matrix invertible iff set of vectors linearly independent
I think the trick is that $G$ can be written as $A^TA$, where $A=(\mu_1,\cdots,\mu_n)$ and each $\mu_i$ is a $k$-dimensional column vector. But if the vectors do not come from some $R^k$, how can I prove it?
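As a numerical illustration of the $R^k$ case (my own sketch, not part of the quoted material): with $G=A^TA$, the Gram matrix is invertible exactly when the columns of $A$ are linearly independent, which we can check via the matrix rank.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three (almost surely) linearly independent vectors in R^5,
# stored as the columns of A.
A = rng.standard_normal((5, 3))
G = A.T @ A
print(np.linalg.matrix_rank(G))  # 3: G is invertible

# Force a linear dependence: third column = 2*(first) - (second).
B = A.copy()
B[:, 2] = 2 * B[:, 0] - B[:, 1]
G_dep = B.T @ B
print(np.linalg.matrix_rank(G_dep))  # 2: G is singular
```

This only demonstrates the finite-dimensional case; the question below is about a general inner product space.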
Here is something I found but cannot clearly understand. Let $G_{ij}=(\mu_i,\mu_j)$, where each $\mu_i$ belongs to an inner product space $X$.
$G$ is invertible iff $a=0$ is the only solution to
$$\Big(\sum_{j=1}^{n} a_j \mu_j,\ \mu_k\Big)=\sum_{j=1}^{n}(\mu_j,\mu_k)\,a_j=0,\qquad k=1,\dots,n.$$
$$\sum_{j=1}^{n} a_j \mu_j=0
\iff \Big(\sum_{j=1}^{n} a_j \mu_j,\ \sum_{j=1}^{n} a_j \mu_j\Big)=0
\iff \Big(\sum_{j=1}^{n} a_j \mu_j,\ \mu_k\Big)=\sum_{j=1}^{n}(\mu_j,\mu_k)\,a_j=0 \text{ for all } k.$$
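To make the general-space criterion concrete, here is a sketch of my own (an illustration, not from the quoted proof): take $X$ to be polynomials on $[0,1]$ with $(p,q)=\int_0^1 p(x)q(x)\,dx$. For the monomials $1, x, x^2$ the Gram matrix is the $3\times 3$ Hilbert matrix $G_{ij}=1/(i+j+1)$ (indices from $0$); these functions are linearly independent, so $G$ should be invertible, while appending a dependent function such as $1+x$ should make the enlarged Gram matrix singular.

```python
import numpy as np

n = 3
# Gram matrix of 1, x, x^2 under (p, q) = integral of p*q over [0, 1]:
# (x^i, x^j) = 1 / (i + j + 1), the 3x3 Hilbert matrix.
G = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
print(np.linalg.det(G))  # nonzero -> G invertible

# Append the dependent function f = 1 + x. Its inner products are sums
# of existing entries: (1 + x, x^j) = G[0, j] + G[1, j].
row = G[0] + G[1]
corner = row[0] + row[1]          # (1 + x, 1 + x)
G2 = np.block([[G, row[:, None]],
               [row[None, :], np.array([[corner]])]])
print(np.isclose(np.linalg.det(G2), 0.0))  # True: G2 is singular
```

The last row of `G2` is exactly the sum of its first two rows, matching the criterion that a nonzero $a$ with $\sum_j a_j\mu_j=0$ forces $Ga=0$.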
What does this proof mean, and how does it establish the equivalence? Any help is appreciated.