
When all the vectors are chosen from $\mathbb R^n$, it is straightforward to prove that the Gram matrix is invertible if and only if the set of vectors is linearly independent, as shown here:

Gram matrix invertible iff set of vectors linearly independent

I think the trick is that $G$ can be written as $A^TA$, where $A=(\mu_1,\cdots,\mu_n)$ and each $\mu_i$ is a $k$-dimensional column vector. But if the vectors do not belong to $\mathbb R^n$, how can I prove it?

Here is something I found but cannot clearly understand: $G_{ij}=(\mu_i,\mu_j)$, where each $\mu_i$ belongs to an inner product space $X$.

$G$ is invertible iff $a=0$ is the only solution of $Ga=0$, i.e. of

$$\sum_{j=1}^{n}(\mu_j,\mu_k)a_j=\Big(\sum_{j=1}^{n} a_j \mu_j,\mu_k\Big)=0\quad\text{for } k=1,\dots,n.$$

The chain of equivalences is

$$\sum_{j=1}^{n} a_j \mu_j=0$$

$$\iff \Big(\sum_{j=1}^{n} a_j \mu_j,\sum_{j=1}^{n} a_j \mu_j\Big)=0$$

$$\iff \Big(\sum_{j=1}^{n} a_j \mu_j,\mu_k\Big)=\sum_{j=1}^{n}(\mu_j,\mu_k)a_j=0\quad\text{for all } k.$$
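To see the statement at work in an inner product space that is not $\mathbb R^n$ with the dot product, here is a small numerical check of my own (not part of the linked question): polynomials on $[0,1]$ with $\langle f,g\rangle=\int_0^1 f(x)g(x)\,dx$, where $\langle x^i,x^j\rangle = 1/(i+j+1)$.

```python
import numpy as np

# Inner product space: polynomials on [0, 1] with <f, g> = integral of f*g.
# Represent a polynomial by its coefficient vector in the monomial basis;
# then <p, q> = sum_{i,j} p_i q_j / (i + j + 1).
def inner(p, q):
    return sum(pi * qj / (i + j + 1)
               for i, pi in enumerate(p) for j, qj in enumerate(q))

def gram(vectors):
    return np.array([[inner(p, q) for q in vectors] for p in vectors])

indep = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # 1, x, x^2: linearly independent
dep   = [[1, 0, 0], [0, 1, 0], [1, 1, 0]]   # 1, x, 1 + x: linearly dependent

print(np.linalg.matrix_rank(gram(indep)))   # 3 -> Gram matrix invertible
print(np.linalg.matrix_rank(gram(dep)))     # 2 -> Gram matrix singular
```

The independent set yields the $3\times 3$ Hilbert matrix (full rank), while the dependent set produces a Gram matrix whose third row is the sum of the first two.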

What does this proof mean, and how does it answer my question? Any help is appreciated.

  • Any finite dimensional vector space over $\mathbb R$ is isomorphic to $\mathbb R^\ell$ for some $\ell\in\mathbb N$. Use this to identify $G$ with an appropriate matrix in $\mathbb R^{k\times k}$. – L. t. Nov 30 '21 at 12:34
  • Sorry, I cannot follow. You mean we can always find $A$ such that $G=A^TA$? – Andrew Ren Nov 30 '21 at 12:37
  • We only have to look at $Y=\mathrm{span}\{\mu_1,\dots,\mu_n\}$, which is a finite dimensional subspace of $(X,\langle\cdot,\cdot\rangle_X)$. Define an isomorphism $\phi:Y\to\mathbb R^n$ by setting $\phi(\mu_i) = e_i$, where $e_i=(0,0,...,1,0,...,0)$ is the $i$-th unit vector, and extending linearly. Then you can define an inner product $\langle a,b\rangle_{\mathbb R^n} := \langle\phi^{-1}(a),\phi^{-1}(b)\rangle_X$ on $\mathbb R^n$. Now write it as $\langle a,b\rangle_{\mathbb R^n} = a^TBb$ and apply the answer of this question https://math.stackexchange.com/a/3192399/750710 to obtain an inner product space isomorphism. – L. t. Nov 30 '21 at 12:51
  • What I mean with the above is that you can carry out the proof you linked in $\mathbb R^n$ by identifying $\mu_i$ with $\psi(\mu_i)$ where $\psi$ is the inner product space isomorphism and then use the properties of $\psi$ to conclude that it also holds for $X$. – L. t. Nov 30 '21 at 13:18
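The reduction suggested in the comments can be checked numerically. The sketch below is my own illustration: any inner product on $\mathbb R^n$ has the form $\langle a,b\rangle = a^TBb$ for a symmetric positive definite $B$, and a Cholesky factorization $B=C^TC$ turns the abstract Gram matrix into the familiar $A^TA$ form.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary SPD matrix B defines an inner product <a, b> = a^T B b on R^3.
M = rng.standard_normal((3, 3))
B = M @ M.T + 3 * np.eye(3)          # symmetric positive definite by construction

# Vectors mu_1, mu_2, mu_3 as the columns of V; their Gram matrix under <.,.>_B:
V = rng.standard_normal((3, 3))
G = V.T @ B @ V

# Cholesky: numpy returns lower-triangular L with B = L L^T, so take C = L^T,
# giving B = C^T C and hence G = (C V)^T (C V) = A^T A.
C = np.linalg.cholesky(B).T
A = C @ V
print(np.allclose(G, A.T @ A))       # True
```

So the $\mathbb R^n$ argument ($G=A^TA$) applies verbatim once the inner product is rewritten through a square root of $B$.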

1 Answer


I think this question is quite elementary in linear algebra, and there are many viewpoints from which to understand it. The "only if" part is trivial, so I only show the "if" part.


The first, and to me the most standard, way is the SVD. Since the columns of $A$ are independent, $A$ is an $m\times n$ matrix with $m\ge n$. Write $A=U\left(\begin{array}{c}\Sigma\\0\end{array}\right)V^T$ as the SVD of $A$, where $\Sigma=\text{diag}(\sigma_1, \ldots, \sigma_n)$ with $\sigma_i\neq 0$ for all $i$, and $U$ and $V$ are orthogonal matrices. Then $$A^T A = V\Sigma^2V^T$$ is clearly nonsingular, since $V$ is orthogonal and $\Sigma^2$ has nonzero diagonal entries.
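This factorization is easy to verify numerically; here is a small sketch of my own using numpy's thin SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))   # m = 5 >= n = 3; columns independent a.s.

# Thin SVD: A = U diag(s) Vt, with every s_i > 0 when the columns are independent.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.all(s > 1e-10)

# A^T A = V diag(s^2) V^T, so det(A^T A) = prod(s_i^2) != 0.
lhs = A.T @ A
rhs = Vt.T @ np.diag(s**2) @ Vt
print(np.allclose(lhs, rhs))                           # True
print(np.isclose(np.linalg.det(lhs), np.prod(s**2)))   # True
```

Because $\det(A^TA)=\prod_i\sigma_i^2>0$, invertibility of $A^TA$ is immediate from the factorization.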


Second, we can think in terms of linear mappings. Let $V=\mathrm{span}(v_1, \ldots, v_n)$ be any subspace of $\mathbb{R}^m$ with basis $v_1,\ldots, v_n$. We need to show that if $x\in V$ and $x\neq 0$, then there exists $v_i$ such that $x\cdot v_i \neq 0$. Indeed, write $$x = c_1 v_1 + \cdots + c_n v_n;$$ if $x\cdot v_i = 0$ for all $i$, then $x\cdot x = \sum_{i} c_i\, (x\cdot v_i) = 0$, contradicting $x\neq 0$.

Now let $A=(a_1, \cdots, a_n)$ and define $f:\mathrm{Range}(A) \rightarrow \mathbb{R}^n$ by $f(x)=A^Tx$, whose $i$-th entry is $a_i\cdot x$. If $x\in \mathrm{Range}(A)=\mathrm{span}(a_1,\ldots,a_n)$ and $A^Tx=0$, then $x$ is orthogonal to every $a_i$, so $x=0$ by the previous paragraph; hence $f$ is injective. Now let $g:\mathbb{R}^n\rightarrow \mathrm{Range}(A)$, $g(x) = Ax$; since $a_1,\ldots, a_n$ are independent, $g$ is injective. Therefore the composite $h=f\circ g:\mathbb{R}^n\rightarrow \mathbb{R}^n$, $h(x) = A^TAx$, is injective. Since the domain and codomain have the same finite dimension $n$, $h$ is also surjective. Hence $h$ is invertible, and so is the matrix $A^TA$.
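The computational heart of this argument is the identity $x^T(A^TA)x=\|Ax\|^2$, which forces $A^TAx=0 \Rightarrow Ax=0 \Rightarrow x=0$ when the columns are independent. A quick sketch of my own illustrating both the identity and the resulting invertibility:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))   # independent columns almost surely

# Identity behind injectivity: x^T (A^T A) x = ||A x||^2,
# so A^T A x = 0 forces A x = 0, and independent columns force x = 0.
x = rng.standard_normal(3)
print(np.isclose(x @ (A.T @ A) @ x, np.linalg.norm(A @ x)**2))  # True

# A^T A is therefore invertible: solving with it recovers x exactly.
b = A.T @ A @ x
x_rec = np.linalg.solve(A.T @ A, b)
print(np.allclose(x, x_rec))   # True
```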