
Let $a < b$ be real numbers and $f_1,...,f_n:[a,b] \to \mathbb R$ continuous functions. Define an $n$ by $n$ matrix $M = (m_{ij})_{i,j = 1,...,n}$ where $$m_{ij} = \int_a^bf_i(x)f_j(x)dx. $$ Prove that $$\det(M) = 0 \iff f_1,...,f_n \text{ are linearly dependent}.$$

Since the matrix $M$ is symmetric, we know that it is orthogonally diagonalizable. If $M$ is a diagonal matrix with $\det(M) = 0$, then some $m_{ii}$ is zero, which means (since $f_i^2$ is nonnegative and continuous) that $f_i$ is zero, so the $f_i$'s are linearly dependent. Conversely, if one of the $f_i$ can be expressed as a linear combination of the others, say $f_n = \sum_1^{n-1}c_kf_k$, then we may use the fact that $M$ is diagonal to deduce algebraically that the determinant is zero.

However, I am not sure how to get from this special case to the general one. Looking at the problem for $n = 2$, it seems like the Cauchy–Schwarz inequality may come up somewhere, but I am not sure exactly how.
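(For $n = 2$, the determinant is $$\det(M)=\left(\int_a^b f_1^2\right)\left(\int_a^b f_2^2\right)-\left(\int_a^b f_1f_2\right)^2,$$ which Cauchy–Schwarz says is nonnegative, and zero exactly when $f_1$ and $f_2$ are proportional; I do not see how to extend this to general $n$.)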

I would like to know how to solve this problem, and also what the motivation would be for someone to come up with it.

EDIT: Thank you for the link given by @AnneBauval. However, I found the answers there insufficient for my understanding. When we write $G = A^TA$, are we not assuming that we are simply using the standard inner product as we compute the product $A^TA$ entry by entry? For the matrix $M$ defined above, what would a matrix $A$ satisfying $G = A^TA$ look like? The link from @AnneBauval gave me a greater sense of the problem, though, and I thank her for that.

  • Does this answer your question? Gram matrix invertible iff set of vectors linearly independent (considering the natural inner product on $C[a,b]$) – Anne Bauval Nov 13 '22 at 09:22
  • @AnneBauval Thank you for your link. Although it did give me many insights, I am still unsure how we can define an inner product that is not the standard dot product and then proceed to let $G = A^TA$. By writing $G = A^TA$, are we not automatically assuming that we are simply doing the standard dot product as we calculate $A^TA$ entry by entry? If you could kindly provide some insight into this, that would be great. – Squirrel-Power Nov 13 '22 at 18:57
  • @CuteBrownie The result holds for any inner product. Note that computing an inner product relative to an orthonormal basis for that product reduces to the ordinary dot product of the coordinates. – blargoner Nov 13 '22 at 19:20

2 Answers


$M$ is the Gram matrix of $(f_1,\dots,f_n),$ i.e. $m_{i,j}=f_i\cdot f_j,$ for the inner product on $C([a,b])$ defined by $$f\cdot g=\int_a^bf(x)g(x)\,\mathrm dx.$$ Let us now forget about $C([a,b]),$ and prove that for any family $(f_1,\dots,f_n)$ of vectors in any inner product space, its Gram matrix $M\in M_n(\Bbb R)$ is singular iff the $n$ vectors are linearly dependent.

$\Rightarrow:$ if $M$ is singular, let $v=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}\in M_{n,1}(\Bbb R)$ be a non-zero vector of its kernel, i.e. $Mv=0.$ Then, $$0=v^TMv=\sum_{i,j}a_im_{i,j}a_j=\sum_{i,j}a_i(f_i\cdot f_j)a_j=w\cdot w$$ where $$w:=\sum_ia_if_i.$$ From $\|w\|^2=0,$ we deduce that $w=0,$ which (since $v\ne0$) proves that $f_1,\dots,f_n$ are linearly dependent.

$\Leftarrow:$ if $f_1,\dots,f_n$ are linearly dependent, then $\sum_ja_jf_j=0$ for some non-zero $v=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}\in M_{n,1}(\Bbb R).$ Then, $f_i\cdot\sum_ja_jf_j=0$ for all $i,$ i.e. $Mv=0,$ which (since $v\ne0$) proves that $M$ is singular.
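Not part of the proof, but if a numerical sanity check helps intuition, one can approximate the Gram matrix for explicit functions and look at its determinant. The sketch below assumes NumPy and SciPy are available; `gram_matrix` is just an illustrative helper, not anything from the question.

    # Quick numerical check (illustration only): build the Gram matrix
    # m_ij = integral of f_i(x) f_j(x) over [a, b] and inspect det(M).
    import numpy as np
    from scipy.integrate import quad

    def gram_matrix(funcs, a, b):
        n = len(funcs)
        M = np.empty((n, n))
        for i in range(n):
            for j in range(n):
                # numerically integrate f_i(x) * f_j(x) over [a, b]
                M[i, j], _ = quad(lambda x: funcs[i](x) * funcs[j](x), a, b)
        return M

    # 1, x, x^2 on [0, 1]: linearly independent; the Gram matrix is the
    # 3x3 Hilbert matrix, with determinant 1/2160 (nonzero).
    independent = [lambda x: 1.0, lambda x: x, lambda x: x**2]
    # Third function = 2*f1 + 3*f2: linearly dependent, so det should vanish.
    dependent = [lambda x: 1.0, lambda x: x, lambda x: 2.0 + 3.0*x]

    print(np.linalg.det(gram_matrix(independent, 0.0, 1.0)))  # ~ 4.6e-4
    print(np.linalg.det(gram_matrix(dependent, 0.0, 1.0)))    # ~ 0 up to roundoff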

Anne Bauval

Consider an inner product space and linearly independent vectors $v_1,v_2,\ldots,v_n.$ We will show that the determinant of the Gram matrix is nonzero by induction on $n$ (for $n=1$ it is clear, since $\langle v_1,v_1\rangle>0$). Assume the claim holds for $n-1.$ Define the vector $$w = \begin{vmatrix}\langle v_1,v_1\rangle & \langle v_1,v_2\rangle &\ldots &\langle v_1,v_n\rangle\\ \langle v_2,v_1\rangle & \langle v_2,v_2\rangle &\ldots &\langle v_2,v_n\rangle\\ \vdots & \vdots&&\vdots\\ \langle v_{n-1},v_1\rangle & \langle v_{n-1},v_2\rangle &\ldots &\langle v_{n-1},v_n\rangle\\ v_1& v_2 & \ldots& v_n \end{vmatrix}$$ This vector is a linear combination of the vectors $v_1,v_2,\ldots,v_n.$ Observe that $\langle v_k,w\rangle =0$ for $k=1,2,\ldots,{n-1},$ as we obtain a determinant in which two rows are equal. The Gram determinant is equal to $\langle v_n,w\rangle.$ If it vanishes, then $w$ is orthogonal to all the vectors $v_1,v_2,\ldots,v_n.$ Hence $w$ is orthogonal to itself, i.e. $w=0.$ This means that a nontrivial linear combination of the vectors $v_1,v_2,\ldots,v_n$ equals $0,$ a contradiction. Nontriviality of the combination follows from the fact that the coefficient of $v_n$ in this linear combination is equal to the determinant of the Gram matrix of the first $n-1$ vectors, which is nonzero by the induction hypothesis.

Remark In case the vectors $v_1,v_2,\ldots,v_n$ are linearly independent, this method provides an explicit construction of the orthogonal vectors showing up in the Gram–Schmidt procedure.
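For example, for $n=2$ the formula gives $$w=\begin{vmatrix}\langle v_1,v_1\rangle & \langle v_1,v_2\rangle\\ v_1 & v_2\end{vmatrix}=\langle v_1,v_1\rangle\,v_2-\langle v_1,v_2\rangle\,v_1,$$ which is $\|v_1\|^2$ times the second vector produced by the Gram–Schmidt procedure applied to $v_1,v_2.$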