
Let $V$ be a vector space and $f, g_1, g_2, \ldots, g_m \in V^*$ elements of its dual space. Prove that $f$ is a linear combination of the $g_i$'s if and only if $\bigcap_i \ker g_i \subseteq \ker f$.

I already understand that if $\bigcap_i \ker g_i = \{0\}$, then the $g_i$'s span the dual space, and therefore $f$ is a linear combination of the $g_i$'s. I also managed to prove the forward direction (if $f$ is a linear combination of the $g_i$'s, then $\bigcap_i \ker g_i \subseteq \ker f$, since for any $v \in \bigcap_i \ker g_i$ we have $f(v) = a_1 g_1(v) + a_2 g_2(v) + \cdots + a_m g_m(v) = 0$). But I'm not finding a way to prove the reverse direction. Tried ChatGPT but got only nonsense. Can someone give me a hint or point to a duplicate (I really didn't find one here)?

Thanks in advance.

TShiong

1 Answer


In a comment it says we can assume $V$ is finite dimensional, so choose a basis $v_1, \ldots, v_n$ of $V$ and let $v_1^*, \ldots, v_n^*$ be the dual basis. Now both $V$ and $V^*$ are isomorphic to $F^n$, and under this identification the evaluation map $V^* \times V \rightarrow F$ becomes the dot product.

Writing $f = \sum_{i=1}^m a_i g_i$ in coordinates gives a system of $n$ linear equations in the $m$ unknowns $a_1, \ldots, a_m$. Gaussian elimination either produces a solution (which shows that $f$ is a linear combination of the $g_i$), or outputs a vector whose dot product with $f$ is nonzero but whose dot products with all the $g_i$ are zero. Such a vector lies in $\bigcap_i \ker g_i$ but not in $\ker f$.
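
To make the dichotomy concrete, here is a minimal numerical sketch over $\mathbb{R}$ (my own illustration, not code from the answer: it uses a least-squares projection in place of hand-run Gaussian elimination, and the function name `combination_or_witness` is made up). The rows of `G` are the coordinate vectors of the $g_i$. The least-squares residual of $f$ is orthogonal to every $g_i$, so it lies in $\bigcap_i \ker g_i$, while its dot product with $f$ equals its squared norm; hence a nonzero residual is exactly the witness vector described above.

```python
import numpy as np

def combination_or_witness(G, f, tol=1e-10):
    """G: (m, n) array whose rows are the coordinate vectors of g_1..g_m;
    f: (n,) coordinate vector of f.  Returns either
    ("combination", a) with f = sum_i a[i] g_i, or
    ("witness", v) with g_i(v) = 0 for all i but f(v) != 0."""
    # Least squares: choose a minimizing ||G^T a - f||_2.
    a, *_ = np.linalg.lstsq(G.T, f, rcond=None)
    r = f - G.T @ a  # residual: the part of f orthogonal to span(g_1..g_m)
    if np.linalg.norm(r) < tol:
        return "combination", a  # f = sum_i a[i] g_i (up to tol)
    # r is orthogonal to every row of G, so G @ r = 0, i.e. r lies in the
    # intersection of the kernels; and f . r = ||r||^2 > 0, so f(r) != 0.
    return "witness", r

# Demo: g_1, g_2 pick out the first two coordinates of R^3.
G = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
print(combination_or_witness(G, np.array([2.0, 3.0, 0.0])))
# ('combination', array([2., 3.]))  -- f = 2 g_1 + 3 g_2
print(combination_or_witness(G, np.array([0.0, 0.0, 1.0])))
# ('witness', array([0., 0., 1.]))  -- in ker g_1 and ker g_2, yet f(v) = 1
```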

David Lui
  • Got everything except the last part. I don't understand how, if the system doesn't have a solution, Gaussian elimination outputs some vector (whose coordinates are the $a_i$) whose dot product with $f$ is nonzero but zero with all the $g_i$. I will think about it a bit more. – Victor Hugo Apr 10 '23 at 12:26
  • Got it, thanks! I will comment the details later. – Victor Hugo Apr 10 '23 at 13:03
  • Gaussian elimination does row operations to try to find a solution. If there is no solution, then Gaussian elimination produces a combination of row operations that, when applied to the matrix, results in zero, but when applied to the right-hand-side vector, does not. However, row operations correspond to left multiplication by a row vector, and left multiplying by a row vector means taking the dot product of that vector with the columns of the matrix. – David Lui Apr 10 '23 at 15:11
  • Better to look for duplicates (see above) than to duplicate answers. And actually, finite dimensionality is dispensable. – Anne Bauval Apr 11 '23 at 11:35