I would like to prove the theorem given below:
Let $V$ be a vector space with $\dim(V)=n$, let $U$ be a proper subspace of $V$ (i.e. $U\subset V$ and $U\neq V$) and let $v\in V\setminus U$. Then there exists an $f\in V^*$ s.t. $f(u)=0$ for all $u\in U$ and $f(v)=1$.
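To make the statement concrete (a small example I picked myself, not part of the problem): for $V=\mathbb{R}^2$, $U=\{(x,0):x\in\mathbb{R}\}$ and $v=(0,1)$, the functional $f(x,y)=y$ vanishes on $U$ and satisfies $f(v)=1$.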
I started by taking a basis $B_u=(u_1,\cdots, u_k),\,(k<n)$ for $U$, which can be extended to a basis $B_v=(u_1,\cdots,u_k,u_{k+1},\cdots,u_n)$ for $V$. Then there exists a dual basis $B^*=(v_1^*,\cdots,v_n^*)$ s.t. $v_i^*(u_j)=\delta_{i,j}$.
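In particular, if I am not mistaken, every $f\in V^*$ can then be written in this dual basis as
$$f=\sum\limits_{j=1}^n f(u_j)\,v_j^*,$$
since both sides agree on every basis vector $u_i$; this is the expansion I use in the next step.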
Using these definitions, I started to compute $f(u)$ for an arbitrary $u\in U$ (with $f\in V^*$ still to be determined):
$$f(u)=f\Big(\sum\limits_{i=1}^{k}\lambda_i u_i\Big)=\sum\limits_{i=1}^k\lambda_i f(u_i)=\sum\limits_{i=1}^k \lambda_i \Big(\sum\limits_{j=1}^nf(u_j)v_j^*(u_i)\Big)$$
At this point I am struggling to find the next steps. It looks like the result should follow from the $v_j^*(u_i)$ factors, since each of them evaluates to either $1$ or $0$, but I am not sure how to argue that the whole expression is $0$, nor, if I consider $v$ instead, i.e. write $v=\sum\limits_{i=1}^n \mu_i u_i$ and extend the outer sum accordingly, that it is $1$.
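Concretely, for $v$ I would get
$$f(v)=\sum\limits_{i=1}^n\mu_i f(u_i)=\sum\limits_{i=1}^n \mu_i \Big(\sum\limits_{j=1}^nf(u_j)v_j^*(u_i)\Big)=\sum\limits_{i=1}^n\mu_i f(u_i),$$
where the last step uses $v_i^*(u_j)=\delta_{i,j}$ to collapse the inner sum to $f(u_i)$; so, unless I am overlooking something, this expansion only returns $f(v)$ and does not tell me how to force the value $1$.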
I would be very grateful if someone could help me.