
I've been wondering about a generalization of this, in my opinion, beautiful theorem, and I tried to write it down, but I don't know if it's correct.

Theorem:

Let $f_n:(a,b) \rightarrow \mathbb R$ be differentiable functions such that:

1) There exists $g:(a,b) \rightarrow \mathbb R$ such that $f'_n \rightarrow g$ uniformly over $(a,b)$.

2) There exists $x_0 \in (a,b)$ such that the sequence $\{f_n(x_0)\}$ converges.

Then there exists $f:(a,b) \rightarrow \mathbb R$ such that $f_n \rightarrow f$ uniformly over $(a,b)$, $f$ is differentiable, and $f'=g$.
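
(To see why hypothesis 1) asks for uniform convergence of the derivatives, here is a standard example that I add just for illustration: on $(-1,1)$ take $f_n(x)=\sqrt{x^2+1/n}$. Then $f_n(0)\to 0$ and $f_n\to |x|$ uniformly, but the limit is not differentiable at $0$; the derivatives $f_n'(x)=x/\sqrt{x^2+1/n}$ converge only pointwise to the sign function, not uniformly.)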

Now my attempt of the generalization is the following:

Let $A\subset \mathbb R^n$ be open, convex and bounded, and let $f_n:A \rightarrow \mathbb R^m$ be differentiable.

1) There exists $H: \mathbb R^n\rightarrow \mathbb R^m$ such that $T_n \rightarrow H$, where $T_n$ is the differential of $f_n$.

2) There exists $x_0 \in A$ such that the sequence $\{f_n(x_0)\}$ converges.

Then there exists $f:A \rightarrow \mathbb R^m$ such that $f_n \rightarrow f$ uniformly over $A$, $f$ is differentiable, and its differential is $H$.
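
(Just to make the statement concrete, here is the kind of example I have in mind, with $m=1$: on the open unit ball $A\subset\mathbb R^2$ take $f_n(x,y)=x^2+y^2+\frac{\sin(nx)}{n^2}$. Then the differentials converge uniformly, since $\nabla f_n(x,y)=\left(2x+\frac{\cos(nx)}{n},\,2y\right)\to(2x,2y)$ uniformly on $A$, and $f_n(0,0)=0$ converges; indeed $f_n\to f(x,y)=x^2+y^2$ uniformly on $A$ and $\nabla f=(2x,2y)$.)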

My questions are: Is my generalization correct, or is it wrong?

If it's correct, could it have been written in a more elegant way?

Also, what would the proof of the generalized theorem be?

:)

  • I'm confused. What algebra are you referring to? Wouldn't you rather want to simply take $x_0\in A$? Besides a rather large number of typos and the algebra thing it sounds fine. – tomasz Apr 28 '17 at 01:26
  • I would change 1) in your generalization to: 1) There exists $g: \mathbb R^n\rightarrow L(\mathbb R^n,\mathbb R^m)$ such that $Df_n(x) \rightarrow g$ uniformly on $A.$ Here $L(\mathbb R^n,\mathbb R^m)$ is the normed vector space of linear transformations from $\mathbb R^n$ to $\mathbb R^m,$ with, say, the operator norm.

    The conclusion would then be: Then there exists $f:A \rightarrow \mathbb R^m$ such that $f_n \rightarrow f$ uniformly on $A,$ where $f$ is differentiable and $Df(x) = g(x)$ for all $x\in A.$

    – zhw. Apr 30 '17 at 03:31
  • Why would you change 1)? I mean, is it more convenient or more general? @zhw. – Aaron Martinez Apr 30 '17 at 03:37

1 Answer


Assume $m=1$ and that $\nabla f_n\to H$ uniformly on $A$ (see the comments below). Fix $x_0\in A$ and let $\varepsilon>0$. For $x\in A$ with $x\neq x_0$ write
\begin{align*}
\frac{f(x)-f(x_{0})-H(x_{0})\cdot(x-x_{0})}{\Vert x-x_{0}\Vert} & =\frac{f(x)-f(x_{0})-[f_{n}(x)-f_{n}(x_{0})]}{\Vert x-x_{0}\Vert}\\
& \quad+\frac{f_{n}(x)-f_{n}(x_{0})-\nabla f_{n}(x_{0})\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}+\frac{(\nabla f_{n}(x_{0})-H(x_{0}))\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}\\
& =:I+II+III.
\end{align*}

Since $A$ is convex, by applying the mean value theorem to the function
$$
g_{n,m}(t)=f_{m}(tx+(1-t)x_{0})-f_{n}(tx+(1-t)x_{0}),\quad t\in[0,1],
$$
there is $t_{0}\in(0,1)$ such that
\begin{align*}
f_{m}(x)-f_{m}(x_{0})-[f_{n}(x)-f_{n}(x_{0})] & =g_{n,m}(1)-g_{n,m}(0)\\
& =g_{n,m}^{\prime}(t_{0})=(\nabla f_{m}(z_{0})-\nabla f_{n}(z_{0}))\cdot(x-x_{0}),
\end{align*}
where $z_{0}=t_{0}x+(1-t_{0})x_{0}$. By uniform convergence of the gradients,
$$
\Vert\nabla f_{m}(z)-\nabla f_{n}(z)\Vert\leq\Vert\nabla f_{m}(z)-H(z)\Vert+\Vert\nabla f_{n}(z)-H(z)\Vert\leq2\varepsilon
$$
for all $n,m\geq n_{\varepsilon}$ and all $z\in A$. Hence, by Cauchy's inequality,
\begin{align*}
\left\vert\frac{f_{m}(x)-f_{m}(x_{0})-[f_{n}(x)-f_{n}(x_{0})]}{\Vert x-x_{0}\Vert}\right\vert & =\left\vert\frac{(\nabla f_{m}(z_{0})-\nabla f_{n}(z_{0}))\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}\right\vert\\
& \leq\Vert\nabla f_{m}(z_{0})-\nabla f_{n}(z_{0})\Vert\leq2\varepsilon.
\end{align*}
Since $A$ is bounded, say $\Vert x-x_{0}\Vert\leq M$ for all $x\in A$, this inequality implies that
$$
\vert f_{m}(x)-f_{n}(x)\vert\leq\vert f_{m}(x_{0})-f_{n}(x_{0})\vert+2\varepsilon\Vert x-x_{0}\Vert\leq\vert f_{m}(x_{0})-f_{n}(x_{0})\vert+2M\varepsilon,
$$
so $\{f_n\}$ is a uniform Cauchy sequence and therefore converges uniformly to a function $f$. Letting $m\rightarrow\infty$ we get
$$
\left\vert\frac{f(x)-f(x_{0})-[f_{n}(x)-f_{n}(x_{0})]}{\Vert x-x_{0}\Vert}\right\vert\leq2\varepsilon
$$
for all $n\geq n_{\varepsilon}$. This takes care of $I$.

Taking $n=n_{\varepsilon}$ and using the fact that $f_{n_{\varepsilon}}$ is differentiable at $x_{0}$, we get
$$
\left\vert\frac{f_{n_{\varepsilon}}(x)-f_{n_{\varepsilon}}(x_{0})-\nabla f_{n_{\varepsilon}}(x_{0})\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}\right\vert\leq\varepsilon
$$
for all $x\in A$ with $0<\Vert x-x_{0}\Vert\leq\delta_{\varepsilon}$. This takes care of $II$.

Lastly, by Cauchy's inequality,
$$
\left\vert\frac{(\nabla f_{n}(x_{0})-H(x_{0}))\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}\right\vert\leq\Vert\nabla f_{n}(x_{0})-H(x_{0})\Vert\leq\varepsilon
$$
for all $n\geq n_{\varepsilon}$, which takes care of $III$. In conclusion, for all $x\in A$ with $0<\Vert x-x_{0}\Vert\leq\delta_{\varepsilon}$,
$$
\left\vert\frac{f(x)-f(x_{0})-H(x_{0})\cdot(x-x_{0})}{\Vert x-x_{0}\Vert}\right\vert\leq4\varepsilon,
$$
which implies that $f$ is differentiable at $x_{0}$ with $\nabla f(x_{0})=H(x_{0})$. By repeating the proof with $x_0$ replaced by any other point, we get that $f$ is differentiable in $A$.
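
(As noted in the comments below, the argument above is for the scalar case $m=1$. The case $m>1$ should follow by applying it componentwise: writing $f_n=(f_n^{1},\dots,f_n^{m})$, uniform convergence of the differentials $Df_n$ is equivalent to uniform convergence of each gradient $\nabla f_n^{i}$, so each component $f^{i}$ is differentiable with $\nabla f^{i}(x_0)=H^{i}(x_0)$, and hence $Df(x_0)=H(x_0)$.)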

Gio67
  • Wow, I'd like to give you the +100 points now because it seems like a lot of effort, but first I need to understand it. – Aaron Martinez May 04 '17 at 13:43
  • The proof is very similar to the one-dimensional case. – Gio67 May 04 '17 at 14:38
  • So in both cases it's hard to understand.. :( – Aaron Martinez May 04 '17 at 14:58
  • yeah, you should read the 1 dim case first. It's here https://math.stackexchange.com/questions/214218/uniform-convergence-of-derivatives-tao-14-2-7 – Gio67 May 04 '17 at 15:27
  • I'll compare it with my own proof (my professor did it) thanks for the link. – Aaron Martinez May 04 '17 at 15:34
  • Can you explain in more detail the part where you say that $\{f_n\}$ is a uniform Cauchy sequence, please? I don't see it. I tried to understand it, but I get the conclusion that $f_m(x)-f_n(x)$ is a Cauchy sequence. But I think that's not what we want to conclude. – Aaron Martinez May 18 '17 at 16:36
  • You have that for every $x\in A$, $$\vert f_{m}(x)-f_{n}(x)\vert\le |f_{m}(x_{0})-f_{n}(x_{0})|+2M\varepsilon,$$ and so $$\sup_{x\in A}\vert f_{m}(x)-f_{n}(x)\vert\le |f_{m}(x_{0})-f_{n}(x_{0})|+2M\varepsilon.$$ Since $f_{n}(x_{0})\to \ell$ as $n\to\infty$, you have that $\sup_{x\in A}\vert f_{m}(x)-f_{n}(x)\vert\le (1+2M)\varepsilon$ for all $n,m\ge k_\varepsilon$, which implies that the sequence $\{f_n\}$ is a Cauchy sequence in the space of bounded continuous functions. – Gio67 May 19 '17 at 11:39
  • I see, thank you for the explanation – Aaron Martinez May 19 '17 at 23:46
  • I was checking the proof again and I don't understand the part 'by uniform convergence of the gradients'. So did you view the gradient as a function? How come? Because by definition $\nabla f(x,y,z)= \left( \frac {\partial f}{\partial x},\frac {\partial f}{\partial y},\frac {\partial f}{\partial z}\right)$ – Aaron Martinez May 21 '17 at 22:48
  • I think I know why now, because $\nabla f_m(z_0)= \frac {f_m(x)-f_m(x_0)}{||x-x_0||}$, am I correct? – Aaron Martinez May 21 '17 at 23:05
  • The gradient is a vector, it has $n$ components, but you can still talk about uniform convergence of vector-valued functions. It just means $$\lim_{n\to\infty}\sup_{x\in E}\Vert g_n(x)-g(x)\Vert=0,$$ where you take the norm in $\mathbb{R}^n$. Otherwise you can just say that all the partial derivatives converge uniformly. – Gio67 May 22 '17 at 02:12
  • uniform convergence of the gradients means $$\lim_{n\to\infty}\sup_{x\in A}\Vert\nabla f_n(x)-\nabla f(x)\Vert=0.$$ Why do you find it so strange? We use it all the time. – Gio67 May 22 '17 at 02:48
  • Not me, I hadn't heard about it, but it's nice to know it, thank you – Aaron Martinez May 22 '17 at 02:50
  • Is uniform convergence of gradients a theorem? – Aaron Martinez May 22 '17 at 02:56
  • Instead of hypothesis 1) in your statement you need to assume that $\nabla f_n\to H$ uniformly. You need this hypothesis even in one dimension. Also, in my answer I treated the case $m=1$, that is, $f_n:A\to\mathbb{R}$. – Gio67 May 23 '17 at 01:37
  • Oh, that explains the uniform convergence of gradients. I hadn't noticed that I need uniform convergence as a hypothesis. – Aaron Martinez May 23 '17 at 01:43
  • I offered a +100 bounty again for this proof :D haha, but this time I'm asking for a more detailed explanation. I'm telling you just in case you want to participate in it. – Aaron Martinez Jun 06 '17 at 22:00