
Let $f(x)$ be twice differentiable on $(0,\infty)$, suppose $\lim_{x\to \infty} f(x) = L$ exists and is finite, and suppose $|f''(x)| \le M$ for some $M>0$.

Prove that $\lim_{x \to \infty} f'(x) = 0$.

I've tried to use Taylor's theorem with remainder in a lot of different ways and am still unable to crack this one.

  • @Bryan how is that a duplicate? The hypotheses are fundamentally different, aren't they? – Ant Jun 27 '14 at 20:17
  • If you look at the amazing solution by @Bill Dubuque, you don't need any hypothesis on $f''$. – mfl Jun 27 '14 at 20:18
  • Nor the mentioned answer. – mfl Jun 27 '14 at 20:36
  • @mfl The mentioned answer uses the assumption that $f'$ has a limit. –  Jun 27 '14 at 20:38
  • To the Op, you may not even be able to give a Taylor expansion (in general). – Squirtle Jun 27 '14 at 21:00
  • @mfl: I saw related question and Bill's answer there. But that is completely different from the question posted here. In Bill's answer there is the assumption that $f(x) + f'(x) \to L$. Here the assumptions are: $f(x) \to L$ and $f''(x)$ is bounded. And your discussion is about "whether this bounded nature of $f''(x)$ is needed or not". Obviously it is needed. Just with $f(x) \to L$ you can't conclude that $f'(x) \to 0$. Note that the condition $f(x) + f'(x) \to L$ is much stronger than $f(x) \to L$. – Paramanand Singh Jun 28 '14 at 04:23
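As a concrete check of the last comment's point that boundedness of $f''$ really is needed, here is a small numerical sketch (the choice $f(x)=\sin(x^2)/x$ is my illustration, not from the thread): this $f$ tends to $0$, but $f'(x)=2\cos(x^2)-\sin(x^2)/x^2$ keeps returning to $2$, and $f''$ is unbounded.

```python
import math

# Illustrative counterexample (hypothetical choice, not from the thread):
# f(x) = sin(x^2)/x satisfies f(x) -> 0, but f'(x) = 2*cos(x^2) - sin(x^2)/x^2
# does not tend to 0; here f'' is unbounded, so the theorem's hypothesis fails.

def f(x):
    return math.sin(x * x) / x

def f_prime(x):
    return 2 * math.cos(x * x) - math.sin(x * x) / (x * x)

# At x_n = sqrt(2*pi*n) we have cos(x_n^2) = 1 and sin(x_n^2) = 0, so
# f(x_n) = 0 while f'(x_n) = 2: f returns to its limit with slope 2.
for n in (10, 100, 1000):
    x = math.sqrt(2 * math.pi * n)
    print(n, f(x), f_prime(x))
```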

3 Answers


First note that $\int_0^\infty f'(x)\,dx = L-f(0)$, since $L$ is the limit of $f(x)$ as $x\to\infty$. Note also that $f'$ is continuous; moreover, it is Lipschitz (hence uniformly continuous) by the mean value theorem. Specifically, we have $$|f'(x) - f'(y)| = |f''(\xi)|\,|x-y| \le M|x-y|$$ for some $\xi$ between $x$ and $y$.

From here we may use the answer to the linked problem (Barbalat's lemma: if $\int_0^\infty g(x)\,dx$ converges and $g$ is uniformly continuous, then $g(x)\to 0$ as $x\to\infty$) to complete the proof.
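A quick numerical sanity check of the Lipschitz estimate above, with the hypothetical choice $f(x)=e^{-x}\sin x$ (so $L=0$ and $f''(x)=-2e^{-x}\cos x$ gives $|f''|\le M=2$):

```python
import math

# Check |f'(x) - f'(y)| <= M|x - y| for the illustrative example
# f(x) = e^{-x} sin x, where f'(x) = e^{-x}(cos x - sin x) and
# f''(x) = -2 e^{-x} cos x, so |f''| <= M = 2.

M = 2.0

def f_prime(x):
    return math.exp(-x) * (math.cos(x) - math.sin(x))

for i in range(1, 400):
    x = i * 0.07
    for y in (x + 0.01, x + 0.5, x + 3.0):
        assert abs(f_prime(x) - f_prime(y)) <= M * abs(x - y) + 1e-12
print("Lipschitz bound holds on all sampled pairs")
```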

Joel

Let $x\in(0,+\infty)$ and $h>0$ such that $x-h>0$. By the Taylor–Lagrange formula, there exist $c\in(x,x+h)$ and $c'\in(x-h,x)$ such that $$f(x+h)=f(x)+hf'(x)+\frac{h^2}2f''(c)$$ and $$f(x-h)=f(x)-hf'(x)+\frac{h^2}2f''(c').$$ Subtracting the second equality from the first yields $$f(x+h)-f(x-h)=2hf'(x)+\frac{h^2}2\bigl(f''(c)-f''(c')\bigr),$$ i.e., $$f'(x)=\frac{f(x+h)-f(x-h)}{2h}-\frac{h}4\bigl(f''(c)-f''(c')\bigr),$$ hence $$\bigl\lvert f'(x)\bigr\rvert\leq\left\lvert\frac{f(x+h)-f(x-h)}{2h}\right\rvert+\frac{hM}2.$$

Since $\lim\limits_{x\to+\infty}f(x)=L$, there exists $A>0$ such that $$\forall t>A,\ L-h^2<f(t)<L+h^2.$$ So if $x>A+h=A'$ (in order to have $x+h>A$ and $x-h>A$), $$\left\lvert\frac{f(x+h)-f(x-h)}{2h}\right\rvert\leq h$$ and hence $$\bigl\lvert f'(x)\bigr\rvert\leq h+\frac{hM}2=h\left(1+\frac M2\right).$$

Conclusion: for all $h>0$ there exists $A'>0$ such that $$\forall x>A',\ \bigl\lvert f'(x)\bigr\rvert\leq h\left(1+\frac M2\right).$$ Since $h>0$ is arbitrary, $\lim\limits_{x\to+\infty}f'(x)=0$.
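The key inequality $\lvert f'(x)\rvert\le\bigl\lvert\frac{f(x+h)-f(x-h)}{2h}\bigr\rvert+\frac{hM}{2}$ can be checked numerically; as before I use the hypothetical example $f(x)=e^{-x}\sin x$ with $M=2$:

```python
import math

# Verify |f'(x)| <= |f(x+h) - f(x-h)|/(2h) + h*M/2 at sampled points
# for the illustrative example f(x) = e^{-x} sin x (L = 0, |f''| <= M = 2).

M = 2.0

def f(x):
    return math.exp(-x) * math.sin(x)

def f_prime(x):
    return math.exp(-x) * (math.cos(x) - math.sin(x))

for x in (1.0, 5.0, 10.0, 20.0):
    for h in (0.5, 0.1, 0.01):
        if x - h > 0:
            bound = abs(f(x + h) - f(x - h)) / (2 * h) + h * M / 2
            assert abs(f_prime(x)) <= bound + 1e-12
print("central-difference bound verified at sampled points")
```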


Note: the Kolmogorov inequality $$M_1\leq\sqrt{2M_0M_2}$$ where $M_k=\bigl\lVert f^{(k)}\bigr\rVert_\infty$ can be proved in a very similar way.
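For completeness, here is the optimization step this hints at, written out (a sketch following the same central-difference estimate as above): since $\bigl\lvert\frac{f(x+h)-f(x-h)}{2h}\bigr\rvert\le\frac{M_0}{h}$,

```latex
% From the central-difference estimate, for every h > 0:
\[
\lvert f'(x)\rvert \le \frac{M_0}{h} + \frac{h}{2}M_2 .
\]
% Minimizing the right-hand side over h (its derivative -M_0/h^2 + M_2/2
% vanishes at h = \sqrt{2M_0/M_2}) and substituting back gives
\[
\lvert f'(x)\rvert \le \sqrt{\frac{M_0 M_2}{2}} + \sqrt{\frac{M_0 M_2}{2}}
  = \sqrt{2\,M_0 M_2},
\qquad\text{hence } M_1 \le \sqrt{2\,M_0 M_2}.
\]
```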

gniourf_gniourf

Suppose that there is a sequence $x_{n}\rightarrow\infty$ such that $\left|f'(x_{n})\right|>\epsilon$ for some $\epsilon>0$. Passing to a subsequence if necessary, we may assume that $f'(x_{n})>\epsilon$ for all $n$ (the case $f'(x_{n})<-\epsilon$ is symmetric). Taking $M$ larger if necessary, we may assume that $M>\epsilon$. Fix $\delta>0$ (to be chosen below) and let $c>0$ be such that $x\geq c\Rightarrow \left|f(x)-L\right|<\delta$. Deleting finitely many terms if necessary, we may assume that $x_{n}\geq c$ for all $n$.

Since $|f''|\leq M$, the derivative $f'$ is $M$-Lipschitz, so for all $t\in (x_{n}-\frac{\epsilon}{4M},x_{n}+\frac{\epsilon}{4M})$ we have $f'(t)\geq f'(x_{n})-M\cdot\frac{\epsilon}{4M}>\epsilon-\frac{\epsilon}{4}>\frac{\epsilon}{2}$. Hence,

$$f\left(x_{n}+\dfrac{\epsilon}{4M}\right)\geq L-\delta+\dfrac{\epsilon}{2}\cdot\dfrac{\epsilon}{4M}$$

If we choose $\delta<\epsilon^{2}/16M$, then

$$f\left(x_{n}+\dfrac{\epsilon}{4M}\right)\geq L+\dfrac{\epsilon^{2}}{16M}>L+\delta$$

a contradiction, since $x_{n}+\frac{\epsilon}{4M}\geq c$ forces $f\left(x_{n}+\frac{\epsilon}{4M}\right)<L+\delta$.
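The quantitative step in this argument can be illustrated numerically (a sketch with hypothetical values $\epsilon=0.4$, $M=2$, and the slowest-decaying derivative allowed by $|f''|\le M$): if $f'(x_0)\ge\epsilon$, then $f'\ge\epsilon/2$ on $[x_0,x_0+\frac{\epsilon}{4M}]$, so $f$ gains at least $\epsilon^2/(8M)$ there.

```python
# Sketch of the proof's quantitative step: if f'(x0) >= eps and |f''| <= M,
# then f' >= eps/2 on [x0, x0 + eps/(4M)], so f gains at least
# (eps/2) * (eps/(4M)) = eps^2/(8M) over that interval.
# Hypothetical worst case: f'(t) = eps - M*(t - x0), decaying as fast as allowed.

eps, M = 0.4, 2.0
x0 = 0.0
width = eps / (4 * M)

def fp(t):
    return eps - M * (t - x0)   # derivative decays at the maximal rate M

# f' stays above eps/2 on the whole interval
assert all(fp(x0 + k * width / 100) >= eps / 2 for k in range(101))

# midpoint-rule value of the gain integral (exact here, since fp is linear)
n = 1000
gain = sum(fp(x0 + (k + 0.5) * width / n) * (width / n) for k in range(n))
assert gain >= eps**2 / (8 * M)
print("gain =", gain, ">= eps^2/(8M) =", eps**2 / (8 * M))
```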