Let f be a function that has a finite limit at infinity. This alone is not enough to show that its derivative converges to zero at infinity. So I was wondering whether there are any additional conditions on f that could give the desired outcome. I am also aware of Barbalat's Lemma, but this requires uniform continuity, a property which on many occasions is not easy to verify. Thank you
Viewed 886 times
2 Answers
1
Without loss of generality, we may assume $f(x) \to 0$ as $x \to \infty$.
- If $\lim_{x \to \infty} f'$ exists, then it's clear that $f' \to 0$.
- If $f''$ is bounded, then $f' \to 0$. Morally, this is because $f$ must oscillate more and more tightly as $x \to \infty$ in order to have $f' \not\to 0$, and $f''$ being bounded prevents that oscillation.
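To see why some extra hypothesis is needed, here is a quick numerical sketch (my own example, not from the answer above) using $f(x) = \sin(x^2)/x$: it tends to $0$, but $f'' $ is unbounded and $f'(x) = 2\cos(x^2) - \sin(x^2)/x^2$ keeps returning to values near $2$.

```python
import math

# Counterexample sketch: f(x) = sin(x^2)/x tends to 0 as x -> infinity,
# but f'(x) = 2*cos(x^2) - sin(x^2)/x^2 does NOT tend to 0, because
# f'' is unbounded (the oscillation of f tightens without bound).

def f(x):
    return math.sin(x * x) / x

def f_prime(x):
    # derivative of sin(x^2)/x by the quotient rule
    return 2.0 * math.cos(x * x) - math.sin(x * x) / (x * x)

# sample at x_k = sqrt(2*pi*k), where cos(x^2) = 1 and sin(x^2) = 0
for k in (10, 1000, 100000):
    x = math.sqrt(2.0 * math.pi * k)
    print(f"x = {x:9.2f}   f(x) = {f(x):+.2e}   f'(x) = {f_prime(x):+.4f}")
```

At these sample points $f(x)$ shrinks toward $0$ while $f'(x)$ stays near $2$, so the derivative has no limit at infinity.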

davidlowryduda
- 91,687
The second part is covered here: http://math.stackexchange.com/q/730411/72031. And +1 for adding this point. – Paramanand Singh Mar 23 '16 at 04:07
1
In addition to the points covered in mixedmath's answer, we can add the hypothesis that $f'$ is monotone. Monotonicity implies that $f'(x)$ tends either to $\infty$, to $-\infty$, or to a finite limit $L$ as $x \to \infty$. By the mean value theorem, $f(x + 1) - f(x) = f'(c)$ for some $c$ with $x < c < x + 1$, and the left side tends to $0$; hence $f'(c) \to 0$ along these points, which rules out $f'$ tending to $\pm\infty$ and forces $L = 0$. Hence $f'(x) \to 0$ as $x \to \infty$.
The monotonicity of $f'$ can be guaranteed by assuming that $f''$ is of constant sign for all $x$ after a certain value.

Paramanand Singh
- 87,309