Let $f$ be a function that has a finite limit at infinity. This alone is not enough to show that its derivative converges to zero at infinity. So I was wondering whether there are any additional conditions on $f$ that would give the desired outcome. I am aware of Barbalat's lemma, but it requires uniform continuity, a property which on many occasions is not easy to verify. Thank you
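For concreteness, here is a quick numerical look at the standard counterexample $f(x) = \sin(x^2)/x$ (my choice of example, not from the question): $f(x) \to 0$, yet $f'(x) = 2\cos(x^2) - \sin(x^2)/x^2$ keeps returning to $2$, so $f' \not\to 0$.

```python
import math

# Counterexample: f(x) = sin(x^2)/x tends to 0, but
# f'(x) = 2*cos(x^2) - sin(x^2)/x^2 does not tend to 0.

def f(x):
    return math.sin(x * x) / x

def fprime(x):
    return 2 * math.cos(x * x) - math.sin(x * x) / (x * x)

# At x_k = sqrt(2*pi*k) we have f(x_k) = 0 while f'(x_k) = 2.
for k in (100, 10_000, 1_000_000):
    x = math.sqrt(2 * math.pi * k)
    print(f"x = {x:12.2f}   f(x) = {f(x):+.2e}   f'(x) = {fprime(x):+.5f}")
```

The sample points are spaced so that $x^2$ is a multiple of $2\pi$: there $f$ vanishes exactly while $f'$ sits at its peak value $2$, no matter how large $x$ gets.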

Nick

2 Answers


Without loss of generality, assume $f(x) \to 0$ as $x \to \infty$ (otherwise replace $f$ by $f - L$, where $L$ is the limit).

  1. If $\lim_{x \to \infty} f'(x)$ exists, then $f' \to 0$: if the limit were $c \neq 0$, the mean value theorem would force $f(x)$ to grow like $cx$, contradicting the finite limit of $f$.
  2. If $f''$ is bounded, then $f' \to 0$. Morally, this is because $f$ must oscillate more and more tightly as $x \to \infty$ in order for $f' \not \to 0$, and $f''$ being bounded prevents that oscillation.
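One standard way to make the second point precise (a sketch I am adding, not spelled out in the answer): by Taylor's theorem with Lagrange remainder, for any $h > 0$,
$$f(x+h) = f(x) + h f'(x) + \frac{h^2}{2} f''(\xi), \qquad \xi \in (x, x+h),$$
so with $M = \sup |f''|$,
$$|f'(x)| \le \frac{|f(x+h)| + |f(x)|}{h} + \frac{Mh}{2}.$$
Letting $x \to \infty$ gives $\limsup_{x \to \infty} |f'(x)| \le Mh/2$ for every $h > 0$, and letting $h \to 0$ yields $f' \to 0$.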

In addition to the points covered in mixedmath's answer, we can add the hypothesis that $f'$ is monotone. Monotonicity implies that $f'(x)$ tends either to $\infty$, to $-\infty$, or to a finite limit $L$ as $x \to \infty$. By the mean value theorem, $f(x + 1) - f(x) = f'(c)$ for some $c$ with $x < c < x + 1$, and the left-hand side tends to $0$; hence $f'$ cannot tend to $\pm\infty$, and moreover $L = 0$. Hence $f'(x) \to 0$ as $x \to \infty$.

The monotonicity of $f'$ can be guaranteed by assuming that $f''$ is of constant sign for all $x$ after a certain value.
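As a quick symbolic check of this hypothesis (an illustration I am adding, with $f(x) = 1/x$ as the example; assumes SymPy is available):

```python
import sympy as sp

# Illustrative example (my choice, not from the answer): f(x) = 1/x.
x = sp.symbols('x', positive=True)
f = 1 / x

fp = sp.diff(f, x)       # f'(x)  = -1/x**2
fpp = sp.diff(f, x, 2)   # f''(x) =  2/x**3 > 0 for x > 0, so f' is monotone

# f has a finite limit at infinity, and f' indeed tends to 0.
assert sp.limit(f, x, sp.oo) == 0
assert sp.limit(fp, x, sp.oo) == 0
print("f'' =", fpp, "has constant sign, and f' -> 0 as x -> oo")
```

Here $f''(x) = 2/x^3$ is positive for all $x > 0$, so $f'$ is monotone increasing, and the conclusion $f' \to 0$ holds as expected.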