3

If $f:\mathbb{R}\to\mathbb{R}$ is a differentiable function with $xf'(x)\to 0$ as $x\to\infty,$ then it does not follow that $f(x)\to c$ for some $c\in\mathbb{R}$ as $x\to\infty.$ For example, take $f(x) = \log(\log(x)).$ But I cannot figure out the following:

If $f:\mathbb{R}\to\mathbb{R}$ is a differentiable function with $x^2f'(x)\to 0$ as $x\to\infty,$ then is it true that $f(x)\to c\in\mathbb{R}$ as $x\to\infty $?
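(As a sanity check, the example above does not settle this stronger question: for $f(x) = \log(\log(x))$ one has $f'(x) = \frac{1}{x\log x}$, so $$ xf'(x) = \frac{1}{\log x} \to 0 \quad\text{but}\quad x^2 f'(x) = \frac{x}{\log x} \to \infty \, , $$ i.e. it satisfies the first hypothesis but not the second.)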

I'm not certain it is true but cannot think of a counter-example either. I have tried three different methods, but got nowhere:

  1. Integration by parts:

$$\int_{a}^{\infty} x^2 f'(x) dx = \left[ x^2 f(x) \right]_{a}^{\infty} - \int_{a}^{\infty} 2x f(x) dx $$

but I don't see where to go from here. In fact, I'm pretty sure this is the wrong route.

  2. Using the definition of the derivative:

$$ x^2 f'(x)\to 0 \implies \lim_{x\to \infty}\left( x^2 \lim_{h\to 0} \frac{f(x+h)-f(x)}{h} \right) = 0. $$

I'm not sure how we can manipulate this to help us.

  3. Given $\varepsilon>0,\ \exists\ \gamma>0\ $ s.t. $\ \vert x^2 f'(x)\vert < \varepsilon\ \forall x>\gamma,$ which implies $\vert f'(x) \vert <\frac{\varepsilon}{\gamma^2}\ \forall x> \gamma.$

But now what?
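(One observation on attempt 3: the bound need not be weakened to the constant $\frac{\varepsilon}{\gamma^2}$; keeping it pointwise gives $\vert f'(x)\vert < \frac{\varepsilon}{x^2}$ for all $x>\gamma$, and it is this decay of $f'$ that both answers below exploit.)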

Adam Rubinson
  • 20,052

2 Answers

4

EDIT: As pointed out by Martin R, the old answer applies only when $f'$ is continuous. The MSE question they linked to is sufficient to address the issue, but I wanted to provide a self-contained argument (which, amusingly, resembles their solution but is, in my opinion, somewhat simpler).

Let $R>1$ be as in the old answer. Then, on $(R,\infty)$, $a(x)=1/x+f(x)$ and $b(x)=1/x-f(x)$ are differentiable with nonpositive derivative, so they are nonincreasing. Thus they have limits (at $\infty$) $\alpha,\beta \in [-\infty,\infty)$. But $(a+b)(x)=2/x \rightarrow 0$, so $\alpha,\beta$ are finite and have zero sum. Since $f=\frac{a-b}{2}$, it follows that $f(x) \rightarrow \frac{\alpha-\beta}{2}=\alpha$.
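To spell out the derivative estimate (using $x^2|f'(x)| \leq 1$ for $x > R$ from the old answer): $$ a'(x) = f'(x) - \frac{1}{x^2} \le |f'(x)| - \frac{1}{x^2} \le 0, \qquad b'(x) = -f'(x) - \frac{1}{x^2} \le |f'(x)| - \frac{1}{x^2} \le 0 \, . $$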


Old answer:

Let $R>1$ be such that $x^2|f'(x)| \leq 1$ for $x > R$. Then $\int_R^{\infty}{|f'(x)|\,dx} \leq \int_R^{\infty}{x^{-2}\,dx} <\infty$. Therefore, the integral $F(x)=\int_R^x{f'(t)\,dt}$ converges as $x \rightarrow \infty$. But since $F(x)=f(x)-f(R)$, $f$ indeed has a limit at $\infty$.
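Under the same assumption on $f'$ (so that the fundamental theorem of calculus applies), the estimate is in fact quantitative: for $R \le a < b$, $$ |f(b)-f(a)| = \left|\int_a^b f'(t)\,dt\right| \le \int_a^b \frac{dt}{t^2} = \frac1a - \frac1b \le \frac1a \, , $$ which tends to $0$ as $a \to \infty$; this is the Cauchy criterion for $\lim_{x\to\infty} f(x)$ made explicit.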

Martin R
  • 113,040
Aphelli
  • 34,439
3

Essentially the same idea as in Mindlack's answer, but avoiding integration and the fundamental theorem of calculus.

Given $\epsilon > 0$ there is an $x_0 > 1$ such that $x^2|f'(x)| < \epsilon$ for all $x \ge x_0$. For $x > x_0$ we have $$ \left( f(x) + \frac\epsilon x\right)' = f'(x) - \frac{\epsilon}{x^2} < 0 \, , $$ which implies that for $x_0 \le a < b$ $$ f(a) + \frac\epsilon a > f(b) + \frac\epsilon b \implies f(b) - f(a) < \frac\epsilon a - \frac\epsilon b \le \epsilon \, . $$ In the same way one proves $f(b) - f(a) > -\epsilon$, so that $$ x_0 \le a < b \implies |f(a) - f(b)| < \epsilon \, . $$
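Spelling out the "in the same way" step: for $x > x_0$, $$ \left( f(x) - \frac\epsilon x\right)' = f'(x) + \frac{\epsilon}{x^2} > 0 \, , $$ so $f(x) - \frac\epsilon x$ is increasing on $[x_0, \infty)$, and for $x_0 \le a < b$ $$ f(b) - \frac\epsilon b > f(a) - \frac\epsilon a \implies f(b) - f(a) > \frac\epsilon b - \frac\epsilon a > -\frac\epsilon a > -\epsilon \, , $$ using $a \ge x_0 > 1$ in the last step.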

So $(f(x_n))$ is a Cauchy sequence (and therefore convergent) for every sequence $(x_n)$ tending to $\infty$, and that implies the existence of the limit $\lim_{x \to \infty} f(x)$.

Martin R
  • 113,040