
I have a function $f:\mathbb{R}\rightarrow\mathbb{R}$ with $f''(x)>0\ \forall x\in\mathbb{R}$ and $\lim_{x\to+\infty}f(x)=0$. I want to prove that $f$ is strictly decreasing. I tried proving this by reductio ad absurdum, but I got stuck. Any ideas?

Leos Kotrop
  • 1,195

1 Answer


First we show that $f'(x)\leq 0$ for all $x\in \mathbb R$.

Fix $x\in \mathbb R$. By Taylor's formula with integral remainder,

$$f(x+h) = f(x)+f'(x)h + h^2\int_0^1(1-t)f''(x+th)\,dt \geq f(x)+f'(x)h,$$ because $(1-t)\geq 0$ and $f''(x+th)> 0$ for $t\in[0,1]$, so the remainder term is nonnegative.
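For a concrete instance, take $f(x)=e^{-x}$, which satisfies both hypotheses; the inequality specializes to

$$e^{-(x+h)} \geq e^{-x} - e^{-x}h, \quad\text{i.e.}\quad e^{-h} \geq 1-h,$$

which is the familiar tangent-line bound for the convex exponential.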

Suppose, for contradiction, that $f'(x)>0$. Then

$$f(x+h) \geq f(x)+f'(x)h,$$

and therefore $\lim_{h\to \infty} f(x+h) = \infty$, which contradicts $\lim_{x\to+\infty} f(x) = 0$.

Therefore $f'(x)\leq 0$ for all $x\in\mathbb R$. Moreover, since $f''>0$, $f'$ is strictly increasing; if $f'(x_0)=0$ for some $x_0$, then $f'(x)>0$ for every $x>x_0$, contradicting what we just proved. Hence $f'(x)<0$ for all $x$, and $f$ is strictly decreasing.
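The strict monotonicity of $f'$ invoked here is just the mean value theorem applied to $f'$: for $a<b$ there is some $c\in(a,b)$ with

$$f'(b)-f'(a)=f''(c)\,(b-a)>0.$$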

Hugo
  • 3,775