
Suppose $f:(a,b)\rightarrow \mathbb{R}$ is continuous, and that $$\lim_{h \rightarrow 0} \frac{f(x+h)-f(x-h)}{2h}$$ exists for all $x\in(a,b)$ and is strictly greater than zero.

How can I show that $f$ must be strictly increasing? I am comfortable with the usual MVT proof that a positive derivative implies a function is increasing; however, in this case I cannot assume that the function is differentiable, so I can't use the MVT in its standard form, at least.
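
For instance, $f(x)=|x|$ is not differentiable at $0$, yet the symmetric limit above does exist there: $$\lim_{h \rightarrow 0} \frac{|0+h|-|0-h|}{2h}=\lim_{h \rightarrow 0} \frac{|h|-|h|}{2h}=0,$$ so the hypothesis really is weaker than differentiability (in this example the limit is $0$ rather than positive, but it shows the standard MVT argument is not directly available).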

I've had two ideas:

  1. Go right back to foundations and show that the MVT still holds in this case - by showing that Rolle's theorem still holds (does it?).

  2. Use the definition of a limit directly with an $\epsilon - \delta$ argument, showing that $\frac{f(x+h)-f(x-h)}{2h}>0$ for all $0<|h|<\delta$, so that somehow the function must be increasing.

Any thoughts?

1 Answer


You can first show that $f$ is "locally increasing" in the following sense: for every $x\in(a,b)$, the assumption implies that $$ f(x+h)-f(x-h)>0 $$ for all sufficiently small $h>0$.
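
To spell that step out (a sketch, writing $L(x)>0$ for the value of the limit at $x$): taking $\epsilon=L(x)/2$ in the definition of the limit gives a $\delta>0$ with $$\left|\frac{f(x+h)-f(x-h)}{2h}-L(x)\right|<\frac{L(x)}{2} \quad\text{for } 0<|h|<\delta,$$ so the difference quotient exceeds $L(x)/2>0$ there, and multiplying by $2h>0$ gives $f(x+h)-f(x-h)>hL(x)>0$ for $0<h<\delta$.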

Now consider any compact interval $[c,d]\subset (a,b)$. By compactness, $[c,d]$ can be covered by finitely many open intervals on which $f$ is increasing, and hence $f$ is increasing on all of $[c,d]$. Since $[c,d]$ is arbitrary, $f$ must be strictly increasing on $(a,b)$.
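
To make the covering step explicit (a sketch, granting that every point of $[c,d]$ has an open interval on which $f$ is increasing): compactness yields a finite subcover $I_1,\dots,I_n$, and for any $c\le s<t\le d$ one can choose points $s=t_0<t_1<\dots<t_k=t$ spaced finely enough that each consecutive pair $t_{j-1},t_j$ lies in a single interval of the cover (e.g. with spacing below a Lebesgue number of the cover), so that $$f(s)=f(t_0)<f(t_1)<\cdots<f(t_k)=f(t).$$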