
I am trying to understand why $f'(x) \ge 0 \iff f \ \text{is monotonically increasing}$ under the usual set of assumptions. To do this I am trying to prove the two implications. It is relatively easy to see why $\impliedby$ holds, since an increasing $f$ implies $${f(x_0+h)-f(x_0) \over h} \ge 0$$ for every $h \neq 0$, because the numerator and $h$ always have the same sign.
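Spelling out the last step (which the argument implicitly uses): weak inequalities are preserved under limits, so
$$f'(x_0) \;=\; \lim_{h \to 0} \frac{f(x_0+h)-f(x_0)}{h} \;\ge\; 0,$$
since every difference quotient in the limit is non-negative.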

The second implication $\implies$ is trickier. I am able to show it by rewriting the mean value theorem as

$$f(b)=f'(\xi)(b-a)+f(a)$$ and concluding that, for $a,b$ with $a<b$, we get $f(b)\ge f(a)$, since $f'(\xi)\ge 0$ and $b-a>0$.

Is there an easier way to see $\implies$ without using the mean value theorem?

Zelazny

1 Answer


Yes, by the fundamental theorem of calculus. Let $f : \mathbb{R} \rightarrow \mathbb{R}$ be differentiable with $f'(x) \geq 0$ for all $x \in \mathbb{R}$. Take $x \in \mathbb{R}$ and $h \in \mathbb{R}_{\geq 0}$. Then \begin{align*} f(x+h) - f(x) = \int_{x}^{x+h} f'(t)\,dt \geq 0. \end{align*} Rearranging, $f(x + h) \geq f(x)$.

Edit: this assumes that $f'$ is integrable on each interval $[x, x+h]$, which is not automatic for a derivative (assuming $f'$ continuous is enough).
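For completeness, here is the computation written out under the continuity assumption, which guarantees Riemann integrability and makes each step explicit:

\begin{align*}
f(x+h) - f(x)
  &= \int_{x}^{x+h} f'(t)\,dt && \text{(FTC, valid since $f'$ is continuous)} \\
  &\ge \int_{x}^{x+h} 0 \,dt = 0 && \text{(monotonicity of the integral: $f' \ge 0$ and $h \ge 0$)}
\end{align*}

The MVT proof in the question avoids this integrability hypothesis, which is why it is the standard argument.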