We have the implication "$f'(x)>0$ on an interval $I \implies f(x)$ is strictly increasing on $I$".
However, the converse is not true.
A frequently cited example is the function $f(x)=x^3$, which is strictly increasing but $f'(0)=0$.
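Indeed, $x^3$ is strictly increasing even though its derivative vanishes at the origin: for any $a<b$,
$$b^3-a^3=(b-a)\left(a^2+ab+b^2\right)=(b-a)\left[\left(a+\tfrac{b}{2}\right)^2+\tfrac{3}{4}b^2\right]>0,$$
since the first factor is positive and the bracketed term could vanish only when $a=b=0$, which $a<b$ rules out.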
Here comes my question:
What is a necessary and sufficient condition on $f'(x)$ for $f(x)$ to be strictly increasing?
My guess is:
$f'(x)>0$ almost everywhere.
This is a sufficient condition: with Lebesgue integration, the values of $f'$ on a null set do not affect the integral, so we can effectively "drop the word 'almost'".
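To sketch why, under the extra assumption that $f'$ is Lebesgue integrable on $I$ (so that the fundamental theorem of calculus applies to the everywhere-differentiable $f$): for any $c<d$ in $I$,
$$f(d)-f(c)=\int_c^d f'(t)\,dt>0,$$
because $f'>0$ almost everywhere on $[c,d]$, a set of positive measure; hence $f$ is strictly increasing.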
But is this condition necessary? Is there a weaker condition? A full solution or a small hint would both be appreciated. Thank you in advance.