My teacher proposed this exercise:
Let $f$ be a function that is differentiable on $(a, b)$ except at $x_1, x_2 \in (a, b)$, and suppose that $f$ is continuous on $[a, b]$. Prove the following statement: if $f'(x) > 0$ at every point of $(a, b)$ where $f$ is differentiable, then $f$ is increasing on $[a, b]$. Is the converse true?
My attempt:
I applied the MVT on the intervals $[a, x_1)$, $(x_1, x_2)$, $(x_2, b]$: taking two arbitrary points in each interval, checking the hypotheses of the MVT, and concluding that $f$ is increasing on each of them. But how do I conclude that $f$ is increasing on all of $[a, b]$? I think I have to prove that $f(x) < f(y)$ for every $x \in [a, x_1)$ and $y \in (x_1, x_2)$, and similarly across $x_2$, but I am not sure how to do it.
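My best guess (I am not sure this is correct) is to use the continuity of $f$ at $x_1$: for $x \in [a, x_1)$ and $y \in (x_1, x_2)$, choose $z$ with $x < z < x_1$; since $f$ is increasing on $[a, x_1)$ and on $(x_1, x_2)$, and continuous at $x_1$,
$$f(x) < f(z) \le \lim_{t \to x_1^-} f(t) = f(x_1) = \lim_{t \to x_1^+} f(t) \le f(y),$$
so $f(x) < f(y)$, and the same argument should work at $x_2$. Is this the right idea?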
As the exercise is stated, I believe the converse is not true. (I add the definition of increasing function given by the teacher: $f$ is increasing on $I$ if $x \lt y \implies f(x) \lt f(y)$.) So, taking $f(x) = x^3$ on an interval containing $0$, we have an increasing function with $f'(0) = 0$, so $f' > 0$ can fail at a point where $f$ is differentiable. (I first thought of a constant function $f(x) = k$, but that is not increasing under the strict definition above.) Am I reasoning in the right way?
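A quick check that $f(x) = x^3$ really is increasing under the strict definition (assuming I have not made an algebra mistake): for $x < y$,
$$y^3 - x^3 = (y - x)\left(x^2 + xy + y^2\right) = (y - x)\left(\left(x + \tfrac{y}{2}\right)^2 + \tfrac{3y^2}{4}\right) > 0,$$
since $y - x > 0$ and the second factor vanishes only when $x = y = 0$, which cannot happen when $x < y$. Yet $f'(x) = 3x^2$ gives $f'(0) = 0$.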
I would really appreciate your help, and it would be great if anyone could provide another counterexample to the converse or, of course, correct me.