I've been grading the following problem for an undergraduate numerical analysis course:
Suppose $f \in C([a,b])$ and $f'(x)$ exists on $(a,b)$. Show that if $f'(x) \neq 0$ for all $x \in (a,b)$, then there can exist at most one number $p$ in $[a,b]$ with $f(p) = 0$.
Proof by contradiction plus the Mean Value Theorem does the trick just fine.
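For reference, here is a sketch of that intended argument (my own wording; the details are standard): suppose $p, q \in [a,b]$ with $p < q$ and $f(p) = f(q) = 0$. By the MVT there exists $c \in (p,q)$ with $$ f'(c) = \frac{f(q) - f(p)}{q - p} = 0, $$ contradicting the hypothesis that $f'(x) \neq 0$ on $(a,b)$.

A few students, however, tried proving the statement directly, as follows: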
By the MVT, there exists $c \in (a,b)$ such that $$ f'(c) = \frac{f(b)-f(a)}{b-a}.$$ $f'(c) \neq 0$, hence $f(b) \neq f(a)$. Then either $f(a) < f(b)$ or $f(b) < f(a)$. But then $f$ is either strictly increasing or strictly decreasing; hence, there exists at most one zero in $(a,b)$.
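To pin down where the argument strains (this illustration is mine, not the students'): $f(a) < f(b)$ alone does not force $f$ to be strictly increasing. For example, on $[-2,2]$, $$ f(x) = x^3 - x, \qquad f(-2) = -6 < 6 = f(2), $$ yet $f$ is not monotonic and has three zeros. This $f$ violates the hypothesis, of course, since $f'(x) = 3x^2 - 1$ vanishes at $x = \pm 1/\sqrt{3}$; but the students' argument only ever uses $f'(c) \neq 0$ at the single point $c$ produced by the MVT, so something more is needed to rule such behavior out.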
Pedantic grading aside, there is a genuine leap of logic here. Given the many examples of differentiable functions with discontinuous derivatives, I wonder whether there is a counterexample to the monotonicity step in the students' argument. Namely: is there a non-monotonic, everywhere differentiable function whose derivative is nowhere zero?
Note: I need the derivative to exist everywhere, so the nowhere-differentiable Weierstrass function (as in this post and this post) doesn't work.
There may also be a simpler way to interpret the students' proofs; they are generally poorly written, and the version above is my best guess at what they meant to say.