I recently worked through a proof of Darboux's Theorem, which states:
If $f$ is differentiable on $[a,b]$ and there is a $c$ such that $f'(a) \lt c \lt f'(b)$, then there is an $x \in (a,b)$ such that $f'(x)=c$.
Given this theorem (which also holds for the $f'(a) \gt c \gt f'(b)$ variation), I wanted to explore the implications for what the graph of $f'$ can look like under the following condition:
Let $f$ be differentiable in some interval containing $a$, where $f'$ is discontinuous at $a \quad (\dagger)$.
It seems to me that $(\dagger)$ can only be satisfied by graphs that look something like:

[graph 1]

or

[graph 2]
In the two graphs above, one should interpret the zigzagging lines approaching $a$ as representing infinite oscillations. Further, the zigzagging lines do NOT converge to $f'(a)$ (i.e. there is some $\varepsilon \gt 0$ such that any neighborhood of $a$ contains an $x$ with $|f'(x)-f'(a)| \geq \varepsilon$).
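As a concrete illustration of this behavior, a standard textbook example (my own addition, not taken from the graphs above) is $f(x) = x^2\sin(1/x)$ with $f(0)=0$: $f$ is differentiable everywhere, $f'(0)=0$, yet $f'(x) = 2x\sin(1/x) - \cos(1/x)$ oscillates between roughly $-1$ and $1$ arbitrarily close to $0$. A quick numeric sketch:

```python
import math

def fprime(x):
    # Derivative of f(x) = x^2 * sin(1/x) with f(0) = 0:
    # f'(x) = 2x*sin(1/x) - cos(1/x) for x != 0, and f'(0) = 0
    # (the latter by the limit definition of the derivative).
    return 2 * x * math.sin(1 / x) - math.cos(1 / x) if x != 0 else 0.0

# At x_k = 1/(k*pi) the sine term vanishes and cos(1/x_k) = ±1, so
# |f'(x_k) - f'(0)| stays near 1 no matter how close x_k gets to 0:
# the oscillation never converges to f'(0).
samples = [fprime(1 / (k * math.pi)) for k in range(1, 10)]
assert all(abs(v - fprime(0)) >= 0.5 for v in samples)
```

This realizes exactly the $\varepsilon$-condition above, with any $\varepsilon \leq 1$ witnessing the discontinuity of $f'$ at $0$.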
Importantly, the following graph would be disallowed by Darboux's Theorem:

[graph 3]
The distinction between (graph 1 / graph 2) and (graph 3) is that the first two graphs have the following property:
For any $\delta \gt 0$ (so long as $[a, a+\delta]$ remains inside the interval of differentiability), and for any $y$ between $f'(a)$ and $f'(a+\delta)$, there is an $x \in [a,a+\delta]$ such that $f'(x)=y$. Similarly for $a-\delta$.
(i.e. the first two graphs obey Darboux's Theorem). As you can see, graph 3 instead has arbitrarily small neighborhoods of $a$ in which some values $y$ between $f'(a)$ and $f'(a+\delta)$ are skipped, meaning there is no $x \in [a,a+\delta]$ with $f'(x)=y$.
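For the standard example $f(x)=x^2\sin(1/x)$, $f(0)=0$ (again my own assumed stand-in for graphs 1/2), one can check this intermediate-value property numerically: pick a $y$ between $f'(0)$ and $f'(\delta)$, locate a sign change of $f'(x)-y$ on a fine grid in $(0,\delta]$, and bisect. Since $f'$ is continuous away from $0$, any bracketed sign change genuinely contains a root.

```python
import math

def fprime(x):
    # f(x) = x^2 * sin(1/x) with f(0) = 0 (classic discontinuous-derivative example)
    return 2 * x * math.sin(1 / x) - math.cos(1 / x) if x != 0 else 0.0

def darboux_solve(y, delta, n=20000, iters=80):
    """Find x in (0, delta] with f'(x) = y: scan a grid for a sign change
    of f' - y, then bisect (valid since f' is continuous on (0, delta])."""
    xs = [delta * (i + 1) / n for i in range(n)]
    for a, b in zip(xs, xs[1:]):
        if (fprime(a) - y) * (fprime(b) - y) <= 0:
            lo, hi = a, b
            break
    else:
        raise ValueError("no sign change found on grid")
    for _ in range(iters):
        mid = (lo + hi) / 2
        if (fprime(lo) - y) * (fprime(mid) - y) <= 0:
            hi = mid  # root lies in [lo, mid]
        else:
            lo = mid  # root lies in [mid, hi]
    return (lo + hi) / 2

delta = 0.1
y = (fprime(0) + fprime(delta)) / 2  # midpoint of f'(0) and f'(delta)
x = darboux_solve(y, delta)
assert 0 < x <= delta and abs(fprime(x) - y) < 1e-6
```

The same search fails for a derivative like graph 3 would depict, which is exactly what Darboux's Theorem rules out for any genuine derivative.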
Is this the proper understanding?