Consider the following problem.
Assume that $f(x)$ is differentiable for every $x \in \mathbb{R}$. We want to study $\lim\limits_{x \to x_0} f'(x)$, where $x_0 \in \mathbb{R}$.
Notice that \begin{align*} f'(x_0)=\lim_{x \to x_0}\frac{f(x)-f(x_0)}{x-x_0}=\lim_{x \to x_0}\frac{f'(\xi)(x-x_0)}{x-x_0}=\lim_{x \to x_0}f'(\xi), \end{align*} where we applied Lagrange's Mean Value Theorem, and $\xi$ lies strictly between $x_0$ and $x$. Since $\xi$ is squeezed between $x_0$ and $x$, letting $x \to x_0$ forces $\xi \to x_0$. Thus $$f'(x_0)=\lim_{x \to x_0}f'(\xi)=\lim_{\xi \to x_0}f'(\xi).$$
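To make the Mean Value Theorem step concrete, here is a small numerical sketch (my own illustrative example, not part of the original argument): for $f(x) = e^x$ the point $\xi$ guaranteed by the theorem can be solved for explicitly, $\xi = \ln\!\big((e^x - e^{x_0})/(x - x_0)\big)$, and one can watch it get squeezed toward $x_0$ as $x \to x_0$.

```python
import math

def mvt_xi(x0, x):
    # For f(x) = e^x, the MVT gives xi in (x0, x) with
    # f(x) - f(x0) = f'(xi) (x - x0).  Since f'(t) = e^t,
    # solving for xi yields xi = ln((e^x - e^x0) / (x - x0)).
    return math.log((math.exp(x) - math.exp(x0)) / (x - x0))

x0 = 1.0
for x in (1.5, 1.1, 1.01, 1.001):
    xi = mvt_xi(x0, x)
    # xi always lies strictly between x0 and x, so xi -> x0 as x -> x0
    print(f"x = {x:<6}  xi = {xi:.6f}")
```

Running this shows $\xi$ approaching $x_0 = 1$ as the interval shrinks, exactly the squeezing used in the derivation above.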
What does this say? It seems to show that $f'(x)$ is continuous at every point $x=x_0$, which is an absurd conclusion, because we know that $f'(x)$ may well have a discontinuity point (of the second kind), as in the classic example $f(x) = x^2\sin(1/x)$ for $x \neq 0$, $f(0)=0$. So where does the mistake occur in the reasoning above?
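To confirm the conclusion really is absurd, here is a quick numerical check (my own sketch) of the standard counterexample $f(x) = x^2\sin(1/x)$, $f(0)=0$: the derivative $f'(0)$ exists and equals $0$, yet $f'(x) = 2x\sin(1/x) - \cos(1/x)$ oscillates between values near $-1$ and $1$ arbitrarily close to $0$, so $\lim_{x \to 0} f'(x)$ does not exist.

```python
import math

def f(x):
    # f(x) = x^2 sin(1/x) for x != 0, f(0) = 0.
    return x * x * math.sin(1.0 / x) if x != 0 else 0.0

def f_prime(x):
    # For x != 0: f'(x) = 2x sin(1/x) - cos(1/x).
    # At x = 0 the derivative exists directly from the definition:
    # (f(h) - f(0)) / h = h sin(1/h) -> 0, so f'(0) = 0.
    if x == 0:
        return 0.0
    return 2.0 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# Difference quotient at 0 is tiny, consistent with f'(0) = 0 ...
h = 1e-6
print(abs((f(h) - f(0.0)) / h))  # bounded by |h| = 1e-6

# ... yet sampling f' at x_k = 1/(k pi) near 0 gives values
# alternating near -1 and 1, so lim_{x->0} f'(x) does not exist.
samples = [f_prime(1.0 / (k * math.pi)) for k in range(100, 110)]
print(min(samples), max(samples))
```

The oscillation of $f'$ near $0$ (a discontinuity of the second kind) is exactly what the flawed argument above would forbid.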