I would like to know if the following statement is true.
Let $f:(a,b)\subset\mathbb{R} \rightarrow \mathbb{R}$ be such that $f'(x)$ exists for all $x \in (a,b)$ and $f''(x_0)$ exists for some $a<x_0<b$. Then
$$ \lim_{h \to 0} \frac{f(x_0+h)-2f(x_0)+f(x_0-h)}{h^2} = f''(x_0).$$
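As a quick sanity check of the formula (my own illustration, not part of the claim), take $f(x)=x^3$ and $x_0=1$, so that $f''(1)=6$:
$$\frac{(1+h)^3 - 2\cdot 1^3 + (1-h)^3}{h^2} = \frac{6h^2}{h^2} = 6 = f''(1).$$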
Note: I'm not assuming that $f'$ is continuous in any neighborhood of $x_0$. However, I think that the existence of $f'$ in a neighborhood of $x_0$ is necessary for $f''(x_0)$ to exist.
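To spell out my reasoning on that last point: by definition $f''(x_0)$ is the derivative of $f'$ at $x_0$,
$$f''(x_0) = \lim_{h \to 0} \frac{f'(x_0+h)-f'(x_0)}{h},$$
and this limit only makes sense if $f'(x_0+h)$ is defined for all sufficiently small $h$, i.e. if $f'$ exists in some neighborhood of $x_0$.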