Let $f: \mathbb{R}^n \to \mathbb{R}$ . If $f$ is at least $C^2$, Taylor's Theorem tells us that
$$f(x+p) = f(x) + p^T \nabla f(x) + \dfrac 12p^TH_f(x+tp)p$$
where $t \in (0, 1)$. Now, we may approximate $f$ by
$$m(p) = f(x) + p^T \nabla f(x) + \dfrac 12p^TH_f(x)p.$$
My book says that if $H_f$ is sufficiently smooth, then the difference between $m$ and $f$ is $O(||p||^3$) as $p \to 0$. However, I am trouble deriving this. I tried the $1$-variable case, $f: \mathbb{R} \to \mathbb{R}$. In this case,
$$|f(x+p) - m(p)| = |f''(x+tp) - f''(x)|$$
Now if we assume that $f''$ is diferentiable on the interval $\mathcal{I}[x, tp]$, by the Mean Value Theorem, there exists a $c$ in that interval such that $$|f''(x+tp) - f''(x)| = |tp||f'''(c)|.$$
If we further assume $f'''$ is continuous, we get that $|f''(x+tp) - f''(x)|=O(|p|)$, which is nowhere close to $O(|p|^3)$.
Any help is greatly appreciated.