
Let $g:\mathbb{R} \to \mathbb{R}$ be of class $C^2$. Show that

$$\lim_{h \rightarrow 0} \frac{g(a+h)-2g(a) +g(a-h)}{h^2} = g''(a)$$

How should one approach such questions? There are so many things that come to mind such as mean value theorem, definition of differentiability and so on but they do not lead anywhere.

Zhanxiong

1 Answer


I see two ways to solve the problem.

  • First way

$$ \frac{g(a+h)-2g(a) +g(a-h)}{h^2}=\frac 1h \Big(\frac{g(a+h)-g(a)} h-\frac{g(a)-g(a-h)} h\Big)$$ Heuristically, the bracket behaves like $g'(a+h)-g'(a)$, which already suggests the answer. To make this rigorous, note that the numerator and denominator both vanish at $h=0$, so by L'Hôpital's rule $$\lim_{h \to 0}\frac{g(a+h)-2g(a)+g(a-h)}{h^2}=\lim_{h \to 0}\frac{g'(a+h)-g'(a-h)}{2h}.$$ Writing the last quotient as $$\frac 12\left(\frac{g'(a+h)-g'(a)}{h}+\frac{g'(a)-g'(a-h)}{h}\right),$$ both terms tend to $g''(a)$ because $g'$ is differentiable at $a$, which proves the claim.
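As a numerical sanity check of this limit, here is a minimal Python sketch of the symmetric second difference quotient (function names are my own). For very small $h$, floating-point cancellation in the numerator eventually dominates, so $h$ should not be taken too small:

```python
import math

def second_difference(g, a, h):
    """Symmetric second difference quotient (g(a+h) - 2 g(a) + g(a-h)) / h^2."""
    return (g(a + h) - 2 * g(a) + g(a - h)) / h**2

# Example: g = exp at a = 0, where g''(0) = 1 exactly.
a = 0.0
for h in (1e-1, 1e-2, 1e-3):
    approx = second_difference(math.exp, a, h)
    print(f"h = {h:g}: quotient = {approx:.10f}, error = {abs(approx - 1.0):.2e}")
```

The printed errors shrink as $h$ decreases, consistent with the quotient converging to $g''(a)$.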

  • Second way

Since $g$ is only assumed to be of class $C^2$, use Taylor's theorem with the Peano remainder: $$g(a+h)=g(a)+h g'(a)+\frac{1}{2} h^2 g''(a)+o\left(h^2\right)$$ $$g(a-h)=g(a)-h g'(a)+\frac{1}{2} h^2 g''(a)+o\left(h^2\right)$$ Adding these and subtracting $2g(a)$, the first-order terms cancel and $$g(a+h)-2g(a)+g(a-h)=h^2 g''(a)+o\left(h^2\right),$$ so dividing by $h^2$ and letting $h \to 0$ gives the limit. (If $g$ happens to be $C^4$, the same computation with the $O\left(h^4\right)$ remainder shows the odd-order terms cancel as well, so the error of the quotient is in fact $O\left(h^2\right)$.)
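The cancellation of the odd-order terms can also be observed numerically: for a smooth $g$ (here $\cos$, so the stronger form of the expansion applies), the error of the second difference quotient shrinks like $h^2$, meaning halving $h$ should roughly quarter the error. A minimal sketch, with names of my own choosing:

```python
import math

def second_difference(g, a, h):
    """Symmetric second difference quotient (g(a+h) - 2 g(a) + g(a-h)) / h^2."""
    return (g(a + h) - 2 * g(a) + g(a - h)) / h**2

# g = cos at a = 0.5, so g''(a) = -cos(0.5) exactly.
a = 0.5
exact = -math.cos(a)

def err(h):
    return abs(second_difference(math.cos, a, h) - exact)

# Taylor predicts error ~ h^2 |g''''(a)| / 12, so err(h) / err(h/2) should be near 4.
h = 1e-2
ratio = err(h) / err(h / 2)
print(f"err({h:g}) / err({h/2:g}) = {ratio:.3f}")  # close to 4
```

This second-order accuracy is why the symmetric (central) second difference is the standard discretization of $g''$ in numerical analysis.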