
Let $g:\mathbb{R}\rightarrow\mathbb{R}$ be a function of class $C^2$. Show that $$\lim_{h\rightarrow 0}\dfrac{g(a+h)-2g(a)+g(a-h)}{h^2}=g''(a).$$

It seems like an application of the Mean Value Theorem might help: since $g$ is differentiable, for any $h>0$ there exists $h_1\in(0,h)$ such that $$g(a+h)-g(a)=h\cdot g'(a+h_1).$$ And then, since $g'$ is differentiable, there exists $h_2\in(0,h_1)$ such that $$g'(a+h_1)-g'(a)=h_1\cdot g''(a+h_2).$$ Combining the two, we have $$g(a+h)-g(a)=hg'(a)+hh_1g''(a+h_2),$$ which doesn't quite get to the desired expression.
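Spelling out the same argument on the other side (the intermediate points $h_3\in(0,h)$ and $h_4\in(0,h_3)$ are new labels, introduced here for illustration) gives $$g(a-h)-g(a)=-hg'(a)+hh_3\,g''(a-h_4),$$ so that summing the two expansions and dividing by $h^2$ yields $$\frac{g(a+h)-2g(a)+g(a-h)}{h^2}=\frac{h_1}{h}\,g''(a+h_2)+\frac{h_3}{h}\,g''(a-h_4),$$ where the ratios $h_1/h$ and $h_3/h$ are only known to lie in $(0,1)$, so the limit cannot be read off directly.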

PJ Miller

1 Answer


Your idea is not off, but it doesn't quite work as written. I would instead use Taylor's Theorem.

Namely, we can write $$g(a+h)-g(a)=hg'(a)+\frac 1 2g''(a)h^2+o(h^2)$$ $$g(a-h)-g(a)=-hg'(a)+\frac 1 2g''(a)h^2+o(h^2)$$

Summing, the first-order terms cancel and $$g(a+h)-2g(a)+g(a-h)=g''(a)h^2+o(h^2),$$ so dividing by $h^2$ shows that your expression is $$g''(a)+\frac{o(h^2)}{h^2}\to g''(a)$$ as $h\to 0$.
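For what it's worth, a quick numerical check is consistent with this. Here is a minimal sketch (the choices $g=\sin$ and $a=1$ are arbitrary, made only for illustration):

```python
import math

def symmetric_quotient(g, a, h):
    # The difference quotient from the question:
    # (g(a+h) - 2*g(a) + g(a-h)) / h^2
    return (g(a + h) - 2 * g(a) + g(a - h)) / h ** 2

a = 1.0                 # arbitrary evaluation point
exact = -math.sin(a)    # g''(x) = -sin(x) for g = sin
for h in (1e-1, 1e-2, 1e-3, 1e-4):
    q = symmetric_quotient(math.sin, a, h)
    print(f"h = {h:g}:  quotient = {q:.10f}  (error {abs(q - exact):.2e})")
```

For this (smooth) choice of $g$ the error shrinks like $h^2$; in general, $C^2$ regularity only guarantees that it tends to $0$.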

Pedro
  • If we only know that $g$ is of class $C^2$, can we write down the Taylor series, which includes derivatives of all orders? – PJ Miller Aug 28 '13 at 03:24
  • @PJMiller I am not writing a series, but rather expanding up to second order, which is indeed valid by Taylor's Theorem. – Pedro Aug 28 '13 at 03:27
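For reference, the statement being used in the answer is the Peano-remainder form of Taylor's Theorem: if $g$ is twice differentiable at $a$ (in particular, if $g$ is of class $C^2$), then $$g(a+h)=g(a)+g'(a)h+\frac{1}{2}g''(a)h^2+o(h^2)\qquad(h\to 0),$$ which requires no derivatives beyond order two.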