
I've been trying to understand how the second order derivative "formula" works:

$$\lim_{h\to0} \frac{f(x+h) - 2f(x) + f(x-h)}{h^2}$$

So it's the rate of change of the rate of change for an arbitrary continuous function. Intuitively it feels right: it samples the function just after the point ($x+h$) and just before it ($x-h$), and the $h^2$ in the denominator is what you'd expect from dividing by $h$ twice. But I'm having trouble deriving the formula on my own.
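Numerically the centered quotient does behave as expected; here is a small Python sanity check of my own (using $f=\sin$, whose second derivative is $-\sin$):

```python
import math

def centered_second_diff(f, x, h):
    # Centered second-difference quotient: (f(x+h) - 2f(x) + f(x-h)) / h^2
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

x = 1.0
for h in [1e-1, 1e-2, 1e-3]:
    print(h, centered_second_diff(math.sin, x, h))
# values approach -sin(1) ≈ -0.841471 as h shrinks
```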

It's basically a derivative of a derivative, right? Lagrange's notation writes it as $f''$ and Leibniz's as $\frac{d^2y}{dx^2}$, which unpack into:

$$(f')'$$ and $$\frac{d}{dx}\frac{df}{dx}$$

So the first derivative gives the rate of change of a function's value relative to its input. The second derivative gives the rate of change of that rate of change itself, i.e. how quickly the slope changes.

The original one is rather straightforward:

$$\frac{\Delta y}{\Delta x} = \lim_{h\to0} \frac{f(x+h) - f(x)}{x + h - x} = \lim_{h\to0} \frac{f(x+h) - f(x)}{h}$$
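For instance, a small numeric check of this quotient of my own, using $f(x)=x^3$ so that $f'(2)=12$:

```python
def diff_quotient(f, x, h):
    # First-derivative difference quotient: (f(x+h) - f(x)) / h
    return (f(x + h) - f(x)) / h

f = lambda t: t**3
x = 2.0
for h in [1e-2, 1e-4, 1e-6]:
    print(h, diff_quotient(f, x, h))
# values approach f'(2) = 3 * 2**2 = 12 as h shrinks
```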

And it can easily be shown that this yields $f'(x) = nx^{n-1} + \dots$ for simple polynomial functions. So my logic suggests that to get the derivative of a derivative, one only needs to feed the derivative function back into the same definition. I'll drop the $\lim_{h\to0}$ for simplicity:

$$f'(x) = \frac{f(x+h) - f(x)}{h}$$

So, the derivative of the derivative should be:

$$f''(x) = \lim_{h\to0} \frac{f'(x+h) - f'(x)}{h}$$

$$f''(x) = \lim_{h\to0} \frac{ \frac{ f(x+2h) - f(x+h)}{h} - \frac{ f(x+h) - f(x)}{h} }{h}$$

$$f''(x) = \lim_{h\to0} \frac{ \frac{ f(x+2h) - f(x+h) - f(x+h) + f(x)}{h} }{h}$$

$$f''(x) = \lim_{h\to0} \frac{ f(x+2h) - f(x+h) - f(x+h) + f(x) }{h^2}$$

$$f''(x) = \lim_{h\to0} \frac{ f(x+2h) - 2f(x+h) + f(x) }{h^2}$$
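For what it's worth, this last expression does converge numerically to the second derivative for smooth functions; a quick Python check of my own, using $f=\exp$ (so $f''(1)=e$):

```python
import math

def forward_second_diff(f, x, h):
    # The quotient derived above: (f(x+2h) - 2f(x+h) + f(x)) / h^2
    return (f(x + 2 * h) - 2.0 * f(x + h) + f(x)) / h**2

x = 1.0
for h in [1e-1, 1e-2, 1e-3]:
    print(h, forward_second_diff(math.exp, x, h))
# values approach f''(1) = e ≈ 2.71828 as h shrinks
```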

What am I doing wrong? Perhaps it is the mess of it all, but I just can't see it. Please help.

doraemonpaul
  • To avoid confusion, I'd like to point out that the formula (the centered one or the other) is not "true" in the same sense as the formula for the first derivative: it cannot be used as the definition of the second derivative. It can happen that the limit exists but $f''(x)$ doesn't; the formula holds only when $f''(x)$ exists. http://math.stackexchange.com/questions/1298208/using-the-same-limit-for-a-second-derivative/2115448#2115448 – leonbloy Jan 26 '17 at 20:05
  • "I'll drop the $\lim_{h\to0}$ for simplicity:

    $$f'(x) = \frac{f(x+h) - f(x)}{h}$$" This line is wrong! The derivative is not a ratio of differences but the limit of a ratio of differences. Without taking the limit, the equality does not hold in general. – Danny Pak-Keung Chan Jan 23 '19 at 01:08
  • I feel that everyone looking at this question should also look at this question (and its accepted answer), which addresses the formal derivation of the limit here: https://math.stackexchange.com/questions/1298208/using-the-same-limit-for-a-second-derivative – Benjamin Wang Jan 13 '21 at 08:31

3 Answers


The only problem is that you’re looking at the wrong three points: you’re looking at $x+2h,x+h$, and $x$, and the version that you want to prove is using $x+h,x$, and $x-h$. Start with $$f\,''(x)=\lim_{h\to 0}\frac{f\,'(x)-f\,'(x-h)}h\;,$$ and you’ll be fine.

To see that this really is equivalent to looking at $$f\,''(x)=\lim_{h\to 0}\frac{f\,'(x+h)-f\,'(x)}h\;,$$ let $k=-h$; then

$$\begin{align*} f\,''(x)&=\lim_{h\to 0}\frac{f\,'(x)-f\,'(x-h)}h\\ &=\lim_{-k\to0}\frac{f\,'(x)-f\,'(x-(-k))}{-k}\\ &=\lim_{k\to 0}\frac{f\,'(x-(-k))-f\,'(x)}k\\ &=\lim_{k\to 0}\frac{f\,'(x+k)-f\,'(x)}k\;, \end{align*}$$

and renaming the dummy variable back to $h$ completes the demonstration.
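As an informal numeric illustration of this equivalence (an added sketch, not part of the original argument; taking $f(x)=x^3$, so $f'(x)=3x^2$ and $f''(2)=12$):

```python
def fprime(x):
    # f'(x) for f(x) = x**3
    return 3.0 * x**2

x, h = 2.0, 1e-5
backward = (fprime(x) - fprime(x - h)) / h   # quotient used above
forward = (fprime(x + h) - fprime(x)) / h    # familiar forward quotient
print(backward, forward)
# both are near f''(2) = 12
```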

Brian M. Scott
  • Hah, neat! The answer was lost to me in the very thing I omitted during my work - the limit. It can approach from both sides; how incredibly silly of me, and brilliantly expressed here. Thank you! – LearningDroid Oct 10 '12 at 05:41
  • @LearningDroid: You’re very welcome. – Brian M. Scott Oct 10 '12 at 05:44
  • It seems you've used the following: if $a_n\rightarrow0$ and $b_n\rightarrow0$ then $$\lim_{n\rightarrow\infty}\frac{\lim_{n\rightarrow\infty}a_n/b_n}{b_n} = \lim_{n\rightarrow\infty}\frac{a_n}{b_n^2}$$ This needs to be demonstrated, no? – Gregory Grant Mar 19 '15 at 21:09
  • @Gregory: I've not used it. Possibly the OP was using it, though use of the corresponding non-sequential result seems more likely. That part of the argument, however, was not the point of the question, so there was no reason to address it. – Brian M. Scott Mar 19 '15 at 22:05
  • Thanks, but I don't see how to use your hint without using this fact about limits. The OP also uses this fact, although not explicitly: he sneaks it past when he replaces $f'(x)$ with $\frac{f(x+h)-f(x)}{h}$ (no limit). – Gregory Grant Mar 20 '15 at 01:25
  • @Gregory: My answer was not a hint; it completely addressed the OP's question. Yes, the OP is using a fact about limits closely related to the one in your comment, possibly consciously possibly not. It doesn't matter: that wasn't the point of the question. We don't even know what level of rigor the OP was looking for. – Brian M. Scott Mar 20 '15 at 01:39
  • It seems to me a complete answer using your approach would have to address that fact about limits, no? The OP asked what he was doing wrong. Since he's trying to do a proof, not addressing that point is wrong. A complete answer needs to explain why the inner limit can simply be ignored. – Gregory Grant Mar 20 '15 at 03:34
  • @Gregory: No. The OP wanted to know why his approach wasn’t yielding the desired result, and I told him. As I said, we have no way to know how much rigor he was after; his stated goal was to understand the formula works, not to come up with a rigorous proof. And even if he wanted the latter, there’s no guarantee that he gave us every detail of his argument: he may have given just enough to show how it was failing him. – Brian M. Scott Mar 20 '15 at 07:41
  • I'm not knocking your answer; I was simply searching for an answer to this problem, found this page, and realized a full answer is here, sans that one detail. – Gregory Grant Mar 20 '15 at 11:43

Using the Taylor series expansions of $f(x+h)$ and $f(x-h)$,

$$ f(x+h) = f(x) + f'(x)h+f''(x)\frac{h^2}{2} + f'''(x)\frac{h^3}{3!}+\cdots $$

$$ f(x-h) = f(x) - f'(x)h+f''(x)\frac{h^2}{2} - f'''(x)\frac{h^3}{3!}+\cdots $$

Adding the above equations (the odd-order terms cancel), rearranging, and dividing by $h^2$ gives

$$ \frac{f(x+h) - 2f(x) + f(x-h)}{h^2} = f''(x) + 2\frac{f''''(x)}{4!}h^2+\cdots $$

Taking the limit as $h$ goes to zero then gives the desired result:

$$ \Rightarrow f''(x) = \lim_{h\to0} \frac{f(x+h) - 2f(x) + f(x-h)}{h^2} \,.$$
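The expansion also predicts that the error of the centered quotient shrinks like $h^2$: halving $h$ should cut the error by roughly a factor of four. A quick numeric sketch (added for illustration, with $f=\sin$):

```python
import math

def centered(f, x, h):
    # Centered second-difference quotient
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

x = 1.0
exact = -math.sin(x)  # f''(x) for f = sin
e1 = abs(centered(math.sin, x, 1e-2) - exact)
e2 = abs(centered(math.sin, x, 5e-3) - exact)
print(e1 / e2)  # close to 4, matching the O(h^2) leading error term
```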

  • Thank you very much, this adds much perspective, but Mr. Scott managed to prove it without bringing in Taylor series, which, while correct, complicates the proof. :D – LearningDroid Oct 10 '12 at 05:43
  • @LearningDroid: You are welcome. – Mhenni Benghorbal Oct 10 '12 at 06:10
  • There's one thing I am not completely sure about in your proof. You say: $$f''(x)=\lim_{h\to0} \frac{ \frac{ f(x+2h) - f(x+h)}{h} - \frac{ f(x+h) - f(x)}{h} }{h}$$ but shouldn't it be $$f''(x) = \lim_{h\to0} \frac{ \lim_{h_1\to0} \frac{ f(x+h) - f(x+h-h_1)}{h_1} - \lim_{h_2\to0} \frac{ f(x) - f(x-h_2)}{h_2} }{h}$$ Why can you assume that $h_1$, $h_2$ and $h$ are all the same? Moreover, why can you take them to zero at the same rate? –  Jan 11 '14 at 21:35
  • $f'''$ may not exist. You assume too much about $f$. – Danny Pak-Keung Chan Jan 23 '19 at 01:10

Your formula is correct. You can easily check it by using Taylor (or, more formally, if you only have second derivatives, a second order Mean Value Theorem): $$\begin{multline} \frac1{h^2}\left[f(x+2h)-2f(x+h)+f(x)\right]= \\ \frac1{h^2}\left[ f(x)+2hf'(x)+\frac{4h^2}2f''(x)+o(h^3)-2(f(x)+hf'(x)+\frac{h^2}2f''(x)+o(h^3))+f(x)\right] = \\ \frac1{h^2}\,h^2f''(x) +o(h)=f''(x)+o(h). \end{multline}$$ Your deduction is a little shaky, though, as you are unifying two limits into one without justification. The same argument works, and gives you the formula you wanted, if you start with $$ \frac1{h^2}\left[f(x+h)-2f(x)+f(x-h)\right] $$

Martin Argerami
  • It seems that you assume $f(x+2h) = f(x) + f'(x)\cdot 2h + \frac{1}{2} f''(x) (2h)^2$. However, this is not correct: the equality is only approximate unless you account for the remainder terms. – Danny Pak-Keung Chan Jan 23 '19 at 01:14
  • It is still incorrect if we only know that $f''(x)$ exists and nothing more. The usual Lagrange remainder form requires that $f^{(3)}$ exist in a neighborhood of $x$; in our case, we do not even know whether $f''$ exists at points other than $x$. – Danny Pak-Keung Chan Jan 24 '19 at 00:30