4

Let $r(t)$ be a real-valued function of $t$. Let $v(t)$ be the derivative of $r(t)$. Then $$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t)}{\Delta t}$$ so $$v(t) = \frac{dr(t)}{dt} \approx \frac{r(t + \Delta t) - r(t)}{\Delta t} \text{ for small }\Delta t$$

My question is, is there another way to approximate $v(t) = \dfrac{dr(t)}{dt}$?

For example, I am reading the book Understanding Molecular Simulation by Frenkel and Smit (Second Edition). On page 71 (some pages are available on Google Books here), the authors write $$v(t) = \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t} + \mathcal{O}(\Delta t^2)$$ or, in other words, $$v(t) \approx \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t}$$

Basically, then, it seems that there are two ways to express $v(t) = \dfrac{dr(t)}{dt}$:

$$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t)}{\Delta t} \textbf{ (1)}$$

$$v(t) = \frac{dr(t)}{dt} = \lim_{\Delta t \to 0} \frac{r(t + \Delta t) - r(t - \Delta t)}{2 \Delta t} \textbf{ (2)}$$

Are equations (1) and (2) equivalent? Equation (1) is the definition of the derivative that I remember from high-school calculus; I do not remember (2). Is (2) an alternative definition of the derivative? Or, what is the relationship between (1) and (2)?
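For what it's worth, a quick numerical check (a minimal Python sketch, with $r(t) = \sin t$ chosen arbitrarily as a test function) gives nearly the same value from both quotients for small $\Delta t$, but I would like to understand the relationship in general:

```python
import math

def r(t):
    return math.sin(t)  # arbitrary smooth test function; exact derivative is cos(t)

t, dt = 1.0, 1e-4
forward = (r(t + dt) - r(t)) / dt              # quotient in (1)
central = (r(t + dt) - r(t - dt)) / (2 * dt)   # quotient in (2)
print(forward, central, math.cos(t))           # all three agree to several digits
```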

Andrew
  • 379
  • One way to think about it: the difference between $5$ and $4$ is $1$, while the difference between $5$ and $3$ is $2$, i.e. $2 \cdot 1$; I think the same principle is at work here. – dato datuashvili Aug 07 '12 at 20:33

5 Answers

7

If the limit (1) exists, then the limit (2) exists and has the same value. To see this, write $$ \frac{r(t+h)-r(t-h)}{2h}=\frac12\left(\frac{r(t+h)-r(t)}{h}+\frac{r(t+(-h))-r(t)}{(-h)}\right). $$ The reverse implication is false; otherwise every even function would be differentiable at $t=0$ with derivative zero (but $t\mapsto|t|$ is a counterexample).
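The counterexample is easy to see numerically; here is a minimal Python sketch for $r(t)=|t|$ at $t=0$, where the symmetric quotient is identically $0$ while the one-sided quotients stay at $\pm 1$:

```python
# Symmetric vs. one-sided difference quotients for r(t) = |t| at t = 0.
r = abs
t = 0.0

for h in (0.1, 0.01, 0.001):
    symmetric = (r(t + h) - r(t - h)) / (2 * h)  # always 0
    right     = (r(t + h) - r(t)) / h            # always +1
    left      = (r(t - h) - r(t)) / (-h)         # always -1
    print(h, symmetric, right, left)
```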

Hence (2) is not a definition of the derivative. The definition of the derivative is (1). But as soon as (1) holds, the derivative coincides with the limit $$ \lim\limits_{h\to0}\frac{r(t+ah)-r(t-bh)}{(a+b)h}, $$ for every $(a,b)$ such that $a+b\ne0$, and in particular with the limit in (2).
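To spell out that last step: if (1) holds, then $r(t+ch)=r(t)+ch\,r'(t)+o(h)$ as $h\to0$ for any fixed $c$, so $$ \frac{r(t+ah)-r(t-bh)}{(a+b)h}=\frac{(a+b)h\,r'(t)+o(h)}{(a+b)h}\xrightarrow[h\to0]{}r'(t), $$ and taking $a=b=1$ recovers (2).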

Did
  • 279,727
  • Shouldn't this be the accepted answer? Thanks for this view, I hadn't thought of it this way. – abcd May 07 '20 at 03:26
3

There are many ways to modify the difference quotient in the definition of the derivative so that it has a higher convergence rate, and such formulas are thus often desirable to use in numerical computation.

Finite difference coefficients can be used for this. For example, we have

$$f'(x_0)=\frac{f(x_0+h)-f(x_0)}{h}+O(h)$$

but

$$f'(x_{0}) = \frac{-\frac{11}{6}f(x_{0}) + 3f(x_{0}+h) -\frac{3}{2}f(x_{0}+2h) +\frac{1}{3}f(x_{0}+3h)}{h} + O\left(h^3\right)$$

Addendum: Here is a giant list of high-accuracy derivatives using these coefficients.
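As a quick sanity check, here is a minimal Python sketch (using $f = \exp$ as an arbitrary test function) of the four-point forward formula above; halving $h$ should shrink the error by roughly a factor of $8$, consistent with $O(h^3)$:

```python
import math

def forward_third_order(f, x0, h):
    """One-sided four-point formula with O(h^3) truncation error."""
    return (-11/6 * f(x0) + 3 * f(x0 + h)
            - 3/2 * f(x0 + 2*h) + 1/3 * f(x0 + 3*h)) / h

f, x0 = math.exp, 0.0          # f'(x0) = exp(0) = 1 exactly
for h in (0.1, 0.05, 0.025):
    err = abs(forward_third_order(f, x0, h) - 1.0)
    print(h, err)
```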

Argon
  • 25,303
  • Thanks. Does this mean that equation (2) in the original post is a correct definition of the derivative? – Andrew Aug 07 '12 at 21:14
  • @Andrew Yes, very much so! This is a basic three-point estimate. In fact, this derivative formula is mentioned here: http://en.wikipedia.org/wiki/Numerical_differentiation#Finite_difference_formula - the error seems to be $$\frac{f^{(3)}(c)h^2}{6}$$ where $$c \in [x-h, x+h]$$ – Argon Aug 07 '12 at 21:17
  • Thanks! So, based on the link you provided, it seems that equation (2), a three-point estimate, is actually a more accurate approximation of the derivative than equation (1), a two-point estimate. Is this true? – Andrew Aug 07 '12 at 21:24
  • 1
    @Andrew Yes, it is true. Instead of having a bias to the side that the limit is taken, this method has a balance. – Argon Aug 07 '12 at 21:27
  • @Andrew Your (2) is (6) from this paper: http://www.karenkopecky.net/Teaching/eco613614/Notes_NumericalDifferentiation.pdf – Argon Aug 07 '12 at 21:32
  • 1
    "...thus often desirable to use in numerical computation." - not always. To borrow a mantra from a certain popular book on numerics, high order does not necessarily imply high accuracy. One must remember that the basis of the high-order approximations is the use of an interpolating polynomial at equispaced points, and it is well-known how unruly high-order interpolating polynomials can be... – J. M. ain't a mathematician Aug 07 '12 at 23:37
3

As a complement to the other fine answers (Did's, for example), let's suppose that $f$ admits a Taylor expansion at $t$. Then:

$$f(t+h)=f(t)+hf'(t)+\frac {h^2}2 f''(t)+\frac {h^3}6 f'''(t) +O\left(h^4 f''''(t)\right)$$

so that: $$\tag{1}\frac{f(t+h)-f(t)}h=f'(t)+\frac {h}2 f''(t)+\frac {h^2}6 f'''(t) +O\left(h^3 f''''(t)\right)$$ while: $$\tag{2}\frac{f(t+h)-f(t-h)}{2h}=f'(t)+\frac {h^2}6 f'''(t) +O\left(h^3 f''''(t)\right)$$

That is why the second method is more precise: the $f''$ term disappears (in fact, all the even-order terms do!), so that $f'$ is evaluated with more precision (an error of order $h^2$ instead of $h$). Because of this gain in precision, Feynman proposed this second method for evaluating derivatives in his famous Physics Lectures (Vol. I, 9-6).
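The two error orders are easy to observe numerically; in this minimal Python sketch (with $f=\sin$ as an arbitrary test function), halving $h$ roughly halves the one-sided error but quarters the symmetric one:

```python
import math

f, fprime, t = math.sin, math.cos, 1.0   # test function, its exact derivative, point

for h in (0.1, 0.05, 0.025, 0.0125):
    err_one_sided = abs((f(t + h) - f(t)) / h - fprime(t))            # O(h)
    err_symmetric = abs((f(t + h) - f(t - h)) / (2 * h) - fprime(t))  # O(h^2)
    print(f"h={h:<8} one-sided={err_one_sided:.2e}  symmetric={err_symmetric:.2e}")
```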

But when the first limit fails to exist (at $t=0$ for an even function such as $f(t)=|t|$, say, as in Did's counterexample), the second quotient can still converge (to $0$ here) without any problem, and that is clearly different from the first case.

The second method is also widely used when you need a 'discretized' version of a differential equation that respects time symmetry, energy conservation, and so on. Ed Fredkin, for example, proposed the following equation in his article "Feynman, Barton and the reversible Schrödinger difference equation": $$\frac{C_{x,t+1}-C_{x,t-1}}2=ik\left(C_{x-1,t}-2C_{x,t}+C_{x+1,t}\right)$$
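To illustrate the idea on something simpler (a minimal sketch, not taken from Fredkin's article): applying the same central difference in time to an ordinary equation $\dot x=g(x)$ gives the time-reversible leapfrog update $x_{n+1}=x_{n-1}+2\,\Delta t\,g(x_n)$, shown here in Python for a unit harmonic oscillator:

```python
import math

def g(state):
    """Right-hand side of x' = g(x) for a unit harmonic oscillator, x = (q, v)."""
    q, v = state
    return (v, -q)

dt = 0.01
x_prev = (1.0, 0.0)                       # exact state at t = 0
x_curr = (math.cos(dt), -math.sin(dt))    # exact state at t = dt, to start the scheme

# Leapfrog: (x_{n+1} - x_{n-1}) / (2*dt) = g(x_n)  =>  x_{n+1} = x_{n-1} + 2*dt*g(x_n)
for n in range(1000):
    gq, gv = g(x_curr)
    x_prev, x_curr = x_curr, (x_prev[0] + 2*dt*gq, x_prev[1] + 2*dt*gv)

t_end = 1001 * dt
print(x_curr)                              # numerical state at t_end
print((math.cos(t_end), -math.sin(t_end))) # exact state for comparison
```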

Raymond Manzoni
  • 43,021
2

Consider the linear terms in the Taylor series of $r(t+h)$ and $r(t-h)$, where $h = \Delta t$ and $r'(t)=v(t)$, $$ r(t+h) \approx r(t) + r'(t)\,h $$ and $$ r(t-h) \approx r(t) - r'(t)\,h $$ Multiplying the second equation by $-1$ and adding gives $$ 2h\, r'(t) \approx r(t+h) - r(t-h) \Rightarrow v(t) \approx \frac{r(t+h) - r(t-h)}{2h}\,, $$

which implies

$$ v(t) = \lim_{h \to 0} \frac{r(t+h) - r(t-h)}{2h} $$

0

Here is yet another equivalent definition: $f'(x) = \lim_{r \to 1} \frac{f(rx) - f(x)}{(r-1)x}$, if $x \ne 0$. The differentiation rule $\frac{d}{dx}x^n = n x^{n-1}$ is especially easy to derive with this definition.
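For instance, with $f(x)=x^n$, spelling out the computation: $$\frac{(rx)^n-x^n}{(r-1)x}=x^{n-1}\,\frac{r^n-1}{r-1}=x^{n-1}\left(r^{n-1}+r^{n-2}+\cdots+1\right)\xrightarrow[r\to1]{}n\,x^{n-1}.$$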

Hans Engler
  • 15,439