
When using numerical analysis, I often find that I am required to estimate a derivative (e.g. when using Newton iteration for finding roots). To estimate the first derivative of a function $f(x)$ at a point $x_0$ (assuming that $f$ is differentiable at $x_0$), one can use the slightly modified (to avoid bias to one side) first-principles formula for derivatives, shown below.

For small $h$:

$$f'(x_0)\approx\frac{f(x_0+h)-f(x_0-h)}{2h}\tag{1}$$

Using this method, we can estimate $f^{(n)}(x_0)$ recursively with, for sufficiently small $h$:

$$f^{(n)}(x_0)\approx\frac{f^{(n-1)}(x_0+h)-f^{(n-1)}(x_0-h)}{2h}\tag{2}$$

The problem I have with $(2)$ is that each level of the recursion introduces a loss of accuracy, and these errors accumulate. Moreover, to estimate $f^{(n)}(x_0)$, the recursion requires $f(x)$ to be computed $2^n$ times.
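To make the cost concrete, here is a minimal sketch of the recursion in $(2)$ (the function name is my own, chosen for illustration):

```python
import math

def nth_derivative_recursive(f, x0, n, h=1e-3):
    """Naive recursive central-difference estimate of f^(n)(x0), per (2).

    Each level makes two recursive calls, so f ends up being evaluated
    2^n times, and rounding error accumulates with every level.
    """
    if n == 0:
        return f(x0)
    return (nth_derivative_recursive(f, x0 + h, n - 1, h)
            - nth_derivative_recursive(f, x0 - h, n - 1, h)) / (2 * h)

# First derivative of sin at 0 is cos(0) = 1
print(nth_derivative_recursive(math.sin, 0.0, 1))  # ≈ 1.0
```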

Is $(2)$ the best method for approximating the $n^{\text{th}}$ derivative of $f$ at $x_0$ numerically, or are there more efficient methods?

lennon310
Argon
  • http://en.wikipedia.org/wiki/Finite_difference_coefficients – Apr 10 '12 at 22:29
  • This is related to William's suggestion. The standard method is, more or less, to fix the same $h$ and use it for all of the derivatives. If nothing else, this means that you only have to compute $f$ at $O(n)$ points, since there will be a lot of redundancies. Likewise, doing the computation theoretically (with formulas) and then plugging in the resulting formula yields a drastic computational speedup. – Charles Staats Apr 10 '12 at 22:47
  • The formula you gave requires $2n+1$ function evaluations to estimate $f^{(n)}(x)$, not $2^n$. – copper.hat Apr 10 '12 at 23:44
  • In fact, only $n+1$ evaluations are required. Induction gives: $f^{(n)}(x)\approx \frac{1}{(2h)^n} \sum_{k=0}^n \binom{n}{k} (-1)^k f(x+(n-2k)h)$. – copper.hat Apr 11 '12 at 05:38
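The closed form in copper.hat's last comment is easy to check numerically. A minimal sketch (the function name is my own):

```python
import math

def nth_derivative_central(f, x0, n, h=1e-3):
    """Central-difference estimate of f^(n)(x0) via the closed form

        f^(n)(x0) ≈ (2h)^(-n) * sum_{k=0}^{n} C(n,k) (-1)^k f(x0 + (n-2k)h),

    which needs only n+1 evaluations of f instead of 2^n.
    """
    total = sum((-1) ** k * math.comb(n, k) * f(x0 + (n - 2 * k) * h)
                for k in range(n + 1))
    return total / (2 * h) ** n
```

For $n=1$ this reduces to the central difference $(1)$, and for $n=2$ it gives the familiar stencil $\bigl(f(x+2h)-2f(x)+f(x-2h)\bigr)/(2h)^2$.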

1 Answer


Yes, there are much better methods for computing $n$-th derivatives than simple-minded finite differences. I mentioned some of them in this MO answer.

Briefly: one could pick from

  1. Richardson extrapolation of a suitable sequence of finite difference estimates (discussed in these two papers).
  2. Cauchy's differentiation formula: $$f^{(n)}(a)=\frac{n!}{2\pi i}\oint_\gamma \frac{f(z)}{(z-a)^{n+1}}\mathrm dz $$
  3. Lanczos's formula: $$f^{(n)}(a)=\lim_{h\to 0}\frac{(2n+1)!}{2^{n+1}n!h^n}\int_{-1}^1 f(a+hu)P_n(u)\mathrm du$$

where $P_n$ is the Legendre polynomial of degree $n$.
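As a concrete instance of option 1, here is a minimal sketch of Richardson extrapolation applied to central differences for the first derivative (the function name and tableau layout are my own):

```python
import math

def richardson_first_derivative(f, x0, h=0.1, levels=4):
    """Richardson extrapolation of central-difference estimates of f'(x0).

    T[i][0] is the central difference with step h / 2^i; each successive
    column of the triangular tableau cancels the next even power of h
    in the error expansion, giving far better accuracy than any single
    finite-difference estimate.
    """
    T = [[(f(x0 + h / 2**i) - f(x0 - h / 2**i)) / (2 * h / 2**i)]
         for i in range(levels)]
    for j in range(1, levels):
        for i in range(j, levels):
            T[i].append(T[i][j - 1]
                        + (T[i][j - 1] - T[i - 1][j - 1]) / (4**j - 1))
    return T[levels - 1][levels - 1]
```

With only four levels (eight evaluations of $f$), this typically recovers the derivative of a smooth function to near machine precision, whereas a single central difference with the same $h$ is accurate only to $O(h^2)$.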


Even simple-minded finite differences can be saved somewhat; for instance, in the case of the first derivative, when one uses central differences

$$f^\prime (x)\approx\frac{f(x+h)-f(x-h)}{2h}$$

one good choice of $h$, due to Nash, takes $h=\sqrt{\varepsilon}\left(|x|+\sqrt{\varepsilon}\right)$ where $\varepsilon$ is machine epsilon. (I had previously mentioned this in one of OP's previous questions...)
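Nash's step-size choice takes one line to implement. A minimal sketch (the function name is my own):

```python
import math
import sys

def derivative_nash(f, x):
    """Central difference with Nash's step size
    h = sqrt(eps) * (|x| + sqrt(eps)), eps being machine epsilon,
    which balances truncation error against rounding error."""
    sqrt_eps = math.sqrt(sys.float_info.epsilon)
    h = sqrt_eps * (abs(x) + sqrt_eps)
    return (f(x + h) - f(x - h)) / (2 * h)
```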

  • Warning: self promotion. In the case of a Lanczos-like approach, a higher-precision-order kernel function (not just the Legendre polynomial) is strongly preferred: it leads to significantly reduced errors. See http://vixra.org/abs/1912.0340 By the way, your formula is incorrect: there should be a double factorial, and the numerical factor also seems to be wrong. https://www.sciencedirect.com/science/article/pii/S0377042704005217 – F. Jatpil Jan 14 '20 at 08:21