Newton's method for approximating roots of $f(x)=0$ is given by $$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}, \quad n=0,1,2,\ldots,$$ where $x_n$ denotes the $n$th approximation of the root, and some sufficiently good initial approximation $x_0$ is required for convergence.
In general, this converges quadratically, but if the root, say $x^*$, satisfies $f'(x^*)=0$ (i.e., $x^*$ is a multiple root), the convergence is only linear. I'm aware that quadratic convergence can be restored by applying the same iteration to $F(x)=f(x)/f'(x)$, which has a simple root at $x^*$: $$x_{n+1}=x_n-\frac{F(x_n)}{F'(x_n)}.$$
By the quotient rule, $F'(x)=\frac{[f'(x)]^2-f(x)f''(x)}{[f'(x)]^2}$, so after substituting the definition of $F$ the scheme becomes $$x_{n+1}=x_n-\frac{f(x_n)f'(x_n)}{[f'(x_n)]^2-f(x_n)f''(x_n)}.$$
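To see the two convergence rates side by side, here is a minimal sketch; the test function $f(x)=e^x-x-1$, which has a double root at $x^*=0$, is my own choice for illustration:

```python
import math

def f(x):
    return math.exp(x) - x - 1.0   # double root at x* = 0

def fp(x):
    return math.exp(x) - 1.0       # f'(x)

def fpp(x):
    return math.exp(x)             # f''(x)

def newton(x, steps):
    """Plain Newton: only linear convergence at a multiple root."""
    errs = []
    for _ in range(steps):
        x -= f(x) / fp(x)
        errs.append(abs(x))        # error |x_n - x*| since x* = 0
    return errs

def modified_newton(x, steps):
    """Newton applied to F = f/f', in the substituted form above."""
    errs = []
    for _ in range(steps):
        x -= f(x) * fp(x) / (fp(x) ** 2 - f(x) * fpp(x))
        errs.append(abs(x))
    return errs

plain = newton(1.0, 6)
mod = modified_newton(1.0, 6)
print(plain)   # errors shrink by roughly a factor of 2 per step
print(mod)     # errors shrink much faster
```

Starting from $x_0=1$, the plain iteration roughly halves the error each step, while the modified one reaches the neighborhood of the root within a few steps.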
I'm told that this raises another issue: as $x_n$ approaches the root, the denominator becomes the difference of two nearly equal quantities, which can lead to large rounding errors.
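To make the concern concrete, here is a small check with the same kind of function (my own illustrative choice, $f(x)=e^x-x-1$ with a double root at $0$), printing the two terms of the denominator as $x$ approaches the root:

```python
import math

def f(x):
    return math.exp(x) - x - 1.0   # double root at x* = 0

for x in [1e-2, 1e-4, 1e-6]:
    fp = math.exp(x) - 1.0         # f'(x)
    fpp = math.exp(x)              # f''(x)
    a = fp * fp                    # [f'(x)]^2
    b = f(x) * fpp                 # f(x) f''(x)
    print(f"x = {x:.0e}:  f'^2 = {a:.6e},  f*f'' = {b:.6e},  diff = {a - b:.6e}")
```

Both terms shrink at the same rate as $x \to x^*$, so the subtraction is carried out between two tiny quantities of comparable size.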
Questions
- Why are the quantities in the denominator nearly equal?
- How do we get around this issue?