I'm studying numerical analysis, and I read on Wikipedia that "If the root being sought has multiplicity greater than one, the convergence rate is merely linear."
The same article also mentions that "if the multiplicity $m$ of the root is known, one can use the following modified algorithm that preserves the quadratic convergence rate: $$x_{n+1} = x_n - m \bigl[f(x_n)/f'(x_n)\bigr].$$"
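For what it's worth, a quick numerical check does show both rates. Here is a small Python sketch; the test function $f(x) = x^2(x+1)$ (which has a root of multiplicity $m = 2$ at $x = 0$) and the starting guess are my own arbitrary choices:

```python
# Compare plain Newton (m = 1) with the modified iteration (m = 2)
# on f(x) = x^2 (x + 1), which has a double root at x = 0.

def f(x):
    return x * x * (x + 1.0)

def fprime(x):
    return 3.0 * x * x + 2.0 * x

def step(x, m):
    # One step of x_{n+1} = x_n - m * f(x_n) / f'(x_n);
    # m = 1 is plain Newton, m = 2 matches the root's multiplicity.
    return x - m * f(x) / fprime(x)

x_plain = x_mod = 0.5  # arbitrary starting guess near the root
print(" n   plain |x_n|   modified |x_n|")
for n in range(1, 6):
    x_plain = step(x_plain, 1)
    x_mod = step(x_mod, 2)
    print(f"{n:2d}   {abs(x_plain):.3e}     {abs(x_mod):.3e}")
```

The plain-Newton error is cut roughly in half each step (linear, consistent with a ratio of $1 - 1/m$), while the modified iteration roughly doubles the number of correct digits per step, so the claim certainly seems to be true.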
I tried to prove this claim but couldn't (I only managed to show that the convergence rate is at least linear). Any ideas on how this can be proven?
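To state precisely what I am trying to show (with $a$ denoting the root, $g$ my own notation, and assuming $f$ is sufficiently smooth near $a$): a root of multiplicity $m$ means
$$f(x) = (x - a)^m g(x), \qquad g(a) \neq 0,$$
and the claimed quadratic rate is that the modified iteration satisfies
$$|x_{n+1} - a| \le C\,|x_n - a|^2$$
for some constant $C$ and all $x_n$ sufficiently close to $a$.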