As part of the problem I'm working on, I reached the point where I have to show that the sequence of error terms $e_n$ defined by $$ e_{n+1} = \frac{e_n}{e_n+2} $$ converges to $0$ for any choice of initial value $e_0 > -1$.
I've been able to show this for $e_0 \geq 0$: in that case $e_n \geq 0 \implies 0 \leq e_{n+1} \leq \frac{1}{2}e_n$ (with strict inequalities when $e_n > 0$), so the sequence decreases to $0$ at least geometrically.
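For what it's worth, a quick numerical experiment (my own sketch, just iterating the recurrence directly) suggests that convergence to $0$ does hold for starting values in $(-1, 0)$ as well:

```python
def iterate(e0, n):
    """Apply the recurrence e_{k+1} = e_k / (e_k + 2), n times, starting from e0."""
    e = e0
    for _ in range(n):
        e = e / (e + 2)
    return e

# Starting values in (-1, 0): the iterates stay in (-1, 0) and shrink toward 0.
for e0 in (-0.9, -0.5, -0.1):
    print(e0, iterate(e0, 50))
```

The iterates appear to remain in $(-1, 0)$ and tend to $0$, so the claim looks true; the question is how to prove it.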
How can one show that convergence to $0$ still holds for $-1 < e_0 < 0$?
Is there a way to prove this using only the recursive definition of $e_n$, without solving for a closed form?