I have trouble understanding why, when finding the limit of a recursively defined sequence, we may set $x_n=x_{n+1}$ to find the value of the limit.
I figure it has something to do with the fact that $\lim\limits_{n \to \infty} x_n=\lim\limits_{n \to \infty} x_{n+1}$, and intuitively that makes perfect sense, but the theory behind it evades me. (We can't just say "because $|x_n-x_{n+1}|$ is very close to zero for large enough $n$, we might as well make them equal", right?)
Example: for $x_1=0$, $x_{n+1}=\frac{1}{1+x_n}$, I'd use the equation $L=\frac{1}{1+L}$.
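As a numerical aside (not part of the original question): solving $L=\frac{1}{1+L}$ gives $L^2+L-1=0$, whose positive root is $L=\frac{\sqrt{5}-1}{2}$. A short sketch iterating the recurrence shows the iterates indeed approach this value:

```python
import math

# Positive root of L = 1 / (1 + L), i.e. L^2 + L - 1 = 0.
L = (math.sqrt(5) - 1) / 2  # ~0.618...

# Iterate x_{n+1} = 1 / (1 + x_n) starting from x_1 = 0.
x = 0.0
for _ in range(40):
    x = 1 / (1 + x)

print(x, abs(x - L))  # the gap to L shrinks toward 0
```

Of course, this computation only illustrates the convergence; it does not replace the proof that the limit exists, which is the real content of the question.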
This question probably isn't very well formulated, but that comes from my lack of understanding of the problem. Sorry, and thanks in advance for any answers!
We assume that $\lim_{n} u_{n}$ exists; call it $L \in \mathbb{R}$. Let $\epsilon > 0$. By our assumption, there exists a natural number $K$ such that for each $n \geq K$ we have $|u_{n} - L| < \frac{\epsilon}{2}$. Then for $n \geq K$ (so that both $n$ and $n+1$ are at least $K$), the triangle inequality gives $|u_{n} - u_{n+1}| \leq |u_{n} - L| + |u_{n+1} - L| < \epsilon$.
Is this correct? And if it is, when would we use the condition $\lim_{n \to \infty}(u_{n} - u_{n+2}) = 0$ instead?
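A numerical sketch related to that last condition (an assumption on my part: I use the same example recurrence $x_{n+1}=\frac{1}{1+x_n}$ from the question). There, the iterates oscillate around the limit, and it is the even- and odd-indexed subsequences that are each monotone; that is the typical situation where one works with $u_n - u_{n+2}$ rather than $u_n - u_{n+1}$:

```python
# Build the first iterates of x_{n+1} = 1 / (1 + x_n) with x_1 = 0.
x = 0.0
seq = [x]
for _ in range(20):
    x = 1 / (1 + x)
    seq.append(x)

# Distance between terms two apart: |x_n - x_{n+2}|.
gaps = [abs(seq[n] - seq[n + 2]) for n in range(len(seq) - 2)]
print(gaps[:5])  # the gaps shrink toward 0
```

Here consecutive terms jump across the limit, while terms two apart stay on the same side and close in on it monotonically.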
– Mr.Lilly Jul 15 '16 at 17:11