Question: Let $a_n$ be a sequence given by a starting seed $s$, with $$a_0 = s$$ and $$a_n = 1 + \frac{1}{a_{n-1}} \quad (n \ge 1).$$ Does $a_n$ converge for every $s \in \mathbb{R}$?
This question comes from the infinite continued fraction $1+\frac{1}{1+\frac{1}{1+\frac{1}{1+\cdots}}}$. By choosing a starting seed we may discuss the convergence behavior of the sequence $a_n$. Clearly, if the sequence converges, its limit must be a fixed point of $x \mapsto 1 + \frac{1}{x}$, i.e. either $\phi$ or $-\frac{1}{\phi}$. Examining the ratio of successive differences, and using $a_{n-1}a_{n-2} = a_{n-2}+1$, $$\left|\frac{a_n - a_{n-1}}{a_{n-1} - a_{n-2}}\right| = \left|\frac{-1}{a_{n-1}a_{n-2}}\right| = \left|\frac{-1}{a_{n-2}+1}\right|,$$ I found that $\phi$ is the only possible limit except when $s=-\frac{1}{\phi}$, in which case $a_{n-1}-a_{n-2}=0$ for every $n$ and the sequence is constant: $a_n = -\frac{1}{\phi}$. But I have yet to prove that every starting seed $s \in \mathbb{R}$ other than $-\frac{1}{\phi}$ induces a convergent sequence; the posts I found on MSE only discuss the convergence of the case $s=1$, using monotonicity.
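As a quick numerical sanity check (not a proof), here is a minimal Python sketch that iterates the recursion from a few seeds; the seed list, the step count, and the helper name `orbit` are arbitrary choices for illustration. The iterates appear to approach $\phi$ in every case tried:

```python
# Numerical sanity check (not a proof): iterate a_n = 1 + 1/a_{n-1}
# from several seeds and compare the final iterate against phi.
phi = (1 + 5 ** 0.5) / 2  # golden ratio, approx. 1.6180339887

def orbit(s, steps=40):
    """Iterate the recursion from seed s; return the final iterate."""
    a = s
    for _ in range(steps):
        a = 1 + 1 / a  # assumes the orbit never hits 0 exactly
    return a

for s in [1.0, 100.0, -3.0, 0.25, -0.7]:
    print(f"s = {s:6}: a_40 = {orbit(s):.12f}  (phi = {phi:.12f})")
```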
How can I prove the statement above? (or is it correct?) Thanks in advance.
Some values of $s$ make a term of the sequence undefined, namely $s = 0, -1, -\frac{1}{2}, \dots$, so one should exclude these cases as well.
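For what it's worth, these excluded seeds can be generated exactly: they form the backward orbit of $0$ under $x \mapsto 1 + \frac{1}{x}$, obtained by iterating the inverse map $g(y) = \frac{1}{y-1}$ starting from $0$. A short sketch using exact rational arithmetic:

```python
# Sketch: enumerate the seeds that make some term of the sequence
# undefined. These are the backward orbit of 0 under f(x) = 1 + 1/x,
# produced by iterating the inverse map g(y) = 1/(y - 1).
from fractions import Fraction

bad = [Fraction(0)]
for _ in range(6):
    bad.append(1 / (bad[-1] - 1))  # preimage of the last bad value under f

print(bad)  # [0, -1, -1/2, -2/3, -3/5, -5/8, -8/13]
```

The printed values are negatives of ratios of consecutive Fibonacci numbers, so the excluded seeds accumulate precisely at the repelling fixed point $-\frac{1}{\phi}$.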