I saw this question online:

"We define a sequence $\{a_i\}_{i=0}^\infty$ like so: $a_0 = 1, \; a_{n+1} = \sin(a_n)$. Prove that $a_n$ converges."
That part is rather easy: since $0 < \sin(x) < x$ for all $x \in (0, 1]$, the sequence $a_n$ is positive and monotonically decreasing, hence it converges.
It's also pretty obvious that $a_n$ converges to $0$: the limit $L$ must satisfy $L = \sin(L)$, whose only solution is $L = 0$. My question is: at what rate does $a_n$ converge to $0$?
Spoiler: I used my computer to find that the answer is probably $\frac{\alpha}{\sqrt{n}}$, where $\alpha \approx 1.732$, but I wasn't able to prove it mathematically.
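For reference, a minimal sketch of the experiment (my own reconstruction, not necessarily the original code): iterate $a_{n+1} = \sin(a_n)$ from $a_0 = 1$ and track $\sqrt{n}\,a_n$, which appears to settle near a constant. Note that $1.732$ is suspiciously close to $\sqrt{3} \approx 1.7320508$, suggesting the conjecture $a_n \sim \sqrt{3/n}$.

```python
import math

# Iterate a_{n+1} = sin(a_n) starting from a_0 = 1,
# printing sqrt(n) * a_n at a few checkpoints to watch it converge.
a = 1.0
n_max = 1_000_000
for n in range(1, n_max + 1):
    a = math.sin(a)
    if n in (100, 10_000, n_max):
        print(n, math.sqrt(n) * a)
```

By $n = 10^6$ the printed value agrees with $\sqrt{3}$ to several decimal places, which is what led me to guess $\alpha \approx 1.732$.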