
I saw this question online:

"We define a series $\{a_i\}_{i=0}^\infty$ like so: $a_0 = 1, \; a_{n+1} = sin(a_n)$ prove that $a_n$ converges"

That part is rather easy: since $\sin(x) < x$ for all $x > 0$, the sequence $a_n$ is monotonically decreasing and bounded below by $0$, and thus converges.

It's pretty obvious that $a_n$ converges to $0$. My question is: at what rate does $a_n$ converge to $0$?

Spoiler: I used my computer to find that the answer is probably $\frac{\alpha}{\sqrt{n}}$ where $\alpha \approx 1.732$, but I wasn't able to prove it mathematically.
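
For anyone who wants to reproduce the experiment, here is a minimal Python sketch (my own, not the original poster's code) that iterates the recursion and tracks $\sqrt{n}\,a_n$:

```python
import math

# Numerical check of the conjectured rate a_n ~ alpha / sqrt(n):
# iterate a_{n+1} = sin(a_n) from a_0 = 1 and print sqrt(n) * a_n
# at a few checkpoints; the values should settle near 1.732.
a = 1.0
checkpoints = {10, 100, 1_000, 10_000, 100_000, 1_000_000}
for n in range(1, 1_000_001):
    a = math.sin(a)
    if n in checkpoints:
        print(f"n = {n:>8}: sqrt(n) * a_n = {math.sqrt(n) * a:.6f}")
```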

  • How do you define rate of convergence? – gammatester Jul 28 '17 at 15:28
  • Spoiler?! LOL! I wonder if the 1000th person asking this question will get a bonus. https://math.stackexchange.com/questions/3215/convergence-of-sqrtnx-n-where-x-n1-sinx-n –  Jul 28 '17 at 15:33

1 Answer


By the Taylor approximation,

$$\frac{a_{n+1}}{a_n}\approx1-\frac{a_n^2}{6}$$ tends to $1$, so the convergence is sublinear.
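
For completeness (this step is implicit in the answer), the expansion being used is $\sin x = x - \frac{x^3}{6} + O(x^5)$, which gives

$$\frac{a_{n+1}}{a_n}=\frac{\sin(a_n)}{a_n}=1-\frac{a_n^2}{6}+O(a_n^4)\longrightarrow 1$$

since $a_n\to0$.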

As

$$\frac{a_{n+2}-a_{n+1}}{a_{n+1}-a_n}$$ also tends to $1$, the convergence is said to be logarithmic, i.e. very slow.
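
To get the precise rate the question asks about (this argument is not in the original answer; it is the standard one, also used in the linked duplicate), apply the Stolz–Cesàro theorem to $1/a_n^2$:

$$\frac{1}{a_{n+1}^2}-\frac{1}{a_n^2}=\frac{1}{\sin^2(a_n)}-\frac{1}{a_n^2}=\frac{a_n^2-\sin^2(a_n)}{a_n^2\sin^2(a_n)}\longrightarrow\frac13,$$

because $a_n^2-\sin^2(a_n)=\frac{a_n^4}{3}+O(a_n^6)$ while $a_n^2\sin^2(a_n)=a_n^4+O(a_n^6)$. By Stolz–Cesàro, $\frac{1}{na_n^2}\to\frac13$, hence

$$\sqrt{n}\,a_n\longrightarrow\sqrt{3}\approx1.732,$$

which matches the numerically observed $\alpha$.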