
Prove convergence of $a_{n+1}=\frac{c}{2}-\frac{a_n^{2}}{2}$, $a_1=\frac{c}{2}$, for $0<c<1$

I managed to prove it using the contraction mapping theorem, but I want to prove it with a more elementary method, such as monotonicity plus boundedness.

Checking for $c=0.5$, the elements in the odd places form a decreasing subsequence and the elements in the even places form an increasing subsequence.

Beyond that, I didn't make much progress.
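
Here is a quick numerical sketch of that observation for $c=0.5$ (Python, purely illustrative, not part of any proof):

```python
# Quick numerical check (illustration only): iterate a_{n+1} = (c - a_n^2) / 2
# starting from a_1 = c/2 for c = 0.5, then split the terms into
# odd- and even-indexed subsequences.
c = 0.5
a = c / 2                      # a_1 = c/2
terms = [a]
for _ in range(20):
    a = (c - a * a) / 2        # a_{n+1} = (c - a_n^2) / 2
    terms.append(a)

odd_indexed = terms[0::2]      # a_1, a_3, a_5, ...
even_indexed = terms[1::2]     # a_2, a_4, a_6, ...
print("odd-indexed: ", [round(x, 6) for x in odd_indexed])
print("even-indexed:", [round(x, 6) for x in even_indexed])
# Observed: the odd-indexed terms decrease and the even-indexed terms
# increase, both toward the positive fixed point sqrt(1 + c) - 1 ≈ 0.224745.
```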

  • Doesn't the standard iterated convergence theorem work? – Calvin Lin Jan 08 '23 at 20:35
  • Hint: for "spiral convergence" study $f(f(x))-x$ where $f(x)=(c-x^2)/2$. See here and here. – zwim Jan 08 '23 at 20:39
  • If you want to prove it without a fixed point theorem, then you could proceed as you suggested. Prove that the odd subsequence is decreasing and the even one is increasing. There are only polynomials involved that can be discussed by elementary means. The boundaries should come automatically with bounded values of c. – Marius S.L. Jan 08 '23 at 21:25
  • @CalvinLin I was looking for a solution for a first-semester student – rotem aracky Jan 09 '23 at 14:05
  • @rotemaracky 1) Show that $ c/2 - c^2 /4 < a_n < c/2$. 2) Show that $ |a_{n+1} - a_n | < c |a_{n} - a_{n-1} |$. 3) Hence, the limit exists. $\quad$ This is the ratio test (though disguised), which I'd expect a first-semester analysis student to be able to understand. – Calvin Lin Jan 10 '23 at 22:55
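
Expanding on Calvin Lin's comment above (this only fills in the algebra behind his hint; the bounds in step 1 are taken for granted here): assuming $0<a_{n-1},a_n<c/2$, which step 1 gives from $n\ge 2$ onward,

$$a_{n+1}-a_n=\frac{c-a_n^{2}}{2}-\frac{c-a_{n-1}^{2}}{2}=\frac{(a_{n-1}+a_n)(a_{n-1}-a_n)}{2},$$

hence

$$|a_{n+1}-a_n|=\frac{a_{n-1}+a_n}{2}\,|a_n-a_{n-1}|<\frac{c}{2}\,|a_n-a_{n-1}|<c\,|a_n-a_{n-1}|.$$

The differences therefore decay at least geometrically, so $a_n=a_1+\sum_{k=1}^{n-1}(a_{k+1}-a_k)$ converges by comparison with a geometric series, and the limit $L$ satisfies $L=\frac{c-L^{2}}{2}$, i.e. $L=\sqrt{1+c}-1$.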
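
Expanding on zwim's hint about studying $f(f(x))-x$ (the factorization below is my own computation, so please double-check it): with $f(x)=\frac{c-x^{2}}{2}$ one finds

$$f(f(x))-x=-\tfrac{1}{8}\,\bigl(x^{2}+2x-c\bigr)\Bigl((x-1)^{2}+3-c\Bigr),$$

and the second factor is strictly positive for $0<c<1$, so the only real fixed points of $f\circ f$ are the fixed points of $f$, namely $x=-1\pm\sqrt{1+c}$. If the odd and even subsequences are shown to be monotone and bounded (as suggested above), each converges to a fixed point of $f\circ f$; since only $\sqrt{1+c}-1$ lies in the interval where the sequence lives, both subsequences converge to it, and so does the whole sequence.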

0 Answers