
Reading my calculus notes, I found the following exercise:

Let $\displaystyle a_1=\alpha$ and $a_{n+1}=\frac{a_n+ \beta/a_n}{2}$ for $n\geq1$, where $\alpha>0$ is arbitrary and $\beta$ is a positive fixed number. Show that $(a_n)\to\sqrt{\beta}$ as $n\to\infty$.

I've been stuck for a while trying to solve this problem, and I don't really know how to approach it. I tried using induction, since the sequence is defined recursively, but I didn't get the desired result. Any help is appreciated.

user926356

  • Not exactly the same, but a very similar use case; see https://math.stackexchange.com/q/3538354/399263 – zwim Sep 10 '20 at 20:15
  • When you are trying to find the limit of a sequence, try setting $a_{n+1}=a_n$ and solving for $a_n$. If the limit exists and the sequence is monotonic, $a_{n+1}=a_n$ should occur as $n\to\infty$. – Alan Abraham Sep 11 '20 at 04:28

1 Answer


Finding the possible limit is relatively straightforward: $l=\dfrac{l+\beta/l}2\implies 2l^2=l^2+\beta\implies l=\sqrt\beta$.
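As a sanity check (not part of the proof), the recurrence is the Babylonian/Newton iteration for square roots, and it converges to $\sqrt\beta$ from any positive start. A minimal sketch, with the function name `babylonian` my own choice:

```python
import math

def babylonian(alpha, beta, n_steps=60):
    """Iterate a_{n+1} = (a_n + beta/a_n) / 2 starting from a_1 = alpha."""
    a = alpha
    for _ in range(n_steps):
        a = (a + beta / a) / 2
    return a

# Any positive starting value alpha converges to sqrt(beta).
for alpha in (0.01, 1.0, 1000.0):
    print(alpha, babylonian(alpha, beta=2.0))  # all close to sqrt(2)
```

Convergence is very fast (quadratic near the limit), so a few dozen steps are more than enough in double precision.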

To show this is indeed the limit, try showing you have a bounded monotone sequence.
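One standard way to fill in the bounded-monotone argument (via AM–GM; this is a sketch, and the last step uses the limit computation above):

```latex
% Bounded below: by the AM-GM inequality, for every n >= 1,
\[
a_{n+1} \;=\; \frac{a_n + \beta/a_n}{2}
\;\ge\; \sqrt{a_n \cdot \frac{\beta}{a_n}}
\;=\; \sqrt{\beta}.
\]
% Monotone: for n >= 2 we have a_n >= sqrt(beta), hence beta/a_n <= a_n, so
\[
a_{n+1} \;=\; \frac{a_n + \beta/a_n}{2}
\;\le\; \frac{a_n + a_n}{2}
\;=\; a_n.
\]
% Thus (a_n)_{n >= 2} is decreasing and bounded below by sqrt(beta),
% so it converges to some l >= sqrt(beta) > 0, and passing to the limit
% in the recurrence gives l = (l + beta/l)/2, i.e. l = sqrt(beta).
```

Note that $a_1=\alpha$ itself may lie below $\sqrt\beta$, which is why the monotonicity claim starts at $n=2$.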