31

Today in math class we had to show that $a_{100} > 14$ for

$$a_0 = 1;\qquad a_{n+1} = a_n + a_n^{-1}$$

Apart from this task, I asked myself: is there a closed form for this sequence? Since I couldn't find an answer on my own, can somebody tell me whether such a closed form exists and, if so, what it is?
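
(For what it's worth, simply iterating the recurrence numerically, which is of course not a proof, gives $a_{100} \approx 14.28$. A tiny Python sketch:)

```python
a = 1.0              # a_0 = 1
for _ in range(100):
    a += 1.0 / a     # a_{n+1} = a_n + 1/a_n
print(a)             # prints roughly 14.28, comfortably above 14
```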

FUZxxl
  • A little Googling leads to http://mathworld.wolfram.com/MycielskiGraph.html (scroll to the bottom) –  Mar 29 '11 at 19:18
  • Rereading this some years later... I am still bemused that you chose to accept the answer you did accept. Already the other answer posted by the same user is more in line with the site's purpose (even though it cannot pass for a rigorous piece of mathematics either, but at least it does try to explain how to get the result), not to mention answers by two other users... I must be missing something. – Did Sep 02 '16 at 07:20
  • @Did Look at the timestamps. The answer I accepted was the first good one I got. – FUZxxl Sep 02 '16 at 08:08
  • "Look at the timestamps. The answer I accepted was the first good one I got." Interesting argument: since three other answers were already posted when you accepted this one, I understand that you systematically accept the first "good" answer you receive, on principle? Then you should mention the fact explicitely, to avoid that poor souls lose their time answering your questions for nothing. You might also make apparent somewhere that mathematical justifications are entirely irrelevant for an answer to be declared "good" by you, since the whole site kind of assumes the opposite. – Did Sep 02 '16 at 08:16
  • @Did I don't “systematically accept the first good answer I get.” I did that time and that was five years ago. I was hoping for a closed form and Robert Israel's answer was the closest to that. I did, however, upvote the other answers. – FUZxxl Sep 02 '16 at 08:22
  • You do exactly as you wish. For explanations about how what you do departs from the site's standards, see my previous comments. – Did Sep 02 '16 at 08:25

6 Answers

29

I doubt there is a closed form. But the asymptotics are easy: $$ a_{n+1}^2=a_n^2+2+1/a_n^2, $$ hence, for every $n\ge1$, $$ a_n^2=2n+1+\sum_{k=0}^{n-1}\frac1{a_k^2}.\qquad\qquad\qquad\qquad (*) $$

This shows that $a_n^2\ge2n+2$ for every $n\ge1$; for example, $a_{100}\ge\sqrt{202}>10\sqrt{2}>14$. In particular, $a_n\to+\infty$. Plugging this into $(*)$ yields $a_n^2=2n+1+o(n)$, hence $$ \sqrt{2n}\le a_n\le\sqrt{2n}+o(\sqrt{n}). $$

At this point, we know that $a_n^2\ge2n+2$ for every $n\ge1$. Using $(*)$ again, one sees that, for every $n\ge1$, $$ a_n^2\le2n+2+\sum_{k=1}^{n-1}\frac1{2+2k}\le2n+2+\frac12\log(n), $$ which already shows that $$14.2<a_{100}<14.3.$$

Plugging this upper bound of $a_n^2$ into $(*)$ would yield a refined lower bound of $a_n^2$, and one could then plug this refined lower bound into $(*)$ again to get a refined upper bound, and so on, back and forth between better and better upper bounds and better and better lower bounds. (No more asymptotics here.)
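
For a quick numerical sanity check of these two bounds (a rough Python sketch, not needed for the argument), one can iterate the recurrence and test $2n+2\le a_n^2\le 2n+2+\frac12\log(n)$ directly:

```python
from math import log, sqrt

a = 1.0                      # a_0 = 1
a100 = None
for n in range(1, 1001):
    a += 1.0 / a             # a_n = a_{n-1} + 1/a_{n-1}
    # the lower and upper bounds derived from (*) above
    assert 2*n + 2 <= a*a <= 2*n + 2 + 0.5*log(n)
    if n == 100:
        a100 = a

# a_100 lies between sqrt(202) ~ 14.21 and sqrt(202 + log(100)/2) ~ 14.29
print(sqrt(202), a100, sqrt(202 + 0.5*log(100)))
```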

Did
  • I just tried to understand this answer again, but I didn't understand why $2n+1+\sum_{0\le k<n}a_k^{-2}=2n+1+o(n)$ – FUZxxl May 10 '11 at 20:15
  • The result is quite general: for every nonnegative sequence $(x_n)$ such that $x_n=o(1)$, $\sum_{k\le n}x_k=o(n)$. You could try to prove this by the usual epsilon-delta method. – Did May 11 '11 at 05:30
  • @FUZxxl: what's more, since $a_k>\sqrt{2k+1}$, we have that $\sum_{0\le k<n}a_k^{-2}<\frac{1}{2}\gamma+\log(2)+\frac{1}{2}\log(n)+\frac{1}{48n^2}$ – robjohn Aug 16 '11 at 12:07
16

I agree, a closed form is very unlikely. As for more precise asymptotics, I think $a_n = \sqrt{2n} + \dfrac{\sqrt{2}\,\ln(n)}{8\sqrt{n}} - \dfrac{\sqrt{2}\,\left(\ln(n) - 2\right)^{2} + o(1)}{128\, n^{3/2}}$

Robert Israel
12

To elaborate on how I got my answer: I started with @Did's $a(n) \approx \sqrt{2n}$ and looked for a next term. $a(n) = \sqrt{2n}$ would make $a(n+1) - (a(n) + a(n)^{-1}) = \sqrt{2n+2}-\sqrt{2}\sqrt{n}-\dfrac{\sqrt{2}}{2\sqrt{n}} = - \frac{\sqrt{2}}{8} n^{-3/2} + O(n^{-5/2})$. With $a(n) = \sqrt{2n} + c n^{-1/2}$ I don't get a change in the $n^{-3/2}$ term, so I tried $a(n) = \sqrt{2n} + c \ln(n) n^{-1/2}$ and got $a(n+1) - (a(n) + a(n)^{-1}) = (-\frac{\sqrt{2}}{8} + c) n^{-3/2} + \ldots$. So to get rid of the $n^{-3/2}$ term I want $c = \frac{\sqrt{2}}{8}$ (matching the coefficient of $\ln(n)/\sqrt{n}$ in the answer above). Then look at the leading term for $a(n) = \sqrt{2n} + \frac{\sqrt{2}}{8} \ln(n) n^{-1/2}$ and continue in that vein...
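
If one wants to double-check these expansions, here is a small SymPy sketch (the helper name `defect` is mine, shorthand for $a(n+1)-(a(n)+a(n)^{-1})$):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

def defect(a):
    """a(n+1) - (a(n) + 1/a(n)) for a trial asymptotic form a(n)."""
    return a.subs(n, n + 1) - (a + 1/a)

# Trial a(n) = sqrt(2n): the leading defect is -sqrt(2)/8 * n**(-3/2).
print(sp.series(defect(sp.sqrt(2*n)), n, sp.oo, 3))

# Trial a(n) = sqrt(2n) + (sqrt(2)/8) * log(n)/sqrt(n): the n**(-3/2) term
# cancels, so the rescaled defect n**(3/2) * defect drifts toward 0.
better = sp.sqrt(2*n) + sp.sqrt(2)/8 * sp.log(n)/sp.sqrt(n)
scaled = n**sp.Rational(3, 2) * defect(better)
for m in (10**3, 10**6, 10**9):
    print(m, scaled.subs(n, m).evalf(30))
```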

Robert Israel
  • Not sure this try-and-guess procedure constitutes a proof. The argument in my post proves that $a_n=\sqrt{2n}+o(\sqrt{n})$. – Did Apr 01 '11 at 06:36
10

From my answer here: Given $a_{1}=1, \ a_{n+1}=a_{n}+\frac{1}{a_{n}}$, find $\lim \limits_{n\to\infty}\frac{a_{n}}{n}$

Reposting here, as it is kind of lost in that thread and this thread is more suitable for it.

Note: I have no clue if a closed form exists, but here is an asymptotic estimate...

I think we can show that $$\displaystyle a_{n}^2 \sim 2n + \dfrac{\log n}{2} - C$$ for some constant $\displaystyle C \gt 0$

By $\displaystyle x_n \sim y_n$ I mean $\displaystyle \lim_{n \to \infty} (x_n - y_n) = 0$

Consider $b_n = a_{n}^2 - 2n$

Then we have that $\displaystyle b_{n+1} = b_n + \dfrac{1}{b_n + 2n}$

Notice that $b_0 \gt 0$ and thus $\displaystyle b_n \gt 0$.

(Note that the other thread linked above starts with $a_1 = 1$ and not $a_0 = 1$.)

We can easily show that $b_n \lt 2 + \log n$, as

$b_{n+1} - b_n = \dfrac{1}{b_n + 2n} \lt \dfrac{1}{2n}$

Adding up gives us the easy upper bound. Note, even though we can give tighter bounds, this is sufficient for our purposes.

Now we have that, for sufficiently large $\displaystyle m,n$

$\displaystyle b_{m+1} - b_n = \sum_{k=n}^{m} \dfrac{1}{b_k + 2k}$

and hence

$\displaystyle \sum_{k=n}^{m} \dfrac{1}{2k} \gt b_{m+1} - b_n \gt \sum_{k=n}^{m} \dfrac{1}{2k}(1- \dfrac{b_k}{2k})$

(Here we used $\displaystyle \dfrac{1}{1+x} \gt 1-x$ for $0 \lt x \lt 1$.)

Now, since $b_k \lt 2 + \log k$, we have that

$\displaystyle \sum_{k=n}^{m} \dfrac{1}{2k} \gt b_{m+1} - b_n \gt \sum_{k=n}^{m} \dfrac{1}{2k} - \sum_{k=n}^{m} \dfrac{2 + \log k }{4k^2}$

Using the fact that $\displaystyle H_m - H_n = \log(\dfrac{m+1}{n}) + O(\dfrac{1}{n}) + O(\dfrac{1}{n} - \dfrac{1}{m})$, where $\displaystyle H_n = \sum_{k=1}^{n} \dfrac{1}{k}$ is the $\displaystyle n^{th}$ harmonic number,

we see that, if $c_n = b_n - \dfrac{\log n}{2}$, then

$\displaystyle O(\dfrac{1}{n} -\dfrac{1}{m}) + O(\dfrac{1}{n}) \gt c_{m+1} - c_n \gt O(\dfrac{1}{n} -\dfrac{1}{m}) + O(\dfrac{1}{n}) -\sum_{k=n}^{m} \dfrac{2 + \log k }{4k^2}$

Now $\displaystyle \sum_{k=1}^{\infty} \dfrac{2 + \log k}{k^2}$ is convergent, and so by the Cauchy convergence criterion we have that $\displaystyle c_n$ is convergent.

Thus the sequence $\displaystyle a_{n}^2 - 2n - \dfrac{\log n}{2}$ converges and hence, for some $\displaystyle C$ we have that

$$\displaystyle a_{n}^2 \sim 2n + \dfrac{\log n}{2} - C$$

or in other words

$$\displaystyle a_{n} \sim \sqrt{2n + \dfrac{\log n}{2} - C}$$

A quick (possibly incorrect) computer simulation seems to show a very slow convergence to $\displaystyle C = 1.47812676429749\dots$

Note: Didier suggested an alternate proof in the comments below, which might be simpler.
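
For anyone who wants to redo the numerics, here is a rough Python sketch that tracks $b_n$ and $c_n = b_n - \frac{\log n}{2}$ directly; keep in mind the $a_0 = 1$ versus $a_1 = 1$ indexing caveat above, which shifts the limiting constant.

```python
from math import log

a = 1.0                              # a_0 = 1, the convention of this thread
for n in range(1, 10**6 + 1):
    a += 1.0 / a                     # a_n = a_{n-1} + 1/a_{n-1}
    if n in (10**2, 10**4, 10**6):
        b = a*a - 2*n                # b_n stays positive, well under 2 + log n
        print(n, b, b - 0.5*log(n))  # c_n drifts very slowly, as noted above
```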

Aryabhata
  • That's still only an asymptotic. – FUZxxl Mar 29 '11 at 19:33
  • @Fuz: Yes, I doubt if there is a known closed form for it. But being able to peg down the difference from $2n + \log n/2$ to be a constant should be reasonably useful, I would say. – Aryabhata Mar 29 '11 at 19:42
  • @Moron This is to answer to the question you asked in a comment on Robert's post. First, do we agree that you ask me to check the mathematical accuracy of your answer? And that I only do that and state publicly the result of the checking because you asked me to? If I misunderstood you, please say so and I will delete my comment right away. So... here we go. To summarize, the general line of your proof is correct but some details annoy me, most often because you make some unnecessary détours where a more direct path exists. .../... – Did Apr 02 '11 at 10:04
  • .../... For example: (1) $b_n>0$ for every $n\ge0$ and not only for $n\ge3$. (2) The easy argument I can think of to bound $b_n$ is based on the inequality $b_n+2n\ge1$ for every $n\ge0$, it leads to $b_n\le n+1$ and not to $b_n<2n$. Comment on (2): as a consequence, when I read that you will use $b_n<2n$ and that there is an easy argument to prove it, I am puzzled and worried that I missed something or that my argument leading to $b_n\le n+1$ is wrong, so, ultimately, I am distracted. And this is bad. :-) .../... – Did Apr 02 '11 at 10:05
  • .../... (3) The easy argument I can think of to bound $b_n$ by a $\log$ is based on he inequality $b_n+2n\ge2n$ for every $n\ge1$ and leads to an upper bound of leading term $\frac12\log(n)$ and not $\log(n)$. So the comment on (2) applies here as well, mutatis mutandis. (3') You should expand this step. (4) The bound $b_k<\log(k)$ is not useful the first time you invoke it (but it is useful the second time). (5) I would postpone the introduction of $\log(n)$ to as late a point as possible. If you define $c_n$ by $c_n=b_n-\frac12H_n$, $(c_n)$ is nonincreasing and bounded below by .../... – Did Apr 02 '11 at 10:06
  • .../... the series you know hence $(c_n)$ converges by a simpler and neater (according to me) argument than the one you have to use for your $(c_n)$. (6) I do not know what your notation $O(1/n-1/m)$ means. Rather I am not sure that what you mean is correct (and that annoys me, and I am distracted and... see above :-)). You could replace this term by $2/n$ every time you need it (but if you follow my other advice you never need it). .../... – Did Apr 02 '11 at 10:06
  • .../... As a consequence of these modifications, at the very end of the proof, you now conclude by replacing $H_n$ by $\log(n)+\gamma+o(1)$--et voilà! OK, as I said, your proof is basically correct, but you asked for my advice, so what I wrote is more akin to a comment on an examination paper than to standard math.SE comments. Tell me and I delete the whole thing. – Did Apr 02 '11 at 10:06
  • @Didier: Thank you for your comments. Finding flaws (whether of accuracy or of presentation) in a proof is a good thing, and I would not mind your comments, even if I hadn't asked whether you found any flaws or not. Thank you again, for the detailed comments, and I will need to go through more carefully to incorporate the suggestions you make. – Aryabhata Apr 02 '11 at 16:00
  • @Didier: Actually, this was just a copy of the answer to the other problem where we start with $a_1 = 1$, so for that $b_1 \lt 0$. In this case, we start with $a_0 = 1$. Also the proof of $b_k \lt \log k $ uses $b_{n+1} - b_n = 1/(b_n +2n) \lt 1/2n$. I just used a conservative estimate (which is good enough to prove convergence) and I was being lazy to avoid typing the /2. Also, $\mathcal{O}(1/n - 1/m)$ is valid notation for BigOh in two variables. See this for instance: http://en.wikipedia.org/wiki/Big_O_notation#Multiple_variables. – Aryabhata Apr 02 '11 at 17:33
  • I will make some edits as per your suggestion. Thanks. – Aryabhata Apr 02 '11 at 17:34
  • @Moron Thanks for the link about multidimensional BigOh notation. In the case at hand, I still think there is a problem: if $m=n+k$ with $k\ge1$ fixed, when $n\to+\infty$, $H_m-H_n=k/n+o(1/n)$, the log on the RHS is $(k+1)/n+o(1/n)$ and $1/n-1/m=o(1/n)$, hence the $1/n$ terms do not coincide. :-) – Did Apr 02 '11 at 18:46
  • @Didier: Yeah, I suppose it is $\log(m/n)$ not, $\log((m+1)/n)$. Again, I was just being lazy and wanted to avoid typing later, if I recollect correctly. Adding an extra $\mathcal{O}(1/n)$ does not change the proof though. Thanks :-) – Aryabhata Apr 02 '11 at 22:35
  • @Aryabhata is there a progress in finding the correct value of your constant $C$ ? – Shivam Patel May 25 '14 at 03:43
  • @ShivamPatel: No :-( didn't work on this after I posted this. – Aryabhata Jun 14 '14 at 03:18
5

Let us consider the functional formulation. Given $y(0)=1$, $y' = \frac{1}{y}$ yields $ y(x) = \sqrt{2x+1}$.


I am not saying $a(n)=y(n)$. Yet there is a link between the two approaches (finite difference).
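
A quick symbolic confirmation of that solution, as a short SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x', nonnegative=True)
y = sp.sqrt(2*x + 1)
print(sp.simplify(y.diff(x) - 1/y))  # 0: y = sqrt(2x+1) satisfies y' = 1/y
print(y.subs(x, 0))                  # 1: the initial condition y(0) = 1 holds
```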

Wok
  • What are you saying? That $a_n=y(n)$? Clearly not, since $a_n$ is always rational and $y(n)$ is typically irrational. There's no reason why replacing a difference by a derivative should work. – joriki Mar 29 '11 at 19:28
  • @joriki: Actually, it could be useful. If I recollect correctly, Donald J Newman recommends this as a rough method to guess the asymptotic behaviour of certain sequences, in one of his books. – Aryabhata Mar 29 '11 at 19:39
  • @joriki Indeed one can compare the solution $y$ of the ODE $y'=\varphi(y)$ with the sequence $(a_n)$ defined by $a_{n+1}=a_n+\varphi(a_n)$. For $\varphi(y)=-2y$ the result is interesting. – Did Apr 04 '11 at 17:51
  • @Moron See comment above. – Did Apr 04 '11 at 17:52
  • @Didier: Thanks for the example :-) – Aryabhata Apr 04 '11 at 17:58
  • I guess positivity of $\varphi$ is one sufficient condition to get the correct behavior. Only a guess though. – Wok Sep 04 '14 at 07:29
3

Let $b_n=a_n^{-1}$, so that $b_{n+1}=(b_n+b_n^{-1})^{-1}=f(b_n)$, where $$f(x)=(x+x^{-1})^{-1}=x-x^3+x^5-x^7+\cdots;$$ in other words, $b_n$ is the $n$th iterate of $f$ starting from $b_0=1$. The correct asymptotics have been given elsewhere, but this answer is to point out that there is a general method applicable to any function $f$ analytic at $0$ with a series expansion $x+a_kx^k+a_{k+1}x^{k+1}+\cdots$ with $k>1$ and $a_k<0$: I learned about it in Chapter 8 of de Bruijn's Asymptotic Methods in Analysis, where it is shown that if the starting value is small enough, the $n$th iterate of $f$ is asymptotically equivalent to $((1-k)a_kn)^{-1/(k-1)}$. In our case, $k=3$ and $a_k=-1$, so $b_n \sim (2n)^{-1/2}$ and $a_n \sim (2n)^{1/2}$.

A sketch of the proof in the case at hand is as follows. First, show by induction that if $u_n$ is any sequence tending to zero and such that $$u_{n+1}=u_n-u_n^2+O(u_n^3)$$ then $u_n = n^{-1} + O(n^{-2}\log n)$. This is the tricky part of the proof.

Next make a substitution $z_n= 2b_n^2$. We have $$ b_{n+1}=b_n(1-b_n^2+b_n^4-\cdots)$$ so $$z_{n+1}=z_n\left(1-\frac{1}{2}z_n + \frac{1}{4}z_n^2+ \cdots\right)^2 = z_n-z_n^2 + \frac{3}{4}z_n^3 + \cdots$$ from which we get $z_n = n^{-1}+O(n^{-2}\log n)$ and the asymptotics for $b_n$ follow.
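
This expansion is easy to verify symbolically; here is a small SymPy sketch of the substitution $z_n = 2b_n^2$:

```python
import sympy as sp

z = sp.symbols('z', positive=True)    # stands for z_n
b = sp.sqrt(z / 2)                    # b_n, since z_n = 2*b_n**2
z_next = 2 * (b / (1 + b**2))**2      # z_{n+1}, using b_{n+1} = b_n/(1 + b_n**2)
print(sp.series(z_next, z, 0, 4))     # z - z**2 + 3*z**3/4 + O(z**4)
```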

Incidentally, the sequence $b_n$ is what you get by using Newton's method to find a root of $xe^{-x^{-2}/2}$. But I couldn't find a way to make that shed any light on the asymptotics...
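
The Newton's-method remark can at least be checked symbolically: one Newton step for $g(x)=xe^{-x^{-2}/2}$ reduces exactly to $x\mapsto(x+x^{-1})^{-1}$. A small SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
g = x * sp.exp(-1 / (2 * x**2))
newton_step = x - g / g.diff(x)                # one Newton step for a root of g
print(sp.simplify(newton_step - 1/(x + 1/x)))  # 0: the step is x -> 1/(x + 1/x)
```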