10

The question is in the title; I'm looking for the exact decay rate. Naturally, assume the starting point $y_0$ belongs to $(0,1)$.

This is motivated by one of the answers to a previous question of mine.

robinson
  • 1,918

4 Answers

10

Consider the recurrence $\displaystyle x_{n+1} = x_n(1-x_n)$ (sorry, I changed the notation).

If $\displaystyle y_n = \frac{1}{x_n}$ then we see that

$$y_{n+1} - y_n = \frac{1}{x_n(1-x_n)} - \frac{1}{x_n} = \frac{1 - (1-x_n)}{x_n(1-x_n)} = \frac{1}{1-x_n}$$

and thus

$$ y_n = y_0 + \sum_{k=0}^{n-1} \frac{1}{1-x_k} $$

Since $x_0 \in (0,1)$, the sequence $(x_n)$ stays in $(0,1)$ and decreases, hence it converges to a limit $L$ satisfying $L = L(1-L)$, i.e. $x_n \to 0$. Thus $\displaystyle 1 - x_n \to 1$, so $\displaystyle \frac{1}{1-x_n} \to 1$, and by Cesàro means $\displaystyle \frac{1}{n} \sum_{k=0}^{n-1} \frac{1}{1-x_k} \to 1$.

Hence $\displaystyle \frac{y_n}{n} \to 1$, i.e. $\displaystyle n x_n \to 1$.
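
As a quick numerical sanity check (not part of the argument), one can iterate the recurrence directly; here is a minimal Python sketch, with the starting point $x_0 = 1/2$ chosen arbitrarily:

```python
# Iterate x_{n+1} = x_n(1 - x_n) and watch n * x_n approach 1.
x = 0.5  # arbitrary starting point in (0, 1)
for n in range(1, 100001):
    x = x * (1 - x)
    if n in (10, 100, 1000, 10000, 100000):
        print(f"n = {n:6d}   n*x_n = {n * x:.6f}")
```

The convergence is slow, which is consistent with the $\log n$ correction derived in the PS below.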

A similar technique was used in the excellent answer by David Speyer here: Convergence of $\sqrt{n}x_{n}$ where $x_{n+1} = \sin(x_{n})$

Note that your sequence is $\displaystyle x_{n+1} = f(x_n)$ with $\displaystyle f(x) = x - x^2$, so the technique in that answer applies here as well.

See also: Limit of the sequence $nx_{n}$ where $x_{n+1} = \log (1 +x_{n})$

PS: One can get more accurate estimates by using the above ($\displaystyle x_n \sim \frac{1}{n}$) and passing it through the sum again:

$\displaystyle y_n = y_0 + \sum_{k=0}^{n-1} \frac{1}{1-x_k} = y_0 + n + \sum_{k=0}^{n-1} x_k + \sum_{k=0}^{n-1} \mathcal{O}(x_k^2),$ using $\displaystyle \frac{1}{1-x} = 1 + x + \mathcal{O}(x^2)$. Since $x_k \sim \frac{1}{k}$, the middle sum is $\log n + \mathcal{O}(1)$ and the last sum is bounded.

I believe by doing this we get the estimate

$$y_n = n + \log n + \mathcal{O}(1)$$
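
Again purely as an illustration, one can track $y_n - n - \log n$ numerically; under the estimate above it should stay bounded (and appears to settle near a constant). A Python sketch with the same arbitrary starting point:

```python
import math

# Track y_n - n - log n; per the estimate y_n = n + log n + O(1),
# the difference should remain bounded as n grows.
x = 0.5  # arbitrary starting point in (0, 1)
for n in range(1, 1000001):
    x = x * (1 - x)
    if n in (100, 10000, 1000000):
        print(f"n = {n:7d}   1/x_n - n - log n = {1 / x - n - math.log(n):.6f}")
```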

Aryabhata
  • 82,206
  • Both last displayed formulas are equivalent to $nx_n\to1$. It seems you mean something else. – Did Jul 06 '11 at 07:16
  • @Didier: The last formulae say more than just $nx_n \to 1$. I don't see how they are "equivalent". The latter formulae imply the former, but not the other way. – Aryabhata Jul 06 '11 at 07:30
  • Then you are using $\sim$ in an unconventional way... You could make precise what the statement $u_n\sim v_n$ means in your post, for two given sequences $(u_n)$ and $(v_n)$. – Did Jul 06 '11 at 07:33
  • @Did: Edited. You are right I should have been clearer. – Aryabhata Jul 06 '11 at 18:03
  • OK, no problem. While we are at it, your last $o(n)$ is also a $O(1)$, I think (and in fact, $y_0+\gamma+o(1)$). (Am I mistaken or did you delete some comments of yours?) – Did Jul 06 '11 at 19:35
  • @Didier: I did delete some comments. Yes, you are right that it is $\mathcal{O}(1)$. I am not so sure about $y_0 + \gamma$ though. The $\mathcal{O}(x_k^2)$ terms do contribute to the constant. – Aryabhata Jul 06 '11 at 19:37
  • Indeed they do. Silly me. – Did Jul 06 '11 at 19:49
5

This is nothing but a repetition of the former two answers, but it may summarize some ideas that are used throughout.

One critical observation we need is the Stolz–Cesàro theorem, stated as follows:

Theorem (Stolz–Cesàro) Let $(a_n)$ and $(b_n)$ be sequences of real numbers such that $b_n \nearrow +\infty$ and $$ \lim_{n\to\infty} \frac{\Delta a_n}{\Delta b_n} = \ell,$$ where $\Delta a_n := a_{n+1} - a_n$ denotes the forward difference. Then $$\lim_{n\to\infty} \frac{a_n}{b_n} = \ell .$$

Now, suppose there are $a > 0$ and $r > 1$ such that $f(x) = x - (a + o(1))x^{r}$ for $0 \leq x \ll 1$. Then for a suitably small positive initial condition $x_0$, the sequence $(x_n)$ defined by the recursive formula $x_{n+1} = f(x_n)$ is positive and decreasing. Thus it converges to some nonnegative number, which must satisfy the relation $x = x - (a + o(1))x^{r}$, hence is zero.

Then a simple (and much simpler for $r = 2$, as in our case) estimation shows that $$\begin{eqnarray*}\frac{1}{x_{n+1}^{r-1}} - \frac{1}{x_{n}^{r-1}} & = & \frac{1 - \left( x_{n+1}/x_{n} \right)^{r-1}}{x_{n+1}^{r-1}} \\ & = & \frac{1 - \left( 1 - (a + o(1)) x_{n}^{r-1} \right)^{r-1}}{x_{n}^{r-1} \left( 1 - (a + o(1)) x_{n}^{r-1} \right)^{r-1}} \\ & = & \frac{(r-1)(a + o(1)) x_{n}^{r-1} + O(x_{n}^{2(r-1)})}{x_{n}^{r-1}(1 + o(1))} \\ & = & (r-1) a + o(1). \end{eqnarray*}$$ Thus by the Stolz–Cesàro theorem, applied with $a_n = x_n^{-(r-1)}$ and $b_n = n$, we have $$ \lim_{n\to\infty} n x_{n}^{r-1} = \frac{1}{(r-1)a}.$$
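
To illustrate the general statement (this example is mine, not part of the answer): for $f(x) = \sin x = x - \frac{x^3}{6} + \dots$ we have $r = 3$ and $a = \frac{1}{6}$, so $n x_n^2 \to 3$, matching the linked answer of David Speyer. A quick Python check:

```python
import math

# f(x) = sin(x) = x - x^3/6 + ..., so r = 3 and a = 1/6, and
# n * x_n^(r-1) = n * x_n^2 should tend to 1/((r-1)a) = 3.
x = 1.0
for n in range(1, 1000001):
    x = math.sin(x)
    if n in (100, 10000, 1000000):
        print(f"n = {n:7d}   n*x_n^2 = {n * x * x:.6f}")
```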

Sangchul Lee
  • 167,468
4

For every $y_0$ in $(0,1)$, $ty_t\to1$.

Here is an elementary proof. First, consider instead the sequences $(y^{(a)}_{t})$ defined by $$ y^{(a)}_{t+1}=\frac{y^{(a)}_{t}}{1+ay^{(a)}_{t}} $$ for a given positive $a$. Then $$ y^{(a)}_{t}=\frac{y^{(a)}_{0}}{1+tay^{(a)}_{0}}, $$ hence $y^{(a)}_t\to0$ and $\lim ty^{(a)}_t=1/a$.
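
The closed form is immediate by induction, since $1/y^{(a)}_{t+1} = (1 + ay^{(a)}_t)/y^{(a)}_t = 1/y^{(a)}_t + a$. As a sketch (with $a = 2$ and $y^{(a)}_0 = 1/3$ chosen arbitrarily for illustration), one can also verify it in exact rational arithmetic:

```python
from fractions import Fraction

# Verify y_t = y_0 / (1 + t*a*y_0) for y_{t+1} = y_t / (1 + a*y_t),
# using exact rationals; a = 2 and y_0 = 1/3 are arbitrary choices.
a = Fraction(2)
y = y0 = Fraction(1, 3)
for t in range(1, 21):
    y = y / (1 + a * y)
    assert y == y0 / (1 + t * a * y0)
print("closed form checked for t = 1..20")
```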

Now, $y(1-y)\le y/(1+y)$ for every nonnegative $y$ hence, if $y^{(1)}_0=y_0$ and $y_0$ is in $(0,1)$, then $y^{(1)}_{t}\ge y_t$ for every $t$. This proves that $y_t\le y_{0}/(1+ty_{0})$, hence $\limsup ty_t\le1$.

In particular, $y_t\to0$ hence, for every positive $x<1$, one can choose an index $t_x$ such that $y_{t_x}\le x$. Let $z_x=y_{t_x}$ and $a_x=1/(1-x)$. Then $y(1-y)\ge y/(1+a_xy)$ for every $y$ in $(0,x)$, hence the iterations $y_t\to y_{t+1}$ for $t\ge t_x$ are such that $$ y_t\ge \frac{z_x}{1+(t-t_x)a_xz_x}. $$ This implies that $\liminf ty_t\ge1/a_x=1-x$ for every positive $x<1$, hence $\liminf ty_t\ge1$ and the result stated above holds for every $y_0$ in $(0,1)$.
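
A numerical illustration of the sandwich (mine, for intuition only): the upper bound $y_0/(1+ty_0)$ holds at every step, while $ty_t$ climbs toward $1$ from below.

```python
# Check the upper bound y_t <= y_0/(1 + t*y_0) and watch t*y_t -> 1.
y = y0 = 0.9  # arbitrary starting point in (0, 1)
for t in range(1, 100001):
    y = y * (1 - y)
    assert y <= y0 / (1 + t * y0)
    if t in (10, 1000, 100000):
        print(f"t = {t:6d}   t*y_t = {t * y:.6f}")
```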

Did
  • 279,727
2

I think it is clear that the slowest this sequence can decay is achieved by taking $y_0 = \frac{1}{2}$. This is because $x - x^2$ takes its maximum value $\frac{1}{4}$ on $(0,1)$ at $x = \frac{1}{2}$, and $x - x^2$ is increasing on $(0,\frac{1}{2})$. Hence, whatever value of $y_0$ you choose, you have $y_1 \leq \frac{1}{4}$, and then $y_2 \leq \frac{3}{16}$, $y_3 \leq \frac{39}{256}$, etc.; by choosing $y_0 = \frac{1}{2}$, you get the unique slowest-decreasing sequence. (See the small computation after this paragraph.)
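
For what it's worth, this dominating sequence can be reproduced exactly in rational arithmetic (a small Python sketch, not part of the original answer):

```python
from fractions import Fraction

# The dominating sequence y_0 = 1/2, y_{t+1} = y_t - y_t^2:
# 1/2, 1/4, 3/16, 39/256, ...
y = Fraction(1, 2)
for t in range(1, 5):
    y = y - y * y
    print(f"y_{t} = {y}")
```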

It is worth noting that the sequence $(y_t)$ is exactly the sequence obtained by applying Newton's method to solve $f(y) = 0$, where $f(y) = e^{-1/y}$ (to extend $f$ to a continuous function on $[0,1]$, we have to define $f(0) = 0$, and there is clearly no other value of $y \in [0,1]$ for which $f(y) = 0$).
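
Indeed, $f'(y) = e^{-1/y}/y^2$, so $f(y)/f'(y) = y^2$ and the Newton step $y - f(y)/f'(y)$ collapses to exactly $y - y^2$. One can confirm this symbolically; a minimal sketch using sympy (my addition, not part of the original answer):

```python
import sympy as sp

# Newton's step y - f(y)/f'(y) for f(y) = exp(-1/y) should equal y - y^2.
y = sp.symbols('y', positive=True)
f = sp.exp(-1 / y)
newton_step = sp.simplify(y - f / sp.diff(f, y))
print(sp.expand(newton_step))  # an expression equal to y - y**2
```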