
Suppose I have the difference equation $x_{n+1} = f(x_n)$. The point $x^{\ast}$ is called a fixed point of the equation if $x^{\ast}=f(x^{\ast})$.

The fixed point is stable if $\,\left\lvert\, f'(x^{\ast})\right\rvert < 1$ and unstable if $\,\left\lvert\, f'(x^{\ast})\right\rvert > 1$.
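
For concreteness, here is a quick numerical sketch of the criterion (the example map $f(x)=\cos x$ is my own choice, not from the notes):

```python
import math

# The map f(x) = cos(x) has a fixed point x* ≈ 0.739 (the Dottie number)
# with |f'(x*)| = |sin(x*)| ≈ 0.674 < 1, so iteration should converge.
x = 1.0
for _ in range(100):
    x = math.cos(x)

print(x)                  # ≈ 0.7390851, the fixed point
print(abs(-math.sin(x)))  # ≈ 0.674, indeed < 1
```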

This is all from my differential equations notes. But could someone give a proof of these or explain why they are true? Thanks.


4 Answers


Start with the first-order difference equation $x_t = \alpha x_{t-1} + b$. A steady state $x^*$ is a value such that $x_t = x^*$ for all $t$. The steady state $x^*$ of $x_t = \alpha x_{t-1} + b$ is stable if, given $\epsilon \gt 0$, there exists $\delta \gt 0$ such that $|x_0 - x^*| \lt \delta \implies |x_t - x^*| \lt \epsilon$ for all $t \gt 0$. If $x^*$ is not stable, it is called unstable.

link to pdf page 16


First consider a linear difference equation $$ x_{n+1} = ax_{n} + b. $$ Try to convince yourself that if $x^{*}$ is a fixed point of this system and $|a| < 1$, then for any initial point $y_{0}$ (no matter how far away it is from $x^{*}$) we have $\lim_{n \to \infty} y_{n} = x^{*}$. This isn't trivial, but it's not too hard. You know $y_{1} = ay_{0} + b$, $y_{2} = a(ay_{0} + b) + b$, and so on. Find a general closed-form expression for $y_{n}$ (you will see a geometric series in it), take $n \to \infty$, and use the formula for the sum of a geometric series.
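
A sketch of this calculation in Python (the values of $a$, $b$, $y_0$ are arbitrary illustrative choices):

```python
# Linear recurrence y_{n+1} = a*y_n + b with |a| < 1.
a, b, y0 = 0.5, 3.0, 100.0
x_star = b / (1 - a)       # fixed point: x* = a*x* + b  =>  x* = b/(1-a)

# Iterate directly.
y = y0
for n in range(60):
    y = a * y + b

# Closed form: y_n = a^n * y0 + b*(1 - a^n)/(1 - a)  (geometric series).
n = 60
closed = a**n * y0 + b * (1 - a**n) / (1 - a)

print(y, closed, x_star)   # all three ≈ 6.0
```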

Now we want to reduce the general case to this linear case. Under suitable smoothness assumptions, we can use a first-order Taylor expansion to approximate $f$ by a linear function for $x$ close to $x^{*}$, and then apply the same argument as in the linear case. That is, for $x$ near $x^{*}$, $$ f(x) \approx f(x^{*}) + f'(x^{*})(x-x^{*}). $$ The condition $|a| < 1$ here becomes $|f'(x^{*})| < 1$.
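
A numerical sketch of this linearization (the logistic map and the parameter $r = 2.5$ are my own choices for illustration):

```python
# Logistic map f(x) = r*x*(1-x) with r = 2.5.
# Its nonzero fixed point is x* = 1 - 1/r = 0.6, and
# f'(x*) = r*(1 - 2*x*) = -0.5, so |f'(x*)| < 1 and x* is stable.
r = 2.5
f = lambda x: r * x * (1 - x)
x_star = 1 - 1 / r
a = r * (1 - 2 * x_star)   # the slope playing the role of a, |a| < 1

x = 0.55                   # start near x*
for _ in range(50):
    x = f(x)

print(x, a)                # x ≈ 0.6, a = -0.5
```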


A fixed point is stable when the function is contracting near it, i.e. the distance to the point decreases on every iteration: $$|f(x)-x^*|<|x-x^*|.$$ We consider the ratio

$$\frac{|f(x)-x^*|}{|x-x^*|},$$ which indicates the rate of decrease.

If this ratio is bounded by some constant $r<1$ on some interval around $x^*$, the iterates converge at least linearly, because $$|f(x)-x^*|\le r|x-x^*|,$$ $$|f(f(x))-x^*|\le r|f(x)-x^*|\le r^2|x-x^*|,$$ and more generally the distance shrinks by a factor $r^k$ after $k$ iterations. Similarly, if the ratio stays above some $r>1$, the iterates move away from $x^*$ and the sequence can diverge.


Now the connection with the derivative: by definition,

$$f'(x^*)=\lim_{x\to x^*}\frac{f(x)-f(x^*)}{x-x^*}=\lim_{x\to x^*}\frac{f(x)-x^*}{x-x^*},$$

so the ratio above tends to $|f'(x^*)|$ as $x\to x^*$. If $|f'(x^*)|<1$, pick $\epsilon>0$ with $|f'(x^*)|+\epsilon<1$; by the definition of the limit there is an interval around $x^*$ on which the ratio is at most $|f'(x^*)|+\epsilon<1$, so the contraction argument applies.


An intuitive explanation:

Any smooth function can be locally approximated by a linear function

$$f(x)\approx b+a(x-x^*)$$ where $b=f(x^*)$ and $a=f'(x^*)$. When $x^*$ is a fixed point of the equation $x=f(x)$, we also have $b=x^*$.

So the iterations are approximately

$$x\to x^*+a(x-x^*)\to x^*+a^2(x-x^*)\to x^*+a^3(x-x^*)\to\cdots$$

These iterations clearly converge to $x^*$ when $|a|<1$ and move away from it when $|a|>1$.
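
A sketch of this linearized iteration in Python (the values of $a$, $x^*$, and the starting point are arbitrary illustrative choices):

```python
# Iterating x -> x* + a*(x - x*): after n steps the offset from x*
# is exactly a^n * (x0 - x*).
a, x_star, x0 = -0.8, 2.0, 3.0

x = x0
for n in range(1, 6):
    x = x_star + a * (x - x_star)
    # The two columns agree (up to rounding); |a| < 1, so the
    # offset shrinks while its sign alternates.
    print(n, x, x_star + a**n * (x0 - x_star))
```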