8

A Markov chain on states $0, 1, \dots$ has transition probabilities

$P_{ij}=\dfrac{1}{i+2}$ for $j=0,1,\dots,i,i+1$.

I'm supposed to find the stationary distribution.

So do I take the limit as $n$ goes to infinity of $\frac{1}{n}$ multiplied by $P_{ij}$?

zoli
  • 20,452
  • Something is not clear to me. The $P_{ij}$ probabilities should add to 1 as $j$ goes from $0$ to $n$. – zoli Mar 02 '15 at 02:40
  • @zoli: it does add up to 1, assuming it's transition from state $i$ to $j$. – Alex R. Mar 02 '15 at 03:47
  • To find the stationary distribution, you need to solve the stationary distribution equation: $\pi P = \pi$. Have you tried this? – Alex R. Mar 02 '15 at 03:47
  • @OP: Oh! Yes, I didn't see the index limit. But you forgot to add "zero otherwise." But who cares. – zoli Mar 02 '15 at 04:40

1 Answer

9

In order to get the stationary probabilities of the Markov process at hand, we have to solve the following equation: $$[x_1\ x_2\ x_3 \ x_4 \ x_5\dots] \left[\begin{matrix}\frac{1}{2}&\frac{1}{2}&0&0&0&0&\dots\\ \frac{1}{3}&\frac{1}{3}&\frac{1}{3}&0&0&0&\dots\\ \frac{1}{4}&\frac{1}{4}&\frac{1}{4}&\frac{1}{4}&0&0&\dots\\ \frac{1}{5}&\frac{1}{5}&\frac{1}{5}&\frac{1}{5}&\frac{1}{5}&0&\dots\\ \vdots \end{matrix}\right]=[x_1\ x_2\ x_3\ x_4\ x_5\dots],$$ that is, we have to solve the following system of linear equations: $$\begin{matrix} x_1=&\frac{1}{2}x_1+&\frac{1}{3}x_2+&\frac{1}{4}x_3+&\frac{1}{5}x_4+&\frac{1}{6}x_5\dots\\ x_2=&\frac{1}{2}x_1+&\frac{1}{3}x_2+&\frac{1}{4}x_3+&\frac{1}{5}x_4+&\frac{1}{6}x_5\dots\\ x_3=&0+\ \ &\frac{1}{3}x_2+&\frac{1}{4}x_3+&\frac{1}{5}x_4+&\frac{1}{6}x_5\dots\\ x_4=&0+\ \ &0+\ \ &\frac{1}{4}x_3+&\frac{1}{5}x_4+&\frac{1}{6}x_5\dots\\ x_5=&0+\ \ &0+\ \ &0+\ \ &\frac{1}{5}x_4+&\frac{1}{6}x_5\dots\\ \vdots \end{matrix}$$ The first two equations have identical right-hand sides, so $$x_2=x_1=c\gt0,$$ and subtracting the equation for $x_{n+1}$ from the equation for $x_n$ gives $x_n-x_{n+1}=\frac{1}{n}x_{n-1}$, that is, $$x_{n+1}=x_n-\frac{1}{n}x_{n-1}\quad\text{for }n\ge 2,$$ where $c$ is yet to be determined.
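As a quick sanity check (a sketch, not part of the original derivation), one can truncate the chain at $N$ states, generate the recursion values with $c=1$, and verify numerically that the resulting vector satisfies $xP\approx x$. The truncation size $N=40$ below is an arbitrary choice, and the indexing follows the answer's $x_1, x_2, \dots$ convention.

```python
import numpy as np

# Truncated transition matrix: row i (0-based, corresponding to x_{i+1})
# has i+2 equal entries 1/(i+2) in columns 0..i+1, zeros afterwards.
N = 40
P = np.zeros((N, N))
for i in range(N):
    width = min(i + 2, N)          # truncation only affects the last rows
    P[i, :width] = 1.0 / (i + 2)

# Recursion from above: x_1 = x_2 = c, x_{n+1} = x_n - x_{n-1}/n.
x = np.zeros(N)
x[0] = x[1] = 1.0                  # c = 1
for n in range(2, N):
    x[n] = x[n - 1] - x[n - 2] / n

# If x is an (unnormalized) stationary vector, x P should equal x.
print(np.max(np.abs(x @ P - x)))   # tiny, zero up to rounding/truncation
```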

The sequence generated by the recursion above tends to zero extremely fast. My guess is that for $$c=\frac{1}{e}$$ the sum of our numbers is $1$; that is, if $c=1/e$, then we have the stationary distribution of the Markov process given by the OP.
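The guess is easy to probe numerically. Below is a minimal sketch (with $c=1$ and an arbitrary cutoff of 20 terms) suggesting that the partial sums of the recursion values approach $e$, which is exactly what is needed for $c=1/e$ to normalize them into a probability distribution.

```python
import math

# Recursion with c = 1; the partial sum of the terms approaches e.
x = [1.0, 1.0]                      # x_1 = x_2 = c = 1
for n in range(2, 20):              # x_{n+1} = x_n - x_{n-1}/n
    x.append(x[-1] - x[-2] / n)

print(sum(x))                       # 2.718281828...
print(math.e)                       # 2.718281828459045
```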

Unfortunately, I was not able to prove that the sum of our numbers is $e$ if $c=1$, so I posted the related question. (The answer is available on MSE.)

Granting that claim, we are done.

An interesting fact

Let's build a Markov automaton working according to the state transition matrix given here. It would not be that hard, and its behavior can even be described in words. If the machine is in state one, then it stays there with probability $1/2$ and moves to the next state with the same probability. If it is in the second state, then it either goes back to the first state, stays, or jumps ahead to the third state; all three possibilities are equally likely. In general, if the machine is in the $n^{\text{th}}$ state, then it either goes back to the first state, or to the second state, ..., or stays, or jumps ahead to the $(n+1)^{\text{st}}$ state, each with probability $\frac{1}{n+1}$.
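Here is a minimal simulation sketch of the machine just described (the step count and random seed are arbitrary choices). From state $n$ it picks the next state uniformly from $\{1,2,\dots,n,n+1\}$, and it records how often the first state is visited.

```python
import random

random.seed(0)
steps = 1_000_000
state = 1
visits_to_first = 0

for _ in range(steps):
    # From state n the machine moves to a state chosen uniformly
    # from {1, 2, ..., n, n+1}: back to an earlier state, stay, or jump ahead.
    state = random.randint(1, state + 1)
    if state == 1:
        visits_to_first += 1

print(visits_to_first / steps)      # roughly 0.368, close to 1/e
```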

Now, let's unleash the machine in one of its states. The relative frequency with which the machine visits the first state will approach $\frac{1}{e}$. I have also calculated the probability that the machine is wandering somewhere beyond its $15^{\text{th}}$ state: it is less than $3.0007\times{10}^{-13}$.
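That tail figure can be reproduced from the recursion terms themselves (a sketch, normalizing by the partial sum rather than using any closed form): the stationary mass beyond the $15^{\text{th}}$ state comes out at about $3.0\times 10^{-13}$.

```python
# Recursion terms with c = 1, normalized by their total, give the
# (approximate) stationary distribution; sum the mass beyond state 15.
x = [1.0, 1.0]
for n in range(2, 40):
    x.append(x[-1] - x[-2] / n)

total = sum(x)
tail = sum(x[15:]) / total          # x[15] is x_16, i.e. the 16th state
print(tail)                          # about 3.0e-13
```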

zoli
  • 20,452