
Is it possible for a reducible Markov chain to have a unique stationary distribution? Consider, e.g., the Markov chain with the transition matrix below.

$$A= \begin{pmatrix} 1 & 0 & 0 \\ 0.2 & 0.7 & 0.1 \\ 0.3 & 0.3 & 0.4 \end{pmatrix}$$ I believe $A$ is reducible ($\{1\}$ and $\{2,3\}$ are two distinct classes of states): once we visit state 1 we are stuck in state 1. However, if we try to calculate its stationary distribution, it comes out as $\pi_s = (1, 0, 0)$, and if we start with any arbitrary probability distribution, after a while we seem to converge to $\pi_s$. Am I missing something?
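A minimal numerical sketch of that computation, assuming NumPy is available (the variable names here are illustrative, not from the original post):

```python
import numpy as np

# Transition matrix from the question (rows sum to 1).
A = np.array([[1.0, 0.0, 0.0],
              [0.2, 0.7, 0.1],
              [0.3, 0.3, 0.4]])

# Start from an arbitrary distribution and iterate pi <- pi A.
pi = np.array([0.1, 0.5, 0.4])
for _ in range(200):
    pi = pi @ A
print(pi)  # converges to [1. 0. 0.]

# Alternatively, solve pi = pi A directly: take the left eigenvector
# of A for eigenvalue 1 and normalize it to sum to 1.
vals, vecs = np.linalg.eig(A.T)
stat = np.real(vecs[:, np.argmax(np.isclose(vals, 1))])
stat /= stat.sum()
print(stat)  # also [1. 0. 0.]
```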

unknown123

1 Answer


Indeed this Markov chain is reducible, with two communicating classes: the communicating class {1} is closed, while the communicating class {2,3} is not. Every stationary distribution of a Markov chain is concentrated on the closed communicating classes; in the present case, only state 1 is in a closed communicating class. This proves, without computation, that the stationary distribution is unique and is the Dirac distribution on state 1.
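As a check (not in the original answer), writing out $\pi A = \pi$ for this particular $A$ gives

$$\begin{cases} \pi_1 + 0.2\,\pi_2 + 0.3\,\pi_3 = \pi_1 \\ 0.7\,\pi_2 + 0.3\,\pi_3 = \pi_2 \\ 0.1\,\pi_2 + 0.4\,\pi_3 = \pi_3 \end{cases}$$

The second equation gives $\pi_3 = \pi_2$ and the third gives $\pi_2 = 6\pi_3$, hence $\pi_2 = \pi_3 = 0$, and normalization forces $\pi_1 = 1$, i.e. $\pi = (1, 0, 0)$.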

Did
  • @Did can you please recommend a source for concentration of stationary distributions on closed communicating classes? Thank you – Sai Jun 14 '14 at 21:44
    @Sai Any textbook on finite Markov chains. Norris' Markov chains is available online. – Did Jun 15 '14 at 06:20
  • @Did thank you very much. Could you please direct me more specifically? I couldn't find the statement. – Sai Jun 15 '14 at 22:11
    @Sai One ends up leaving any non closed communicating class $C$ hence, for every initial measure $m$ and every state $i$ in $C$, $(mp^n)_i\to0$. If $m$ is stationary, $m=mp^n$ for every $n$ hence $m_i=0$. This proves that $m(C)=0$ for every non closed communicating class $C$ and every stationary distribution $m$. – Did Jun 16 '14 at 07:33