
I am trying to understand a question with the following Markov Chain:

[Figure: transition diagram of the seven-state Markov chain; the transition matrix is quoted in the comments below]

As can be seen, the chain consists of two components. If I start at state 1, I understand that the steady-state probability of being in state 3, for example, is zero, because the states 1, 2, 3, 4 are all transient. But what I do not understand is: is it possible to consider the second component as a separate Markov chain? And would it be correct to say that the limiting probabilities of the second chain, considered separately, exist? For example, if I start at state 5, can we say that the steady-state probabilities of any of the states in the right-hand chain exist and are positive?

QPTR

1 Answer


Yes, you can. Indeed, this Markov chain is reducible, with two communicating classes (as you have correctly observed):

  1. $C_1=\{1,2,3,4\}$, which is not closed, so every stationary distribution assigns zero probability to it, and
  2. $C_2=\{5,6,7\}$, which is closed.

As stated for example in this answer,

Every stationary distribution of a Markov chain is concentrated on the closed communicating classes.

In general, the following holds:

Theorem: Every Markov Chain with a finite state space has a unique stationary distribution unless the chain has two or more closed communicating classes.

Note: If there are two or more communicating classes but only one of them is closed, then the stationary distribution is unique and concentrated on that closed class.

So, here you can treat the second class as a separate chain, but you do not need to: no matter where you start, you can calculate the steady-state probabilities, and they will be concentrated on the class $C_2$.
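As a concrete check, here is a minimal numerical sketch. It assumes the seven-state transition matrix quoted in the comments below, solves $\pi P = \pi$ with $\sum_i \pi_i = 1$, and confirms that the stationary distribution is concentrated on $C_2=\{5,6,7\}$:

```python
import numpy as np

# Transition matrix from the comments (states 1..7 -> indices 0..6)
P = np.array([
    [0,   1/2, 1/2, 0,   0,   0,   0  ],
    [1/2, 0,   0,   1/2, 0,   0,   0  ],
    [1/3, 1/3, 1/3, 0,   0,   0,   0  ],
    [0,   1/2, 0,   0,   1/2, 0,   0  ],
    [0,   0,   0,   0,   0,   1,   0  ],
    [0,   0,   0,   0,   1/2, 0,   1/2],
    [0,   0,   0,   0,   0,   1,   0  ],
])

# Solve pi P = pi, i.e. (P^T - I) pi = 0. Since the rows of P^T - I are
# linearly dependent, replace the last equation by the normalization
# constraint sum(pi) = 1 to get a nonsingular system.
A = P.T - np.eye(7)
A[-1, :] = 1.0
b = np.zeros(7)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

# pi assigns zero mass to the transient class C1 = {1,2,3,4}
# and all its mass (1/4, 1/2, 1/4) to the closed class C2 = {5,6,7}.
print(np.round(pi, 4))
```

The same answer comes out no matter which equation you drop for the normalization, because the chain has exactly one closed class and hence a unique stationary distribution.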

Jimmy R.
  • Yeah, I see that little block in the bottom-right corner of the transition matrix :) $$\begin{pmatrix} 0 &\frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 & 0\\ \frac{1}{2}& 0 & 0 & \frac{1}{2} & 0 & 0 & 0\\ \frac{1}{3} &\frac{1}{3} & \frac{1}{3} & 0 & 0 & 0 & 0\\ 0 &\frac{1}{2}& 0 & 0 & \frac{1}{2} & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & \frac{1}{2} & 0 & \frac{1}{2}\\ 0 & 0 & 0 & 0 & 0 & 1 & 0 \end{pmatrix}$$ – Evgeny Dec 26 '15 at 09:55
  • @Stef Thanks so much for the answer. One more thing I am confused about: I read that a Markov chain can only be considered to have a limiting distribution (and to be ergodic) if "all" of its states are irreducible and non-transient. So when we obtain the steady-state probabilities by solving the simultaneous equations of the system as a whole, aren't they a limiting distribution? Or is there a difference between having a limiting distribution and being ergodic? Can we have a limiting distribution without the chain being ergodic? – QPTR Dec 26 '15 at 10:08
  • 1
    When studying long-run behaviors we focus only on the recurrent classes. Limiting probabilities and stationary are different (but limiting are a subset of stationary). This is for the reason that for example the sequence ${0,1,0,1,0,1,\ldots}$ does not have a limit, but spends $1/2$ of time in state $0$ and $1/2$ of the time in $1$. So, to return to your question, the class $C_2$ is ergodic etc. but not the chain as a whole. Depending on the definition you can say that the limiting distribution is $0$ for states in $C_1$ but is better to say that you focus on $C_2$. – Jimmy R. Dec 26 '15 at 10:18
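The alternating-sequence point in the last comment can be checked numerically with a short sketch. It uses a hypothetical two-state periodic chain (not the chain from the question): the matrix powers $P^n$ never converge, but their time average converges to the stationary distribution, which is exactly the gap between limiting and stationary distributions.

```python
import numpy as np

# Two-state deterministic cycle 0 -> 1 -> 0 -> ..., i.e. the
# {0,1,0,1,...} pattern from the comment. Period 2, so not ergodic.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# P^n does not converge: it alternates between the identity and P itself,
P_even = np.linalg.matrix_power(P, 10)   # equals the identity matrix
P_odd  = np.linalg.matrix_power(P, 11)   # equals P

# ...yet the time average (1/N) * sum_{n=1}^{N} P^n converges, and every
# row tends to the stationary distribution (1/2, 1/2).
avg = sum(np.linalg.matrix_power(P, n) for n in range(1, 1001)) / 1000
print(np.round(avg, 3))
```

So this chain has a stationary distribution $(1/2, 1/2)$ but no limiting distribution, matching the distinction drawn above.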