
Suppose we want to calculate the $n$-step transition probabilities of a Markov chain conditioned on the event that it does not pass through some particular state. Can I do this by removing that state from the Markov chain (after an appropriate transfer of probability) and then calculating the transition probabilities in this new matrix?

  • See Did's answer here: http://math.stackexchange.com/questions/61471/what-does-it-mean-to-observe-a-markov-chain-after-a-certain-kind-of-transition –  Nov 03 '13 at 19:26
  • The question is ambiguous. Let $(X_k)$ denote the original Markov chain and $T$ the first hitting time of the particular state. Do you ask for the distribution of $(X_k)_{0\leqslant k\leqslant n}$ conditionally on $T\gt n$, or for the limit of the distribution of $(X_k)_{0\leqslant k\leqslant n}$ conditionally on $T\gt N$ when $N\to\infty$? – Did Nov 04 '13 at 13:44

1 Answer


The problem is that the "appropriate transfer of probability" is nontrivial. Consider the Markov chain with three states $A, B, C$ such that $P(A \to B) = .2$, $P(A \to A) = .8$, $P(B \to B) = .1$, $P(B \to C) = .9$, $P(C \to C) = .1$, and $P(C \to B) = .9$, with all other transition probabilities zero.

If you want to compute, say, the 4-step transition probabilities of this chain conditioned on not passing through $C$, it is not enough to simply delete $C$ and renormalize the transition probabilities out of $B$: conditioning on avoiding $C$ also makes it less likely that the path went through $B$ at all, since $B$ leads to $C$ with high probability.
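As a numerical sanity check, here is a minimal sketch (not part of the original answer; it assumes NumPy and uses the standard taboo-probability identity $P(X_n = j,\, T_C > n \mid X_0 = i) = (Q^n)_{ij}$, where $Q$ is the sub-matrix of $P$ on the states other than $C$, left unnormalized). It compares the correct conditional 4-step probabilities with the naive "delete $C$ and renormalize" chain:

```python
import numpy as np

# Transition matrix of the three-state chain A, B, C from the answer.
P = np.array([
    [0.8, 0.2, 0.0],   # from A
    [0.0, 0.1, 0.9],   # from B
    [0.0, 0.9, 0.1],   # from C
])

n = 4  # number of steps

# Correct approach: restrict P to the allowed states {A, B} WITHOUT
# renormalizing, take the n-th power (taboo probabilities), then divide each
# row by the total probability of avoiding C for n steps.
Q = P[:2, :2]                      # sub-matrix on {A, B}
Qn = np.linalg.matrix_power(Q, n)  # (Qn)[i, j] = P(X_n = j, avoid C | X_0 = i)
conditional = Qn / Qn.sum(axis=1, keepdims=True)

# Naive approach: delete C, renormalize the remaining rows to get a new
# Markov chain, and take its n-th power.
R = Q / Q.sum(axis=1, keepdims=True)
naive = np.linalg.matrix_power(R, n)

print("conditioned on avoiding C:\n", conditional)
print("naive chain with C removed:\n", naive)
```

For this example the two disagree substantially: starting from $A$, the probability of being at $A$ after 4 steps conditioned on avoiding $C$ comes out to about $0.78$, while the naive chain gives about $0.41$.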

Philip JF