5

I'm not looking for the difference in the mathematical definition, but rather for an intuitive explanation of their differences and possible examples, so that I can have them in my head when solving/formulating problems.

Arturo
  • 389
  • This may help: http://math.stackexchange.com/questions/27789/how-to-characterize-recurrent-and-transient-states-of-markov-chain –  Oct 22 '13 at 12:59

1 Answer

6

A state of a Markov chain is called recurrent if, starting from that state, the chain returns to it in finite time with probability 1. That means you can always expect it to come back to its origin. However, this does not guarantee that the mean return time is also finite. If it is, the state is positive recurrent; otherwise it is null recurrent. I will first give an example and then explain why this can happen.

I can always construct a Markov chain whose first-return probability after exactly $n$ steps is $p_n=\frac6{(\pi n)^2}$. To verify that the state is recurrent, we compute $$\sum_{n=1}^\infty p_n=\frac6{\pi^2}\sum_{n=1}^\infty\frac1{n^2}=\frac6{\pi^2}\cdot\frac{\pi^2}6=1,$$ using the Basel sum $\sum_{n\ge1}n^{-2}=\pi^2/6$. And to test whether the state is positive recurrent or null recurrent, we compute the mean return time $$\mathbb E[T]=\sum_{n=1}^\infty np_n=\frac6{\pi^2}\sum_{n=1}^\infty\frac1n=\infty,$$ since the harmonic series diverges.
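As a quick numerical sanity check (a sketch; the truncation point $N$ is my own arbitrary choice), the partial sums of the two series behave as claimed:

```python
import math

# First-return probability p_n = 6 / (pi * n)^2, as in the answer above.
def p(n):
    return 6 / (math.pi * n) ** 2

N = 10**6  # truncation point for the partial sums
total_prob = sum(p(n) for n in range(1, N + 1))
mean_partial = sum(n * p(n) for n in range(1, N + 1))

print(total_prob)    # approaches 1 as N grows (recurrence)
print(mean_partial)  # grows like (6/pi^2) * ln N, without bound (null recurrence)
```

The first sum converges quickly to 1, while the second keeps growing as $N$ increases, which is exactly the positive-versus-null distinction.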

So although the state returns in finite time with probability 1, the mean return time may be infinite. The reason is that the mean is sensitive to the tail of the distribution. The probabilities $p_n$ decay like $1/n^2$, so very long excursions are individually rare, and the return almost always happens within a small number of steps. But the mean weights each return time $n$ by its probability, and $n\cdot p_n$ decays only like $1/n$. Summed over all $n$, these rare but very long excursions contribute an infinite amount, and the mean return time diverges.

Shuchang
  • 9,800
  • 2
    Could you explain how to construct the chain from your example? – Leo Jun 13 '14 at 03:53
  • @Leo Consider a Markov chain on the non-negative integers. When at state $k>0$, the chain deterministically transitions to state $k-1$. When at state $0$, the chain transitions to state $n-1$ with probability $p_n$, so the first return to $0$ takes exactly $n$ steps. – A. Howells Feb 15 '23 at 03:13
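The construction in the comment above can be simulated directly. A minimal sketch in Python (the helper name, sample size, and seed are my own choices): it draws first-return times by inverting the cumulative sums of $p_n$, and the occasional huge values illustrate how rare long excursions inflate the sample mean.

```python
import math
import random

def sample_return_time(rng):
    """Draw one first-return time n, taking value n with probability
    p_n = 6 / (pi * n)^2, by inverting the cumulative distribution."""
    u = rng.random()
    n, cum = 1, 6 / math.pi ** 2  # cum = p_1
    while u > cum:
        n += 1
        cum += 6 / (math.pi * n) ** 2
    return n  # chain jumps 0 -> n-1, then walks down: n steps total

rng = random.Random(0)  # fixed seed for reproducibility
times = [sample_return_time(rng) for _ in range(2000)]
print(min(times), max(times))  # max is occasionally enormous: heavy tail
```

Most samples are small (already $p_1=6/\pi^2\approx0.61$), yet the sample mean never settles down as you draw more, consistent with $\mathbb E[T]=\infty$.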