9

I looked for a formal definition of a Markov chain and was confused that all the definitions I found restrict the chain's state space to be countable. I don't understand the purpose of such a restriction, and I have a feeling that it does not make any sense.

So my question is: can the state space of a Markov chain be a continuum? And if not, why not?

Thanks in advance.

  • Since the state transitions themselves are countable, the state cannot enter more than a countable number of different states. So perhaps no generality is gained from having an uncountable state space. – MJD May 30 '12 at 14:16
  • @MarkDominus As I understand this, if I'm walking along the street, I'm not allowed to model this in terms of a Markov chain, because world coordinates are continuous rather than discrete. – user32541 May 30 '12 at 14:24
  • I don't understand how you could model such a thing as a Markov process. What would the state transition be? You would need to have a continuous transition between states, and Markov processes are all about discrete transitions. – MJD May 30 '12 at 14:29
  • Suppose I make a step, then another, and so on, and at each step I end up at one point or another according to some probability distribution. Yep, I need continuous transitions in this case, but it's much like a Markov chain, and I'm wondering whether it can be called a Markov chain. – user32541 May 30 '12 at 14:41
  • "restrict chain's state space to be uncountable" -> sounds confusing. should "to be countable"? – leonbloy May 30 '12 at 14:49
  • aren't you thinking of a Markov process, rather than a Markov Chain? "Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state-space" http://en.wikipedia.org/wiki/Markov_process – leonbloy May 30 '12 at 14:50
  • @leonbloy Thanks, fixed this typo. – user32541 May 30 '12 at 14:52
  • @leonbloy Looks like you've hit the spot. It was the Wikipedia article on particle filters that confused me. " Particle filters are usually used to estimate Bayesian models in which the latent variables are connected in a Markov chain — similar to a hidden Markov model (HMM), but typically where the state space of the latent variables is continuous rather than discrete" -- http://en.wikipedia.org/wiki/Particle_filter – user32541 May 30 '12 at 14:57
  • @leonbloy Now that I know a Markov chain is just a specialization of a Markov process, it all sounds much more reasonable. Could you repost your comment with the excerpt from Wikipedia as an answer so that I can mark it? – user32541 May 30 '12 at 15:00
  • It's quite a question where we should stop using "Markov chain" and start using "Markov process"; see for example this discussion. – SBF May 30 '12 at 15:38

2 Answers

8

It seems you are thinking of a Markov process rather than a Markov chain. A Markov chain is usually defined as a Markov process that has a discrete (finite or countable) state space.

"Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state-space" (ref)

A Markov process is (usually) a slightly more general thing: it is only required to exhibit the Markov property, which makes sense for uncountable state spaces (and uncountable "time"), or more general supports. So, for example, the continuous random walk is normally considered a Markov process, but not a Markov chain.
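As a concrete illustration (my sketch, not part of the original answer), here is a minimal Python simulation of a discrete-time random walk on the real line; the Gaussian step distribution is an arbitrary choice for the example:

```python
import random

def gaussian_random_walk(x0=0.0, n_steps=10, step_std=1.0, seed=42):
    """Simulate X_{n+1} = X_n + N(0, step_std^2) on the real line.

    The next state depends only on the current state (the Markov
    property), yet the state space is all of R -- uncountable.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x += rng.gauss(0.0, step_std)  # step depends only on the current state
        path.append(x)
    return path

print(gaussian_random_walk())
```

Whether you call this a Markov chain or only a Markov process is exactly the terminological question above: time is discrete, but the state space is continuous.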

leonbloy
  • 63,430
7

Yes, it can. In some quarters the "chain" in Markov chain refers to the discreteness of the time parameter. (A notable exception is the work of K.L. Chung.) The evolution in time of a Markov chain $(X_0,X_1,X_2,\ldots)$ taking values in a measurable state space $(E, {\mathcal E})$ is governed by a one-step transition kernel $P(x,A)$, $x\in E$, $A\in{\mathcal E}$: $$ {\bf P}[ X_{n+1}\in A|X_0,X_1,\ldots,X_n] = P(X_n,A). $$ Two fine references for the subject are Markov Chains by D. Revuz and Markov Chains and Stochastic Stability by S. Meyn and R. Tweedie.
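To make the kernel concrete, here is a small sketch (my example, not the answerer's) for the Gaussian random walk $X_{n+1} = X_n + \varepsilon_n$ with $\varepsilon_n \sim N(0,\sigma^2)$: for an interval $A=[a,b]$, the kernel is $P(x,[a,b]) = \Phi\left(\frac{b-x}{\sigma}\right) - \Phi\left(\frac{a-x}{\sigma}\right)$, where $\Phi$ is the standard normal CDF.

```python
import math

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def kernel(x, a, b, step_std=1.0):
    """One-step kernel P(x, [a, b]) for X_{n+1} = X_n + N(0, step_std^2):
    the probability that the next state lands in [a, b], given X_n = x."""
    return normal_cdf((b - x) / step_std) - normal_cdf((a - x) / step_std)

# From state x = 0, the chance the next state falls in [-1, 1]:
print(kernel(0.0, -1.0, 1.0))  # ~0.683
```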

  • Thanks for your reply, but it seems to be too complicated for me. I can't understand all those sigma-algebras =) Thanks for the refs too. – user32541 May 30 '12 at 20:53