I looked for a formal definition of a Markov chain and was confused that all the definitions I found restrict the chain's state space to be countable. I don't understand the purpose of this restriction, and I have a feeling it doesn't make sense.
So my question is: can the state space of a Markov chain be a continuum? And if not, why?
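To make the question concrete, here is the kind of process I have in mind: a Gaussian random walk, where the next state depends only on the current state (the Markov property seems to hold), yet the state space is all of the real line. This is just an illustrative sketch; the function name is my own.

```python
import random

def gaussian_random_walk(steps, x0=0.0, seed=0):
    """Simulate X_{n+1} = X_n + N(0, 1).

    The distribution of the next state depends only on the current
    state, so intuitively this looks Markov -- but its state space
    is the (uncountable) real line, not a countable set.
    """
    rng = random.Random(seed)
    path = [x0]
    for _ in range(steps):
        path.append(path[-1] + rng.gauss(0.0, 1.0))
    return path

path = gaussian_random_walk(10)
```

Is a process like this excluded from the definitions I found simply by the countability assumption, or is it not considered a Markov chain at all?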
Thanks in advance.