
I have been wondering for quite some time: can any state from a stochastic process be converted into a Markov state?

I was reading this question about stochastic processes whose states do not have the Markov property. The accepted answer gives an example where an urn contains two red balls and one green ball that are drawn sequentially without replacement. In that case, future states clearly depend on historical states. But what if I add two new properties to the states that store the number of red and green balls still in the urn? Such a state would be Markov, because the future would only depend on the present state.
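
As a minimal sketch of this idea (the function name and state layout are my own illustration, not from the linked answer): the augmented state is just the pair of remaining counts, and the next-draw distribution depends only on that pair, which is exactly the Markov property.

```python
import random

def draw(state):
    """Sample one draw from the urn and return (color, next_state).

    The state (red, green) counts the balls still in the urn, so no
    draw history is needed to determine the next transition.
    """
    red, green = state
    total = red + green
    if total == 0:
        raise ValueError("urn is empty")
    if random.random() < red / total:
        return "red", (red - 1, green)
    return "green", (red, green - 1)

state = (2, 1)  # two red balls, one green ball, as in the example
while sum(state) > 0:
    color, state = draw(state)
    print(color, state)
```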

Another example would be flying objects. If the states of such a system only include positions, the future would also depend on historical velocities. But if all velocities and accelerations were included in the states, you should have a Markov state.
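
Here is a rough sketch of that augmented state (the time step and constant-acceleration model are assumptions for illustration): with position, velocity, and acceleration all in the state, one propagation step needs nothing from the past.

```python
import numpy as np

DT = 0.1  # time step, an assumed value

def step(pos, vel, acc):
    """Propagate one time step assuming constant acceleration.

    Only the current (pos, vel, acc) state is read, so the
    dynamics are Markov in this augmented state.
    """
    new_pos = pos + vel * DT + 0.5 * acc * DT**2
    new_vel = vel + acc * DT
    return new_pos, new_vel, acc

pos = np.zeros(3)
vel = np.array([1.0, 0.0, 0.0])
acc = np.array([0.0, 0.0, -9.81])  # gravity
for _ in range(3):
    pos, vel, acc = step(pos, vel, acc)
    print(pos, vel)
```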

1 Answer


I found this video where Emma Brunskill says that by including the history in a state, the state becomes Markov.

This corresponds to the two examples, too. By storing the number of red and green balls still in the urn, we store the relevant history of the predecessor states, and thereby the states become Markov. For the flying object, only a part of the historical states would need to be included, so that the current velocities and accelerations can be calculated for any state. So the full history does not always need to be stored to create a Markov state!
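
To illustrate the "part of the history" point with a small sketch (the sampling interval and numbers are assumed for the example): keeping only the last two positions lets us recover the velocity by a finite difference, so the pair (x_now, x_prev) can serve as a Markov state even when x_now alone cannot.

```python
DT = 0.1  # assumed sampling interval

def velocity(x_now, x_prev):
    # Finite-difference estimate; only the two most recent
    # positions are needed, not the whole trajectory.
    return (x_now - x_prev) / DT

# Positions of an object falling from rest, sampled at t = 0.1 and t = 0.2
# (x(t) = -0.5 * 9.8 * t**2):
x_prev, x_now = -0.049, -0.196
augmented = (x_now, x_prev)   # bounded-memory augmented state
print(velocity(*augmented))   # -1.47, recovered from the state alone
```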

However, further answers are welcome!

  • I think the main counterexamples would be cases where the full history is needed and contains an infinite number of states (such that the augmented state to be propagated would be infinite-dimensional). Take a Gaussian process with a Gaussian covariance kernel, for example. – S.Surace Mar 04 '18 at 06:07