I have been wondering about this for quite some time: can any state from a stochastic process be converted into a Markov state?
I was reading this question about stochastic processes whose states do not have the Markov property. The accepted answer gives an example of an urn containing two red balls and one green ball that are drawn sequentially without replacement, so future states clearly depend on past states. But what if I add two new properties to the state that store the number of red and green balls still in the urn? Such an augmented state would be Markov, because the future would depend only on the present state.
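To make my idea concrete, here is a minimal Python sketch of the augmented urn chain (the function name `draw` and the tuple encoding are just my own choices for illustration). The distribution of the next state is computed purely from the current counts, which is what I mean by the augmented state being Markov:

```python
import random

def draw(state):
    """One transition of the augmented urn chain.

    state: (red_left, green_left) -- balls still in the urn.
    The next-state distribution depends only on this tuple,
    not on the order of earlier draws.
    """
    red, green = state
    total = red + green
    if total == 0:
        return state  # absorbing state: the urn is empty
    if random.random() < red / total:
        return (red - 1, green)   # drew a red ball
    return (red, green - 1)       # drew a green ball

# Start from two red balls and one green ball.
state = (2, 1)
while sum(state) > 0:
    state = draw(state)
    print(state)
```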
Another example would be flying objects. If the states of such a system include only positions, the future also depends on historical velocities. But if all velocities and accelerations were included in the state, you should again have a Markov state.
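Again as a sketch of what I have in mind (a simple one-dimensional Euler update; the time step `dt` and initial values are hypothetical), the next state is a function of the current `(position, velocity, acceleration)` triple alone:

```python
def step(state, dt=0.1):
    """One Euler step of the augmented kinematic state.

    state: (position, velocity, acceleration).
    With velocity and acceleration stored in the state,
    the next state needs no history of earlier positions.
    """
    pos, vel, acc = state
    return (pos + vel * dt, vel + acc * dt, acc)

state = (0.0, 1.0, -0.5)  # hypothetical initial conditions
for _ in range(3):
    state = step(state)
    print(state)
```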