I am trying to explain the differences between the following concepts to someone and I realized I myself am super confused:
Continuous/discrete Markov Process
Continuous/Discrete Markov chains
Markov property: $\mathrm{P}\{X_n=i \mid X_{n-1}=j, X_{n-2}=k, \dots\} = \mathrm{P}\{X_n=i \mid X_{n-1}=j\}$?
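For concreteness, here is how I currently picture a discrete-time, discrete-state chain (a minimal Python sketch; the two states and the transition probabilities are just made up by me for illustration):

```python
import random

# Hypothetical two-state chain (states 0 and 1) with a made-up
# transition matrix: P[i][j] = P(X_n = j | X_{n-1} = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(state):
    # The next state depends only on the current state -- this is
    # exactly the Markov property from the formula above.
    return 0 if random.random() < P[state][0] else 1

random.seed(0)
path = [0]
for _ in range(10_000):
    path.append(step(path[-1]))

# Empirical estimate of P(X_n = 1 | X_{n-1} = 0); it should be
# close to the matrix entry P[0][1] = 0.1.
from_zero = [path[i + 1] for i in range(len(path) - 1) if path[i] == 0]
print(sum(from_zero) / len(from_zero))
```

If this picture is right, then the past beyond the current state is irrelevant to the simulation, which is what the defining equation seems to say.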
I used to think: every process that has the Markov property is a Markov process; every Markov process is a Markov chain, and every Markov chain is a Markov process.
But that seems wrong now that I think about it, because if they are all the same, why are there different names for them?
And are they continuous (discrete) if their parameter set $T$ is continuous (discrete), regardless of their state space?
I want to move on to homogeneous Markov chains and processes too, but since I am already confused and Wikipedia is only making it worse, I would rather get these basic definitions straight first. (Any nice analogy that could be used to teach these to others would be highly appreciated as well, if any teacher here knows one.)
Thanks a lot