Reading the Wikipedia page for Random Walk, I was wondering what the general definition of a random walk as a stochastic process is, so that concepts such as a random walk on $\mathbb{Z}^d$, a Gaussian random walk, and a random walk on a graph all fall under it.
(1) In Ross's book Stochastic Processes, a random walk is defined as the sequence of partial sums of a sequence of i.i.d. real-valued random variables.
So is a Gaussian random walk the sequence of partial sums of a sequence of i.i.d. real-valued random variables with a common Gaussian distribution?
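To make sure I am reading Ross's definition correctly, here is how I would write it out (my own notation, not his):
$$S_0 = 0, \qquad S_n = X_1 + X_2 + \cdots + X_n \quad (n \ge 1),$$
where $X_1, X_2, \dots$ are i.i.d. real-valued random variables, and for a Gaussian random walk I would additionally assume $X_i \sim N(\mu, \sigma^2)$.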
How does a random walk on a graph fit into Ross's definition?
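For reference, the tentative notion I have in mind for a random walk on a graph $G = (V, E)$ (please correct me if this is not the standard one) is the chain that jumps from the current vertex to a uniformly chosen neighbour, i.e.
$$P(v, w) = \begin{cases} 1/\deg(v) & \text{if } \{v, w\} \in E, \\ 0 & \text{otherwise,} \end{cases}$$
which does not look to me like a sequence of partial sums of i.i.d. real-valued random variables.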
(2) I was thinking that maybe the concept of a random walk is equivalent to that of a discrete-time Markov process, where the state space can be continuous, discrete, or even non-numeric (as for a random walk on a graph). Is that correct?
If so, can I say that the graph in "a random walk on a graph" is just the graphical model for the distribution of the random walk as a stochastic process?
Also, the graph in "a random walk on a graph" has nothing to do with a random graph, right?
(3) If you can give a definition of a random walk, please also explain how the various special kinds of random walks fit into it.
Thanks and regards!
EDIT:
Thanks to Dr Schmuland for the reply!
Although I have chosen the best answer, I am still not clear about the following questions:
(1) Is a random walk (including a random walk on a group such as $\mathbb{Z}^d$ and a random walk on a graph) equivalent to a discrete-time Markov process? If not, what makes random walks different from general discrete-time Markov processes?
(2) For a random walk defined on a group such as $\mathbb{Z}^d$, must the increments between every two consecutive indices really be i.i.d. (which seems to be the definition in Ross's Stochastic Processes), or is independence alone enough, with identical distribution not necessary?
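To be concrete, the definition I have in mind here is (my own phrasing, so it may not match Ross's exactly):
$$S_0 = 0, \qquad S_n = X_1 + X_2 + \cdots + X_n,$$
with $X_1, X_2, \dots$ independent $\mathbb{Z}^d$-valued (or, more generally, group-valued) random variables; the question is whether they must also share a common distribution.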
Thanks!
Reply to Dr Schmuland's update:
Thanks, Dr Schmuland, for providing the big picture that I had not clearly realized before! It took me a while to understand. Learning more makes me feel better.
Generally, when the state space is an additive group, are these two properties, the Markov property and independence of increments, equivalent? Must time-homogeneity be defined only for Markov processes? If it can be defined for a general process, is it equivalent to the property that increments over periods of the same length have the same distribution?
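In case it makes the question clearer, here is how I would state the properties for a process $(S_n)_{n \ge 0}$ with values in an additive group (again my own formulation, so it may be off):
$$\text{time-homogeneity:}\quad P(S_{n+1} = y \mid S_n = x) \text{ does not depend on } n,$$
$$\text{identically distributed increments:}\quad S_{n+1} - S_n \overset{d}{=} S_1 - S_0 \text{ for all } n,$$
and similarly the Markov property (the conditional distribution of $S_{n+1}$ given $S_0, \dots, S_n$ depends only on $S_n$) versus independence of the increment $S_{n+1} - S_n$ from $S_0, \dots, S_n$.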
I also wonder whether, for a random walk on a graph, the transition probability at each vertex must be uniform. If so, is it uniform over the set consisting of the vertex's neighbours and the vertex itself, or just over its neighbours (itself excluded, so that it is impossible to stay at the vertex)?
As for your last exercise: I think time-homogeneity and identically distributed increments are equivalent, and the Markov property and independence of increments are equivalent, so every time-homogeneous Markov chain will be a random walk on the group $\{1,2,3\}$. But my answer is different for it being a random walk on the graph derived from $\{1,2,3\}$: as you mentioned, the transition probability from a vertex to the other vertices has to be uniform.
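Written out, if my reasoning is right, a random walk on the group $\{1,2,3\}$ (which I take to mean addition mod 3) with step distribution $(p_0, p_1, p_2)$ would have the circulant transition matrix
$$P = \begin{pmatrix} p_0 & p_1 & p_2 \\ p_2 & p_0 & p_1 \\ p_1 & p_2 & p_0 \end{pmatrix},$$
whereas for the random walk on the corresponding graph the nonzero entries in each row would all have to be equal.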