Here is a question that highlights the vagueness in the definition of a random walk: it can be defined as a Markov chain on a state space equipped with some notion of addition, e.g. $\mathbb{Z}^n$, so that $(X_t)$ is a sequence of partial sums of i.i.d. increments.
However, a random walk on a graph cannot be described as a sequence of partial sums. It makes more sense to describe a random walk as a discrete-time Markov chain that is only allowed to move to states "adjacent" to the current one. Yet it is not clear how to define adjacency for a general state space.
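To make the contrast concrete, here is a minimal sketch (the function names and the triangle graph are my own illustration): the walk on $\mathbb{Z}$ is a running sum of i.i.d. $\pm 1$ increments, while the walk on a graph picks a uniform neighbor at each step and admits no partial-sum description in general.

```python
import random

def walk_on_Z(steps, seed=0):
    """Random walk on Z: partial sums of i.i.d. +/-1 increments."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += rng.choice([-1, 1])  # i.i.d. increment
        path.append(x)
    return path

def walk_on_graph(adj, start, steps, seed=0):
    """Random walk on a graph: move to a uniformly chosen neighbor.

    'Adjacency' is given explicitly by the graph; there is no
    addition on the state space, hence no partial-sum form.
    """
    rng = random.Random(seed)
    v, path = start, [start]
    for _ in range(steps):
        v = rng.choice(adj[v])
        path.append(v)
    return path

# Example: the triangle graph on vertices {0, 1, 2}.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
```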
How would one define a random walk in general, e.g. on $\mathbb{R}^n$, or on a state space of functions? One example I can think of is a posterior update: for coin tosses the posterior is $\text{Beta}(a_0 + \#\text{heads},\, b_0 + \#\text{tails})$, where $\text{Beta}(a_0, b_0)$ is the prior. It would make sense to call the Markov chain $X_n = \text{Beta}(a_n, b_n)$ a random walk, since exactly one of $a_n$ or $b_n$ increases by one at each step.
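A minimal sketch of that posterior-update chain, assuming the tosses are i.i.d. with a fixed (hypothetical) bias `p_heads`; the state is the parameter pair $(a_n, b_n)$, and each toss increments exactly one coordinate:

```python
import random

def beta_posterior_walk(a0, b0, p_heads, steps, seed=0):
    """Chain on Beta parameters (a, b): each coin toss adds 1 to
    a (heads) or b (tails), so the state moves to an 'adjacent'
    parameter pair at every step."""
    rng = random.Random(seed)
    a, b = a0, b0
    states = [(a, b)]
    for _ in range(steps):
        if rng.random() < p_heads:
            a += 1  # observed heads
        else:
            b += 1  # observed tails
        states.append((a, b))
    return states
```

Note that with a fixed bias the increments are i.i.d.; if instead each toss were drawn from the posterior predictive, the increment distribution would depend on the current state $(a_n, b_n)$.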
Furthermore, should we assume that every random walk is time-homogeneous and has independent increments?