
I'm still trying to figure out definitions and properties of random walks on $\mathbb Z^d$. My goal is to work up to understanding some large deviation principles for the local times of such random walks, but I'm having quite a bit of trouble with the basics.

Let $(X_t)_{t\geq0}$ be a simple random walk on $\mathbb Z^d$ in continuous time. So the process starts in some point $x \in \mathbb Z^d$ at time $0$ and after a waiting time (exponentially distributed with parameter $1$) it jumps to each of its $2d$ neighbours with equal probability. $\mathbb P_x$ and $\mathbb E_x$ denote probability and expectation assuming the random walk starts in $X_0=x\in \mathbb Z^d$ at time $t=0$.
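To make sure I understand the dynamics, here is how I would simulate the walk (a minimal sketch; the function name and signature are my own):

```python
import random

def simulate_srw(x, t, d, rng=random):
    """Continuous-time simple random walk on Z^d started at x (a tuple),
    observed at time t: Exp(1) waiting times, then a jump to one of the
    2d nearest neighbours chosen uniformly at random."""
    pos = list(x)
    clock = rng.expovariate(1.0)  # time of the first jump
    while clock <= t:
        axis = rng.randrange(d)
        pos[axis] += rng.choice((-1, 1))
        clock += rng.expovariate(1.0)  # next Exp(1) waiting time
    return tuple(pos)

# Monte Carlo estimate of E_x[f(X_t)] for f = indicator of the origin:
samples = [simulate_srw((0, 0), 1.0, 2) for _ in range(5000)]
est = sum(s == (0, 0) for s in samples) / len(samples)
```

Averaging $f$ over many such samples should then estimate $\mathbb E_x[f(X_t)]$, if I have understood the definition correctly.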

Next, the generator of a random walk is introduced as an operator on the space $\mathbb R^{\mathbb Z^d}$ of functions from $\mathbb Z^d$ to $\mathbb R$:

$$\Delta f(x) = \sum_{y:\ |x-y|=1} \left[f(y)-f(x)\right]$$ for $x \in \mathbb Z^d$ and $f \in \mathbb R^{\mathbb Z^d}$.
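Concretely, I read this definition as the following (a sketch in Python; the function name is mine):

```python
def laplacian(f, x):
    """Delta f(x) = sum over y with |x-y| = 1 of (f(y) - f(x)),
    for x a tuple in Z^d; the sum runs over the 2d nearest neighbours."""
    total = 0.0
    for i in range(len(x)):
        for step in (-1, 1):
            y = x[:i] + (x[i] + step,) + x[i + 1:]
            total += f(y) - f(x)
    return total

# f(x) = x^2 on Z has Delta f = 2 everywhere, mirroring f'' = 2
# for the continuum Laplacian:
laplacian(lambda z: z[0] ** 2, (3,))  # → 2.0
```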

Leaving aside the fact that I don't yet understand the meaning and significance of this operator, my main problem right now is that I don't understand why the following equality holds:

$$\Delta f(x) = \left. \frac{\partial}{\partial t} \right|_{t=0} \mathbb E_x \left[f(X_t) \right]$$ for all $x \in \mathbb Z^d$ and all bounded functions $f: \mathbb Z^d \rightarrow \mathbb R$.

I'm stuck with the integral

$$\mathbb E_x \left[f(X_t) \right] = \int f(X_t)\ \mathrm d\mathbb P_x = \int_{\mathbb Z^d} f\ \mathrm d\mathbb P_x \circ X_t^{-1}.$$

Basically this is just an integral over a discrete space, i.e. a sum, and I should be able to evaluate it since the distribution $\mathbb P_x \circ X_t^{-1}$ of $X_t$ is known, but I'm having trouble doing the calculation.

Can someone give me a hint on how to start?


1 Answer


Denote by $(\sigma_j)_{j \in \mathbb{N}}$ the sequence of independent exponentially distributed (with parameter $1$) waiting times of the random walk, and let $\tau_j := \sigma_1 + \cdots + \sigma_j$ be the time of the $j$-th jump. If we set

$$N_t := \sum_{j=1}^{\infty} 1_{\{\tau_j \leq t\}}$$

then $N_t$ describes the number of jumps of $(X_t)_{t \geq 0}$ up to time $t$. It is well-known that $(N_t)_{t \geq 0}$ is a Poisson process (with intensity $1$); in particular we have

$$\mathbb{P}^x(N_t=0)=e^{-t} \qquad \mathbb{P}^x(N_t = 1) = t e^{-t} \qquad \mathbb{P}^x(N_t \geq 2)=1 - (1 + t) e^{-t}. \tag{1}$$
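These probabilities, and the small-$t$ rates used further below, can be checked numerically (a quick sketch):

```python
import math

t = 1e-4  # a small time

p0 = math.exp(-t)                 # P(N_t = 0)
p1 = t * math.exp(-t)             # P(N_t = 1)
p2 = 1 - (1 + t) * math.exp(-t)   # P(N_t >= 2)

# the three cases are exhaustive:
print(abs(p0 + p1 + p2 - 1))      # 0 up to rounding
# small-t rates: (p0 - 1)/t -> -1,  p1/t -> 1,  p2/t -> 0
print((p0 - 1) / t, p1 / t, p2 / t)
```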

Now fix $x \in \mathbb{Z}^d$ and denote by $Z$ a random variable (independent of $(\tau_j)_{j \in \mathbb{N}}$) such that

$$\mathbb{P}^x(Z=y) = \begin{cases} \frac{1}{2d}, & \text{if } |x-y| = 1, \\ 0, & \text{otherwise.} \end{cases} \tag{2}$$

Then $X_t$ equals in distribution (with respect to $\mathbb{P}^x$)

$$x \cdot 1_{\{N_t=0\}} + Z 1_{\{N_t=1\}} + X_t 1_{\{N_t \geq 2\}}$$

(that's exactly how the simple random walk is defined!). Consequently, we get

$$\begin{align*} \mathbb{E}^x f(X_t) &= f(x) \mathbb{P}^x(N_t = 0) + \mathbb{E}^x(f(Z) 1_{\{N_t=1\}}) + \mathbb{E}^x(f(X_t) 1_{\{N_t \geq 2\}}) \\ &= f(x) \mathbb{P}^x(N_t = 0) + \mathbb{E}^x(f(Z)) \mathbb{P}^x(N_t=1) + \mathbb{E}^x(f(X_t) 1_{\{N_t \geq 2\}}) \tag{3} \end{align*}$$

for all bounded measurable functions $f$. Hence,

$$\begin{align*} &\quad \frac{d}{dt} \mathbb{E}^x f(X_t) \bigg|_{t=0} \\ &= \lim_{t \to 0} \frac{\mathbb{E}^xf(X_t)-f(x)}{t} \\ &\stackrel{(3)}{=} \lim_{t \to 0} \frac{1}{t} \left[ (\mathbb{P}^x(N_t=0)-1) f(x) + \mathbb{E}^x(f(Z)) \mathbb{P}^x(N_t=1) + \mathbb{E}^x(f(X_t) 1_{\{N_t \geq 2\}}) \right] \tag{4} \end{align*}$$

We consider the three terms on the right-hand side separately. By $(1)$, we have

$$\lim_{t \to 0} \frac{1}{t} (\mathbb{P}^x(N_t=0)-1) f(x) = - f(x).$$

On the other hand, it follows from $(1)$ and $(2)$ that

$$\mathbb{E}^xf(Z) = \frac{1}{2d} \sum_{|y-x| =1} f(y)$$

and

$$\lim_{t \to 0} \frac{1}{t} \mathbb{P}^x(N_t=1) = 1.$$

Finally, for the last term we note that

$$\frac{1}{t} \left| \mathbb{E}^x \left( f(X_t) 1_{\{N_t \geq 2\}} \right) \right| \leq \|f\|_{\infty} \frac{1}{t} \mathbb{P}^x(N_t \geq 2) \xrightarrow[t \to 0]{(1)} 0.$$

Plugging this into $(4)$, we conclude

$$ \frac{d}{dt} \mathbb{E}^x f(X_t) \bigg|_{t=0} = -f(x) + \frac{1}{2d} \sum_{|y-x|=1} f(y) = \frac{1}{2d} \sum_{|y-x|=1} (f(y)-f(x)).$$
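One can also confirm this identity numerically, e.g. in $d=1$: compute $\mathbb{E}^x f(X_t)$ exactly by conditioning on $N_t \sim \mathrm{Poisson}(t)$ (given $N_t = k$, the position is $x$ plus a $k$-step $\pm 1$ walk, i.e. $x + 2\,\mathrm{Bin}(k,\tfrac12) - k$), and compare a small-$t$ difference quotient with the right-hand side (a sketch; the helper name is mine):

```python
import math

def expected_f(f, x, t, kmax=40):
    """E^x[f(X_t)] for the 1d walk: condition on N_t ~ Poisson(t);
    given N_t = k, the position is x + 2*Binomial(k, 1/2) - k."""
    total = 0.0
    for k in range(kmax + 1):
        pk = math.exp(-t) * t ** k / math.factorial(k)
        total += pk * sum(math.comb(k, j) * 0.5 ** k * f(x + 2 * j - k)
                          for j in range(k + 1))
    return total

f = math.cos          # any bounded test function
x, h = 2, 1e-5
diff_quot = (expected_f(f, x, h) - f(x)) / h        # ~ d/dt E^x f(X_t) at t=0
generator = 0.5 * (f(x + 1) + f(x - 1) - 2 * f(x))  # (1/2d) sum (f(y)-f(x)), d=1
print(abs(diff_quot - generator))                   # small, of order h
```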

Regarding the relevance and importance of the generator, see e.g. this question.

    Thank you so much, I couldn't have imagined a more thorough and clear answer. – Amarus Feb 17 '16 at 20:20
  • @Amarus You are welcome. :) – saz Feb 18 '16 at 06:04
  • I have two more questions: 1) Your answer suggests that the "proper" definition of the generator should include the transition probability $1/2d$ as a factor. Is there any particular reason why this was omitted here, or should be omitted? 2) Suppose I consider an arbitrary symmetric random walk with transition probabilities $p_{xy}=p_{yx}$. Then only the definition of $Z$ changes, and we must replace $1/2d$ by $p_{xy}$ in the sum. This suggests that the generator in this case should be defined as $\Delta f(x)=\sum_{y:\ y\sim x} p_{xy} [f(y)-f(x)]$. Am I right to assume that? – Amarus Feb 18 '16 at 14:20
  • @Amarus 1) No, as far as I can see there is no reason why it should be omitted. In my opinion, it is simply not correct if we omit the factor $\frac{1}{2d}$. 2) Yes, that's correct. – saz Feb 18 '16 at 16:35