
Let $E$ be an Abelian group and let $\mathcal{E}$ denote the $\sigma$-algebra on $E$. Let $X$ be a right-continuous process with values in $(E,\mathcal{E})$, defined on a filtered probability space $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$. Suppose that $X$ has stationary, independent increments.

I now want to show the following:

i) $X$ is a Markov process with initial distribution $P(X_0 \in \cdot)$.

ii) Let $\tau$ be a finite $(\mathcal{F}_t)_t$-stopping time. Then the process $X(\tau) = (X_{\tau + t} - X_\tau)_{t \geq 0}$ is independent of $\mathcal{F}_\tau$. It is a Markov process, adapted to the filtration $(\mathcal{F}_{\tau+t})_t$, and its distribution $P_\tau$ is the same as the distribution of $X - X_0$ under $P_0$.

Yet I don't know how to start proving i). For ii), however, we can put $Y_t = X_{\tau+t} - X_\tau$, $t \geq 0$. For $t_1<\ldots < t_n$ and functions $f_1,\ldots, f_n \in b \mathcal{E}$ (bounded measurable functions) we have \begin{align*} \mathbb{E}_\nu \left( \prod_k f_k (Y_{t_k}) \mid \mathcal{F}_\tau \right) &= \mathbb{E}_\nu \left( \prod_k f_k (X_{\tau+t_k} - X_\tau) \mid \mathcal{F}_\tau \right) \\ &= \mathbb{E}_{X_\tau} \left( \prod_k f_k (X_{t_k} - X_0)\right) \end{align*} $P_\nu$-a.s., by the strong Markov property. As a consequence, the proof is complete once we have shown that for an arbitrary $x\in E$ $$ \mathbb{E}_x \left( \prod_{k=1}^n f_k(X_{t_k} - X_0)\right) = P_{t_1}f_1 \cdots P_{t_n-t_{n-1}} f_n(0),$$ which is the characterisation of a Markov process. I think I need to use induction to prove this, yet I don't know the details.
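I believe the induction step should go something like the following (here I read $P_t$ as the transition semigroup $P_t f(y) = \mathbb{E}_0\big[f\big(y + (X_t - X_0)\big)\big]$, which agrees with $\mathbb{E}_y[f(X_t)]$ because the increment $X_t - X_0$ has the same law under every $P_x$), but I am not sure about the details:
$$\begin{align*} \mathbb{E}_x \left( \prod_{k=1}^n f_k(X_{t_k} - X_0) \right) &= \mathbb{E}_x \left( \prod_{k=1}^{n-1} f_k(X_{t_k} - X_0)\, \mathbb{E}_x\left[ f_n\big((X_{t_n} - X_{t_{n-1}}) + (X_{t_{n-1}} - X_0)\big) \,\middle|\, \mathcal{F}_{t_{n-1}} \right] \right) \\ &= \mathbb{E}_x \left( \prod_{k=1}^{n-1} f_k(X_{t_k} - X_0) \cdot P_{t_n - t_{n-1}} f_n (X_{t_{n-1}} - X_0) \right), \end{align*}$$
where the second equality uses that $X_{t_n} - X_{t_{n-1}}$ is independent of $\mathcal{F}_{t_{n-1}}$ and, by stationarity, has the same law as $X_{t_n - t_{n-1}} - X_0$. Applying the induction hypothesis to the $n-1$ functions $f_1, \ldots, f_{n-2}, f_{n-1} \cdot P_{t_n - t_{n-1}} f_n$ should then give the product formula above.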

Could anyone help me with i) and with the details of ii)? Many thanks!

1 Answer

For part (i), we don't even need right-continuity or stationary increments of $X$. Notice that if $t>s$, then for any bounded measurable function $f: E \to \mathbb{R}$, we can write $$E[f(X_t)|\mathcal{F}_s] = E[f(X_t-X_s+X_s)|\mathcal{F}_s] = E[g(X_t-X_s,X_s)|\mathcal{F}_s]$$ where $g(x,y) = f(x+y)$. Now, we know that $X_s$ is $\mathcal{F}_s$-measurable and $X_t-X_s$ is independent of $\mathcal{F}_s$. Therefore, using this question (Conditional Expectation of Functions of Random Variables satisfying certain Properties; see the comment below the question), we know that $$E[g(X_t-X_s,X_s)|\mathcal{F}_s] = E[g(X_t-X_s,X_s)|X_s] = E[f(X_t)|X_s].$$ It follows that $(X_t)_t$ is a Markov process. The initial distribution is clearly $P^{X_0}$.
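To make this explicit (just one way of writing it; the notation $\mu_{s,t}$ and $P_{s,t}$ below is shorthand I'm introducing, not notation from the question): writing $\mu_{s,t}$ for the law of $X_t - X_s$, independence gives
$$E[f(X_t)\mid \mathcal{F}_s] = \int_E f(X_s + z)\, \mu_{s,t}(dz) = P_{s,t}f(X_s), \qquad \text{where } P_{s,t}(x, A) := P(x + X_t - X_s \in A).$$
So the conditional expectation is a measurable function of $X_s$ alone, and $P_{s,t}$ is a transition kernel for $X$. If one additionally assumes stationary increments, $\mu_{s,t}$ depends only on $t-s$ and the kernels form a one-parameter semigroup $P_{t-s}$.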

For part (ii), I'll come back later, assuming someone else hasn't already answered...

shalop
  • Thanks a lot! This clarifies i) for me. –  May 15 '15 at 11:21
  • Do you have any clue how to solve the second part? –  May 18 '15 at 14:37
  • @Rodel: I'm not so sure about it, but one idea might be to approximate $\tau$ from above using simple stopping times, i.e., stopping times of the form $\tau_n:=\sum_{k=1}^{4^n} k2^{-n} \cdot 1_{\{(k-1)2^{-n} < \tau \leq k2^{-n}\}}$. You could also ask it as a separate question. There are some very knowledgeable people like saz or Did who could probably tell you the answer in a heartbeat. – shalop May 18 '15 at 17:32
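  • For what it's worth, here is a sketch of the discrete case behind this approximation idea, assuming $\sigma$ is a stopping time taking only countably many values $s_1, s_2, \ldots$ (the names $\sigma$ and $s_j$ are introduced just for this sketch). For $A \in \mathcal{F}_\sigma$, $$\begin{align*} \mathbb{E}_\nu\left[\prod_k f_k(X_{\sigma+t_k} - X_\sigma)\, ;\, A\right] &= \sum_j \mathbb{E}_\nu\left[\prod_k f_k(X_{s_j+t_k} - X_{s_j})\, ;\, A \cap \{\sigma = s_j\}\right] \\ &= \mathbb{E}_0\left[\prod_k f_k(X_{t_k} - X_0)\right] \sum_j P_\nu\left(A \cap \{\sigma = s_j\}\right) = \mathbb{E}_0\left[\prod_k f_k(X_{t_k} - X_0)\right] P_\nu(A), \end{align*}$$ using that $A \cap \{\sigma = s_j\} \in \mathcal{F}_{s_j}$ and that the increments $(X_{s_j+t_k} - X_{s_j})_k$ are independent of $\mathcal{F}_{s_j}$ with the same joint law as $(X_{t_k} - X_0)_k$ under $P_0$ (which follows from the stationary, independent increments). Taking $\sigma = \tau_n \downarrow \tau$ and using right-continuity of $X$ should then transfer the identity to $\tau$.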