
Let $W$ be a Brownian motion with filtration $(F_t)$. Let $\tau$ be a stopping time.

It is well known, by the strong Markov property, that the law of $W_{\tau+t}-W_\tau$ given $F_\tau$ is normal with mean zero and variance $t$.

I am interested in a very minor extension of this result and I would like to prove that the law of $W_{\tau\vee t}-W_\tau$ given $F_\tau$ is normal with mean zero and variance $(\tau\vee t)-\tau$. How can I do it?

Zoltán
  • Welcome to MSE. What were your attempts at proving it? And why are you interested in this question? – James Jan 23 '19 at 15:19
  • @James I tried to consider a very simple case, when $\tau$ takes only two values, but couldn't prove it. The question is natural since I discovered that in general $W_{\tau}-W_{\sigma}$ can have any distribution if $\tau$ and $\sigma$ are stopping times (it does not have to be normal given $F_\sigma$). I wonder what can we say in my case. – Zoltán Jan 23 '19 at 15:31

1 Answer


If $\tau$ is a (finite) stopping time, then the process

$$B_s := W_{s+\tau}-W_{\tau}, \qquad s \geq 0$$

is a Brownian motion which is independent of $\mathcal{F}_{\tau}$. As

$$W_{\max\{t,\tau\}}-W_{\tau} = B_{\max\{0,t-\tau\}},$$

we have

$$\mathbb{E}\big(f(W_{\max\{t,\tau\}}-W_{\tau}) \mid \mathcal{F}_{\tau} \big) = \mathbb{E} \big( f(B_{\max\{0,t-\tau\}}) \mid \mathcal{F}_{\tau} \big)$$

for any bounded measurable function $f$. Since $\max\{0,t-\tau\}$ is $\mathcal{F}_{\tau}$-measurable and $(B_s)_{s \geq 0}$ is independent of $\mathcal{F}_{\tau}$, it follows (see below for details) that

$$\mathbb{E}\big(f(W_{\max\{t,\tau\}}-W_{\tau}) \mid \mathcal{F}_{\tau} \big) = g(\max\{0,t-\tau\}) \tag{1}$$ where $$g(s) := \mathbb{E}f(B_s).$$

Using this identity for $f(x) := \exp(ix \xi)$ with $\xi \in \mathbb{R}$ fixed, we get

\begin{align*} \mathbb{E} \left( \exp \left[i \xi (W_{\max\{t,\tau\}}-W_{\tau}) \right] \mid \mathcal{F}_{\tau} \right) &= \exp \left(- \frac{\max\{0,t-\tau\}}{2} \xi^2 \right) \\ &= \exp \left(- \frac{\max\{t,\tau\}-\tau}{2} \xi^2 \right) \end{align*}

which proves the assertion.
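The conclusion can be sanity-checked by simulation. The following is a minimal Monte Carlo sketch (not part of the original answer): I take $\tau$ to be the first exit time of $(-1,1)$ and $t = 0.5$; the grid step, horizon, and tolerances are arbitrary choices. On the simulation grid, given the path up to $\tau$, the increment $W_{\max\{t,\tau\}}-W_{\tau}$ should have mean $0$, variance $\max\{0,t-\tau\}$, and, after normalizing on $\{\tau < t\}$, a standard normal law.

```python
import numpy as np

# Monte Carlo sanity check of the conditional law of W_{t v tau} - W_tau.
# tau = first exit time of (-1, 1); grid step and horizon are simulation choices.
rng = np.random.default_rng(0)
dt, t = 2e-3, 0.5
n_paths, n_steps = 4000, 2500                  # horizon 5.0

W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
times = dt * np.arange(1, n_steps + 1)

exited = np.abs(W) >= 1.0
first = exited.argmax(axis=1)                  # first grid index with |W| >= 1
first[~exited.any(axis=1)] = n_steps - 1       # cap the (rare) non-exited paths
tau = times[first]

rows = np.arange(n_paths)
idx_t = int(round(t / dt)) - 1                 # grid index of time t
inc = W[rows, np.maximum(first, idx_t)] - W[rows, first]   # W_{t v tau} - W_tau

var_target = np.maximum(0.0, t - tau)          # claimed conditional variance
assert abs(inc.mean()) < 0.05                  # conditional mean zero
assert abs((inc**2).mean() - var_target.mean()) < 0.05  # second moments match

# conditional normality: on {tau < t} the normalized increment is ~ N(0,1)
mask = var_target > 0.05
Z = inc[mask] / np.sqrt(var_target[mask])
assert abs(Z.mean()) < 0.1 and abs(Z.var() - 1.0) < 0.15
```

Note that on the grid the check is exact in distribution: `first` is a stopping time for the Gaussian random walk, so the post-`tau` increments are i.i.d. normal independent of the past, which is precisely the mechanism the proof uses.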


To prove $(1)$ rigorously, we can use the following property of conditional expectation (which you can find, in this particular formulation, in the book on Brownian motion by Schilling & Partzsch, Lemma A.3).

Proposition: Let $X: (\Omega,\mathcal{A}) \to (D,\mathcal{D})$ be a random variable. Assume that $\mathcal{X}$, $\mathcal{Y}$ are independent $\sigma$-algebras such that $X$ is $\mathcal{X}/\mathcal{D}$-measurable. If $\Psi: D \times \Omega \to \mathbb{R}$ is bounded and $\mathcal{D} \otimes \mathcal{Y}/\mathcal{B}(\mathbb{R})$-measurable, then $$\mathbb{E}(\Psi(X(\cdot),\cdot) \mid \mathcal{X}) = g(X)$$ for $$g(x) := \mathbb{E}(\Psi(x,\cdot)).$$

To prove $(1)$ we choose the objects as follows:

  • $D:= [0,\infty)$ endowed with the Borel-$\sigma$-algebra (restricted to $[0,\infty)$)
  • $\mathcal{X} := \mathcal{F}_{\tau}$
  • $\mathcal{Y} := \sigma(B_s, s \geq 0)$,
  • $X := \max\{0,t-\tau\}$
  • $\Psi(x,\omega) = f(B_x(\omega))$ for $x \in D=[0,\infty)$

Let's check the assumptions of the proposition: as noted above, the Brownian motion $(B_s)_{s \geq 0}$ is independent of $\mathcal{F}_{\tau}$, i.e. $\mathcal{X}$ and $\mathcal{Y}$ are independent. Moreover, $\tau$ is $\mathcal{F}_{\tau}$-measurable (i.e. $\mathcal{X}$-measurable), and therefore $X$ is $\mathcal{X}$-measurable. Finally, the progressive measurability of $(B_s)_{s \geq 0}$ implies that $\Psi$ is $\mathcal{D} \otimes \mathcal{Y}/\mathcal{B}(\mathbb{R})$-measurable.

Since we have verified all assumptions, we may apply the above proposition and this gives exactly $(1)$.
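The mechanism behind the proposition ("freeze the $\mathcal{X}$-measurable variable, average over the independent part") can be seen in a toy experiment. This is a hedged sketch, not from the original answer: take $X$ uniform on $\{1,2\}$ in the role of the $\mathcal{F}_\tau$-measurable variable, $Y \sim N(0,1)$ independent of $X$ in the role of the Brownian motion, and $\Psi(x,\omega) = \cos(x\,Y(\omega))$, so that $g(x) = \mathbb{E}\cos(xY) = e^{-x^2/2}$.

```python
import numpy as np

# Toy check of the proposition: conditioning on X amounts to freezing X
# and averaging over the independent Y, i.e. E(Psi(X,.) | X = x) = g(x)
# with g(x) = E cos(xY) = exp(-x^2/2) for Y ~ N(0,1).
rng = np.random.default_rng(1)
n = 200_000
X = rng.integers(1, 3, size=n).astype(float)   # X uniform on {1, 2}
Y = rng.normal(size=n)                         # independent of X

psi = np.cos(X * Y)                            # Psi(X(.), .)
for x in (1.0, 2.0):
    empirical = psi[X == x].mean()             # E(Psi(X, .) | X = x)
    assert abs(empirical - np.exp(-x**2 / 2)) < 0.01
```

The same computation with $Y$ replaced by a variable correlated with $X$ would break the identity, which is why the independence assumption in the proposition is essential.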


Remark: The above reasoning shows, more generally, that $W_{\sigma}-W_{\tau}$ given $\mathcal{F}_{\tau}$ is Gaussian with mean $0$ and variance $\sigma-\tau$ for any stopping time $\sigma$ which is $\mathcal{F}_{\tau}$-measurable and satisfies $\sigma \geq \tau$.

saz
  • Wow! Thank you so much for this very detailed proof, I will now use it as an example for my future studies! Amazing! By myself I managed to get equation (1), but didn't know how to proceed and prove it formally. Once again, thank you very very much! – Zoltán Jan 23 '19 at 17:09
  • Also do you maybe know what is going on if we don't have independence? Is it true that $E(\psi(X(⋅),⋅)∣\mathcal{X})=g(X)$, where $g(x)=E(\psi(x,⋅)∣\mathcal{X})$? In which book can I find these type of statements? – Zoltán Jan 23 '19 at 17:40
  • @Zoltán You are welcome. And no, without independence we cannot expect that a statement of this form holds. – saz Jan 23 '19 at 18:15
  • Do you maybe know a simple counterexample for this? Let's say I would like $E(f(X,Y)|Z)$ to be different from $g(X)$, where $g(x):=E (f(x,Y)|Z)$, and $X$ is $Z$-measurable (but no independence assumption on $Y$ and $Z$). Is there any simple situation where this equality breaks? – Zoltán Jan 24 '19 at 12:10
  • @Zoltán Take a look at this example – saz Jan 24 '19 at 14:40