12

Given a continuous function $f:[0,1]\rightarrow\mathbb{R}$ with $f(0)=0$, how can one show that $P(\sup_{0\leq t\leq1}\left|B_{t}-f(t)\right|<\varepsilon)>0$ for every $\varepsilon>0$, where $P$ is the probability measure under which $(B_{t})_{t\geq0}$ is a standard Brownian motion?

Any help would be much appreciated.

  • Hello and welcome to Math.SX! Your question is interesting, but it lacks some background (we need to know what you know in order to answer it). You should also show your own efforts: what you tried and why it didn't work. Please have a look at http://math.stackexchange.com/help/how-to-ask – Tom-Tom Dec 12 '14 at 09:43
  • By the way, this beautiful result by Paul Lévy is sometimes called the Forgery Theorem. – binkyhorse Dec 12 '14 at 10:28
  • Do you assume that $f(0) = 0$ or shall it hold for at least some $\varepsilon$? Otherwise that does not seem to be true. – SBF Dec 12 '14 at 10:34
  • Yes, f(0)=0 or BM started at f(0). Many thanks for the comments! – ProbStudent Dec 12 '14 at 10:44
  • @binkyhorse Where did you find the term Forgery Theorem? I can't find it under this name. – Falrach Aug 02 '19 at 17:34

4 Answers

11

Let $f: [0,1] \to \mathbb{R}$ be a continuous function. Since $[0,1]$ is compact, $f$ is uniformly continuous on $[0,1]$, i.e. we can choose $n \in \mathbb{N}$ such that

$$|f(s)-f(t)| < \frac{\varepsilon}{2} \quad \text{for all $|s-t| \leq \frac{1}{n}$.}$$

If we set $t_j := j/n$ for $j=0,\ldots,n$, then

$$\begin{align*} \mathbb{P} \left( \sup_{0 \leq t \leq 1} |B_t-f(t)| <\varepsilon\right) &\geq \mathbb{P}\left( \forall j=0,\ldots,n-1: \sup_{t \in [t_j,t_{j+1}]} |B_t-f(t_j)| < \frac{\varepsilon}{2 (n-j)} \right) \\ &= \mathbb{E}\left( \prod_{j=0}^{n-1} 1_{A_j} \right) \end{align*}$$

for

$$A_j := \left\{\sup_{t \in [t_j,t_{j+1}]} |B_t-f(t_j)| < \frac{\varepsilon}{2(n-j)} \right\} \in \mathcal{F}_{t_{j+1}}.$$

It follows from the Markov property (of Brownian motion) and tower property (of conditional expectation) that

$$\begin{align*} \mathbb{E}\left( \prod_{j=0}^{n-1} 1_{A_j} \right) &= \mathbb{E} \left[ \left(\prod_{j=0}^{n-2} 1_{A_j} \right) \mathbb{P}^{B_{t_{n-1}}} \left(\sup_{t \in [0,1/n]} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right) \right]. \end{align*}$$

It suffices to show that

$$\mathbb{P}^x \left( \sup_{t \in [0,1/n]} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right)>c>0 \tag{1}$$

for all $x \in B(f(t_{n-1}),\varepsilon/4)$. (Then we can iterate the procedure and obtain the desired lower bound.) To this end, we note that

$$\begin{align*} \mathbb{P}^x \left( \sup_{t \leq 1/n} |B_t-f(t_{n-1})| < \frac{\varepsilon}{2} \right) &= \mathbb{P} \left( \sup_{t \leq 1/n} |B_t+x-f(t_{n-1})| < \frac{\varepsilon}{2} \right) \\ &\geq \mathbb{P} \left( \sup_{t \leq 1/n} |B_t| < \frac{\varepsilon}{4} \right) \end{align*}$$

for all $x \in B(f(t_{n-1}),\varepsilon/4)$. As $M_{1/n} := \sup_{t \leq 1/n} B_t \sim |B_{1/n}|$ (by the reflection principle), $(1)$ follows.
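The distributional identity $\sup_{t \leq T} B_t \sim |B_T|$ invoked in the last step can be checked empirically. A minimal Monte Carlo sketch in Python/numpy (which this thread does not otherwise use); the grid size and sample count are arbitrary choices, and the discrete-time maximum slightly underestimates the continuous supremum:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 10000, 1000
dt = 1.0 / n_steps

# Simulate Brownian paths on [0, 1] via cumulative sums of Gaussian increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

# Reflection principle: M_1 = sup_{t <= 1} B_t has the same law as |B_1|.
running_max = np.maximum(paths.max(axis=1), 0.0)  # the sup includes B_0 = 0
abs_endpoint = np.abs(paths[:, -1])

# Both sample means should be near E|B_1| = sqrt(2/pi) ~ 0.798.
print(running_max.mean(), abs_endpoint.mean())
```

Comparing further quantiles of the two samples gives a stronger check than the means alone.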


Remark: The asymptotic behaviour of the probability $\mathbb{P} \left( \sup_{t \in [0,1]} |B_t-f(t)| < \varepsilon \right)$ as $\varepsilon \to 0$ is the subject of so-called small-ball estimates.
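The positivity of the tube probability, and its rapid decay as $\varepsilon \to 0$, can also be seen by crude simulation. A hedged sketch in Python/numpy; the target $f(t) = t/2$, the grid, and the sample sizes are arbitrary illustrative choices, and the discrete grid slightly overstates the continuous-time probability:

```python
import numpy as np

rng = np.random.default_rng(1)

def tube_probability(f, eps, n_paths=50000, n_steps=200):
    """Monte Carlo estimate of P(sup_{0 <= t <= 1} |B_t - f(t)| < eps)."""
    t = np.linspace(0.0, 1.0, n_steps + 1)
    dt = 1.0 / n_steps
    inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(inc, axis=1)], axis=1)
    return np.mean(np.max(np.abs(B - f(t)), axis=1) < eps)

f = lambda t: 0.5 * t  # an arbitrary continuous target with f(0) = 0
probs = {eps: tube_probability(f, eps) for eps in (1.0, 0.75, 0.5)}
print(probs)  # positive for each eps, but shrinking fast as eps decreases
```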

saz
  • @AreaMan I have rewritten the proof; I hope it is clearer now. – saz May 05 '17 at 07:43
  • Thanks! I think it's clear now. Will take another look in the morning with fresh eyes. – Elle Najt May 05 '17 at 07:52
  • (a year later) I'm still skeptical. In particular, I tried to work out your iteration. Let's take $f = 0$ for simplicity (it's anyway the crucial case, because Girsanov's theorem gives us the rest). Then the last probability we want to compute in the iteration is $P(\sup_{t \leq 1/n} |B_t| < \epsilon/2n)$. I tried to bound this with Doob's inequality, but I got $2n E |B_{1/n}|/\epsilon = (2\sqrt{n}/\epsilon) E|B_1|$, which doesn't work for small $\epsilon$. You mention that you use the reflection principle. Can you elaborate? – Elle Najt May 04 '18 at 04:13
  • I started another question here which is about this particular case: https://math.stackexchange.com/questions/2765734/how-to-show-that-p-sup-0-leq-s-leq-1-b-s-leq-epsilon-0-for-any – Elle Najt May 04 '18 at 04:23
  • @AreaMan Note that the distributions of $M_t := \sup_{s \leq t} B_s$ and $m_t := \inf_{s \leq t} B_s$ are known; in fact $M_t \sim |B_t|$ and $m_t \sim -|B_t|$. (This is a direct consequence of the reflection principle, which you can find for instance in Schilling's book on Brownian motion.) Using this it shouldn't be much of a problem to find the proper bound; note that $n \in \mathbb{N}$ and $\epsilon>0$ are fixed numbers! – saz May 04 '18 at 13:54
  • According to the proof, I can see that $\mathbb{P}^{B_{t_{n-1}}}(\sup_{t \in [t_{n-1},1]}|B_t-f(t_{n-1})|<\frac \epsilon 2)>1_{B(f(t_{n-1}),\epsilon/4)}(B_{t_{n-1}})c$, but how should I iterate the procedure? Also, can the sets $A_1$ and $A_2$ be disjoint? The values of $B_{t_2}$ allowed by these two sets may be disjoint. – allen i Jun 28 '21 at 16:45
  • I don't think that this proof works as it is written. In the induction you don't want to approximate the constant function $f(t_j)$ but the piecewise linear function $s\mapsto sf(t_{j+1})+(1-s)f(t_j)$ (in fact, you can just assume that $f$ itself is piecewise linear). More specifically, you must define $A_j:=\{\sup_{[0,1/n]}|B_{t_j+s}-(sf(t_{j+1})+(1-s)f(t_j))|<\epsilon/(2(n-j))\}$ and then show that, given a line $s\mapsto ms+q$ with $|q|<\epsilon/(2(n-j+1))$, we have $\mathbb{P}(\sup_{[0,1/n]}|B_s-(ms+q)|<\epsilon/(2(n-j)))>0$. This can be done for example using Girsanov and that you already (continued) – No-one Oct 27 '22 at 21:03
  • (continued) know that for all $\sigma>0$ you have $\mathbb{P}(\sup_{[0,1/n]}|B_s|<\sigma)>0$ (as shown here https://math.stackexchange.com/questions/2765734/how-to-show-that-p-sup-0-leq-s-leq-1-b-s-leq-epsilon-0-for-any). – No-one Oct 27 '22 at 21:11
  • Why can we assume that $B_{t_{n-1}} \in B(f(t_{n-1}), \frac{\varepsilon}{4})$? – mathematico Jan 25 '24 at 15:35
3

Here is a proof (this is a combination of an exercise from Steele's stochastic calculus, and a lemma which I learned from Freedman's 'Brownian motion and diffusion processes'.)

It is the case that $P( \sup_{0 \leq s \leq 1} |B_s| \leq \epsilon) > 0$ for any $\epsilon > 0$. I've written a proof here: How to show that $P( \sup_{0 \leq s \leq 1} |B_s| \leq \epsilon) > 0$ for any $\epsilon > 0$?

We assume that we can write $f(t) = \int_0^t h(s)\, ds$ for some $h \in L^2[0,1]$. (This is not a huge restriction: such $f$ are uniformly dense in the continuous functions vanishing at $0$, so the general case follows by approximating $f$ within $\epsilon/2$ and working with the tube of width $\epsilon/2$.)

We define a stochastic process by $Z_t = B_t - \int_0^t h(s) ds$.

Call $(\Omega, F, P)$ the probability space underlying the process $(B_t)_{0 \leq t \leq 1}$.

Then Girsanov's theorem (Steele, Theorem 13.2) tells us that there is a continuous martingale $M_t > 0$ such that, under the measure $Q$ on $\Omega$ defined by $Q(A) = E_P[1_A M_1]$, the process $Z_t$ is a Brownian motion. That is, the distribution on the path space $C[0,1]$ induced by pushing $Q$ forward through $Z$ is standard Wiener measure.

Let $A = \{ \sup_{ 0 \leq t \leq 1} | Z_t | \leq \epsilon \}$. We wish to show that $P(A) > 0$.

From the linked argument, we know that $Q(A) > 0$, because under $Q$ the process $Z_t$ is a Brownian motion. Since $Q(A) = E_P[1_A M_1]$ and $M_1 > 0$, $P(A) = 0$ would force $Q(A) = 0$; hence $P(A) > 0$.

Since $A = \{ \sup_{ 0 \leq t \leq 1} | Z_t | \leq \epsilon \} = \{ \sup_{ 0 \leq t \leq 1} | B_t - f(t) | \leq \epsilon \}$, it follows that $P( \sup_{ 0 \leq t \leq 1} | B_t - f(t) | \leq \epsilon) > 0$ for any $\epsilon$.
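As a sanity check of the change-of-measure step, one can compare, in the simplest case $f(t)=t$ (so $h \equiv 1$ and $M_1 = \exp(B_1 - 1/2)$), a direct Monte Carlo estimate of $P(\sup_t |B_t - t| \leq \epsilon)$ with the Girsanov-reweighted estimate $E[1_{\{\sup_t |Z_t| \leq \epsilon\}} e^{-Z_1 - 1/2}]$, where $Z$ is sampled as a standard Brownian motion. A sketch in Python/numpy with arbitrary grid parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, eps = 50000, 200, 1.0
dt = 1.0 / n_steps
t = np.linspace(0.0, 1.0, n_steps + 1)

def bm_paths():
    """Sample standard Brownian paths on [0, 1] on a uniform grid."""
    inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    return np.concatenate([np.zeros((n_paths, 1)), np.cumsum(inc, axis=1)], axis=1)

# Direct estimate of P(sup_t |B_t - t| <= eps) under P, for f(t) = t (h = 1).
B = bm_paths()
direct = np.mean(np.max(np.abs(B - t), axis=1) <= eps)

# Girsanov: under Q the process Z = B - t is Brownian, and dP/dQ = exp(-Z_1 - 1/2),
# so P(A) = E[ 1_{sup_t |Z_t| <= eps} * exp(-Z_1 - 1/2) ] with Z a standard BM.
Z = bm_paths()
in_tube = np.max(np.abs(Z), axis=1) <= eps
reweighted = np.mean(in_tube * np.exp(-Z[:, -1] - 0.5))

print(direct, reweighted)  # the two estimates should agree up to Monte Carlo error
```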

We can actually get something better than what you asked for: as long as we have a good enough microscope, we can find $f$ up to $1/n$ accuracy, for any $n$, inside almost every Brownian motion path...

  1. Define $X_t^{(a,b)} = (b - a)^{-1/2} ( B_{a + t(b - a)} - B_a )$ for $0 \leq t \leq 1$. This is a Brownian motion on $[0,1]$, because of the Markov property and scaling.

  2. Let $a_k = 2^{-k - 1}$ and $b_k = 2^{-k}$, for $k = 0,1,2,3 \ldots$. Then the processes $X_t^{(a_k,b_k)}$ are independent. (This follows from the independent increments property of Brownian motion.)

  3. Moreover, because of the work above, $P ( \sup_{0 \leq t \leq 1} |X_t^{(a_k,b_k)} - g(t)| \leq \epsilon) = P( \sup_{0 \leq t \leq 1} |B_t - g(t) | \leq \epsilon) > 0$, for any $\epsilon$.

  4. Let $A_k = \{ \sup_{0 \leq t \leq 1} |X_t^{(a_k,b_k)} - g(t)| \leq \epsilon \}$. These events are independent, and from 3. they all have the same positive probability. Hence, $\sum_k P(A_k) = \infty$. From the (second) Borel-Cantelli lemma, we know that almost surely infinitely many of the $A_k$ occur. In particular, one of them occurs. (But maybe one would like to observe that they occur arbitrarily close to time zero, so any moment of time will be enough to observe something which looks like our function.)

  5. Setting $\epsilon = 1/n$ and intersecting the resulting almost-sure events over all $n$ gives what I claimed.
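The dyadic-window construction in steps 1-4 can be illustrated by scanning a single simulated path. A sketch in Python/numpy; the target $g(t) = t/2$, the tolerance $\epsilon = 1$, the number of windows, and the grid are arbitrary choices, and the discrete grid only approximates the continuous-time events:

```python
import numpy as np

rng = np.random.default_rng(3)

g = lambda t: 0.5 * t   # an arbitrary continuous target with g(0) = 0
eps = 1.0
n_steps = 2 ** 13       # uniform grid on [0, 1]
dt = 1.0 / n_steps
K = 6                   # inspect the windows [2^-(k+1), 2^-k], k = 0, ..., K-1

def window_hits(B):
    """For one path, check which rescaled dyadic windows eps-track g."""
    hits = []
    for k in range(K):
        a, b = 2.0 ** -(k + 1), 2.0 ** -k
        i0, i1 = int(round(a / dt)), int(round(b / dt))
        s = np.linspace(0.0, 1.0, i1 - i0 + 1)        # rescaled time in [0, 1]
        X = (B[i0:i1 + 1] - B[i0]) / np.sqrt(b - a)   # Brownian rescaling of the window
        hits.append(np.max(np.abs(X - g(s))) <= eps)
    return hits

def one_path():
    return np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])

# Fraction of windows (over many paths) that eps-track g: positive, as the
# Borel-Cantelli argument above requires.
rate = np.mean([window_hits(one_path()) for _ in range(300)])
print(rate)
```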

Truly mind boggling!

Elle Najt
1

Sorry, I can't leave this as a comment. As a first idea, you can approximate the continuous function $f$ by a piecewise linear continuous function (discretise the interval $[0,1]$). Then, on each subinterval $[t_{k-1}, t_k]$, you can use a Brownian bridge to compute the probability and show it is positive.

user3371583
  • I don't understand how you would use a Brownian bridge to make this computation. Can you expand on this? – Elle Najt May 05 '17 at 06:00
1

Too long for a comment.

I don't think that saz's proof works as it is written. In the induction you don't want to approximate the constant function $f(t_j)$ but the piecewise linear function $s\mapsto sf(t_{j+1})+(1-s)f(t_j)$ (in fact, you can just assume that $f$ itself is piecewise linear). More specifically, you must define $A_j:=\{\sup_{[0,1/n]}|B_{t_j+s}-(sf(t_{j+1})+(1-s)f(t_j))|<\epsilon/(2(n-j))\}$ and then show that, given a line $s\mapsto ms+q$ with $|q|<\epsilon/(2(n-j+1))$, we have $\mathbb{P}(\sup_{[0,1/n]}|B_s-(ms+q)|<\epsilon/(2(n-j)))>0$. This can be done for example using Girsanov and the fact that for all $\sigma>0$ we have $\mathbb{P}(\sup_{[0,1/n]}|B_s|<\sigma)>0$ (as shown here: How to show that $P( \sup_{0 \leq s \leq 1} |B_s| \leq \epsilon) > 0$ for any $\epsilon > 0$?). In this case $\sigma=\epsilon/(2(n-j))-\epsilon/(2(n-j+1))$.

No-one