For Brownian Motion, I am trying to prove that:
- For large $n$: the expected total variation diverges to infinity, while the expected quadratic variation stays finite
- For large $n$: the total variation tracks its expected value (both tend to infinity together), and the quadratic variation converges to its expected value
We start with a review of Moment Generating Functions:
The moment generating function (MGF) of a random variable $X$ centered around some constant $c$ (where $f(x)$ is the probability density function of $X$) is the expected value of $e^{t(X-c)}$:
$$M_{X-c}(t) = E[e^{t(X-c)}] = \int_{-\infty}^{\infty} e^{t(x-c)} f(x) dx$$
For the $k$-th moment of $X$ around $c$, we take the $k$-th derivative (i.e., differentiate $k$ times) of the MGF:
$$M_{X-c}^{(k)}(t) = \frac{d^k}{dt^k} M_{X-c}(t) = \frac{d^k}{dt^k} \int_{-\infty}^{\infty} e^{t(x-c)} f(x) dx$$
To get the $k$-th moment around some point $c$, we evaluate this derivative at $t=0$:
$$E[(X-c)^k] = M_{X-c}^{(k)}(0)$$
As an example, for a Normal Distribution, the MGF (centered around the point $c=0$) is given by:
$$M_{X}(t) = e^{t\mu + \frac{1}{2}t^2\sigma^2}$$
- The 4th Moment of a Normal Distribution can be calculated as:
$$M_{X}^{(4)}(t) = \frac{d^4}{dt^4} M_{X}(t) = \frac{d^4}{dt^4} e^{t\mu + \frac{1}{2}t^2\sigma^2}$$
Evaluating this at $t=0$ gives us the 4th moment:
$$E[X^4] = M_{X}^{(4)}(0)$$
$$E[X^4] = 3\sigma^4 + 6\sigma^2\mu^2 + \mu^4$$
When $\mu=0$, this becomes:
$$E[X^4] = 3\sigma^4$$
This result will be useful later on for showing that $$Var(X^2) = E[(X^2)^2] - [E(X^2)]^2 = E[X^4] - [E(X^2)]^2$$
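As a sanity check on these moment formulas, here is a small sympy sketch (assuming sympy is available; any CAS would do) that differentiates the normal MGF four times and recovers both results:

```python
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma', real=True, positive=True)

# MGF of a Normal(mu, sigma^2) random variable
M = sp.exp(t * mu + sp.Rational(1, 2) * t**2 * sigma**2)

# Fourth moment: differentiate four times and evaluate at t = 0
fourth = sp.expand(sp.diff(M, t, 4).subs(t, 0))
print(fourth)                                        # mu**4 + 6*mu**2*sigma**2 + 3*sigma**4

# Second moment, then Var(X^2) = E[X^4] - (E[X^2])^2 with mu = 0
second = sp.diff(M, t, 2).subs(t, 0)
print(sp.simplify((fourth - second**2).subs(mu, 0))) # 2*sigma**4
```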
Now, we show that the Expected Total Variation of the Brownian Motion is infinite.
First assume some Brownian Motion $B_t$ defined over the interval $[0, t]$ and partition it into $n$ equal subintervals with endpoints $t_k = \frac{kt}{n}$ (so that $\Delta t = \frac{t}{n}$). Next, define the Total Variation as:
$$f_n(W) = \sum_{k=1}^{n} \left| B_{t_k}(W) - B_{t_{k-1}}(W) \right|$$
We know from first principles that each increment $X_k = B_{t_k}(W) - B_{t_{k-1}}(W)$ is Normal with mean $0$ and variance $\Delta t$. However, each summand above is wrapped in an absolute value sign. Therefore, each summand is actually an "absolute value of a normal" random variable.
Assume a normal random variable $X$ with $\mu=0$ and $\sigma=1$ (i.e. a standard normal). Its absolute value $|X|$ is only defined over non-negative values. Therefore, we need to "fold" the negative mass of the density onto the positive part, such that the new probability density function still integrates to $1$. Conceptually, we think of scaling this new function by a factor of $2$ to respect this integration constraint:
$$f(x) = 2 \cdot \frac{1}{\sqrt{2\pi}} e^{ -\frac{x^2}{2} } = \sqrt{\frac{2}{\pi}} e^{ -\frac{x^2}{2} }$$
I thought of this trick to evaluate the integral of the above function: since the density must integrate to $1$,
$$z = \int_0^\infty e^{-x^2/2}\, dx$$ $$\int_0^\infty f(x)\, dx = \sqrt{\frac{2}{\pi}} \int_0^\infty e^{-x^2/2}\, dx = \sqrt{\frac{2}{\pi}} \cdot z = 1$$ $$z = \sqrt{\frac{\pi}{2}}$$
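A quick numerical check of this normalization (a sketch relying on scipy.integrate.quad):

```python
import numpy as np
from scipy.integrate import quad

# Integral of e^{-x^2/2} over [0, inf) should equal sqrt(pi/2)
z, _ = quad(lambda x: np.exp(-x**2 / 2), 0, np.inf)
print(z, np.sqrt(np.pi / 2))    # both ~1.25331

# The folded (half-normal) density then integrates to 1
total, _ = quad(lambda x: np.sqrt(2 / np.pi) * np.exp(-x**2 / 2), 0, np.inf)
print(total)                    # ~1.0
```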
However, I was not sure how to take its expectation directly, so I used the formulas for the half-normal distribution (https://en.wikipedia.org/wiki/Half-normal_distribution): if $X_k$ is a Brownian increment such that $X_k \sim N(0, \Delta t)$, then $|X_k|$ follows a half-normal distribution with
$$E[|X_k|] = \sqrt{\frac{2 \Delta t}{\pi}}, \qquad Var(|X_k|) = \Delta t \cdot \left(1 - \frac{2}{\pi}\right)$$
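These two moments can be checked against scipy's half-normal distribution (a sketch; note that scipy.stats.halfnorm takes the underlying normal's standard deviation as its scale, and $\Delta t = 0.01$ is an arbitrary illustrative step size):

```python
import numpy as np
from scipy.stats import halfnorm

dt = 0.01                # arbitrary illustrative step size
sigma = np.sqrt(dt)      # |X| with X ~ N(0, dt) is half-normal with scale sqrt(dt)

print(halfnorm.mean(scale=sigma), np.sqrt(2 * dt / np.pi))   # means match
print(halfnorm.var(scale=sigma), dt * (1 - 2 / np.pi))       # variances match
```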
Now, since our Total Variation $f_n$ is a sum of $n$ such half-normal terms $|X_k|$, the expected value gets multiplied accordingly (note that $\Delta t = \frac{t}{n}$):
$$E(f_n) = n \sqrt{ \frac{2 \Delta t}{\pi}} = n \sqrt{ \frac{2 \frac{t}{n}}{\pi}} = \sqrt{\frac {2nt}{\pi}} $$
Taking the limit, we can see that the expected Total Variation diverges:
$$\lim_{{n \to \infty}} E(f_n) = \lim_{{n \to \infty}} \sqrt{\frac {2nt}{\pi}} = \infty$$
We can also compute the variance of $f_n$ (the increments are independent, so their variances add):
$$Var(f_n) = n\cdot \Delta t \cdot \left(1 - \frac{2}{\pi}\right) = n\cdot \frac{t}{n} \cdot \left(1 - \frac{2}{\pi}\right) = t \cdot \left(1 - \frac{2}{\pi}\right) $$
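Both formulas can be verified by simulation. Here is a Monte Carlo sketch (the horizon $t=1$, the seed, and the path count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_paths = 1.0, 10_000     # horizon and number of simulated paths (illustrative)

for n in (10, 100, 1000):
    dt = t / n
    # Brownian increments: i.i.d. N(0, dt) across steps and paths
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    f_n = np.abs(increments).sum(axis=1)       # total variation of each path
    print(n,
          f_n.mean(), np.sqrt(2 * n * t / np.pi),   # E(f_n) vs formula
          f_n.var(), t * (1 - 2 / np.pi))           # Var(f_n) vs formula
```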
Next, we use the Chebyshev Inequality to relate the Total Variation of the Brownian Motion to its expected value:
$$P( |f_n - E(f_n)| > \epsilon) \ \leq \frac{Var(f_n)}{\epsilon^2}$$
Note that the bound on the right-hand side, $\frac{t(1 - 2/\pi)}{\epsilon^2}$, does not depend on $n$. Chebyshev therefore does not show that $f_n$ converges to $E(f_n)$; what it does show is that the deviation $f_n - E(f_n)$ stays bounded in probability (i.e., $f_n$ and $E(f_n)$ grow together). Combining this with the result above that $E(f_n) \to \infty$, we conclude that $f_n$ itself tends to infinity in probability for large $n$: the Total Variation of the Brownian Motion is infinite.
Finally, we need to derive the formula for the Quadratic Variation of the Brownian Motion. This is done by squaring each increment inside the sum (note that this is not the same as squaring $f_n$ itself, since $[f_n]^2$ would also contain cross terms):
$$g_n(W) = \sum_{k=1}^{n} \left[ B_{t_k}(W) - B_{t_{k-1}}(W) \right]^2 = \sum_{k=1}^{n} X_k^2$$
From first principles, we know that:
$$E(X_k) = 0$$ $$Var(X_k) = E(X_k^2) - [E(X_k)]^2 = E(X_k^2) - 0 = E(X_k^2) = \Delta t$$
Now, we can take the expectation of $g_n(W)$. By linearity, we multiply $E(X_k^2)$ by $n$; unlike the expectation of the total variation, the expectation of the quadratic variation stays bounded (i.e. finite):
$$E(g_n) = n \cdot E(X_k^2) = n \Delta t = n \frac{t}{n} = t$$
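As a quick simulation sketch (same illustrative setup as before), the sample mean of $g_n$ stays close to $t$ for every $n$:

```python
import numpy as np

rng = np.random.default_rng(1)
t, n_paths = 1.0, 10_000     # illustrative choices

for n in (10, 100, 1000):
    # i.i.d. increments X_k ~ N(0, t/n); g_n is the sum of their squares
    increments = rng.normal(0.0, np.sqrt(t / n), size=(n_paths, n))
    g_n = (increments**2).sum(axis=1)
    print(n, g_n.mean())     # ~ t = 1.0 regardless of n
```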
Now, we compute the variance of $g_n$. This is where we use the result from the Moment Generating Function:
$$E(X_k^4) = 3 (\sigma^2)^2 = 3 [Var(X_k)]^2 = 3 (\Delta t)^2$$ $$Var(X_k^2) = E(X_k^4) - [E(X_k^2)]^2 = E(X_k^4) - (\Delta t)^2$$ $$Var(X_k^2) = 3 (\Delta t)^2 - (\Delta t)^2 = 2 (\Delta t)^2 = 2 \left(\frac{t}{n}\right)^2$$
Since the increments are independent, the variance of the sum is the sum of the variances:
$$Var(g_n) = \sum_{k=1}^{n} Var(X_k^2) = n \cdot Var(X_k^2) = n \cdot \frac{2t^2}{n^2} = \frac{2t^2}{n}$$
$$\lim_{{n \to \infty}} Var(g_n) = \lim_{{n \to \infty}} \frac{{2t^2}}{n} = 0$$
Finally, using Chebyshev's Inequality, we can see that:
$$P( |g_n - E(g_n)| > \epsilon) \ \leq \frac{Var(g_n)}{\epsilon^2}$$
This time the bound does shrink with $n$: since $Var(g_n) \to 0$, the right-hand side vanishes for every $\epsilon > 0$, so $g_n$ converges to $E(g_n) = t$ in probability for large $n$.
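To see this convergence numerically, here is a final simulation sketch ($\epsilon = 0.05$ is an arbitrary tolerance, and the other parameters are again illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
t, n_paths, eps = 1.0, 2_000, 0.05   # illustrative choices

for n in (10, 100, 1000, 10_000):
    increments = rng.normal(0.0, np.sqrt(t / n), size=(n_paths, n))
    g_n = (increments**2).sum(axis=1)
    print(n,
          g_n.var(), 2 * t**2 / n,          # Var(g_n) vs 2t^2/n
          np.mean(np.abs(g_n - t) > eps))   # empirical P(|g_n - t| > eps) -> 0
```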
Conclusion: Have I done all of this correctly?