
This seems counter-intuitive to me, since variance is a difference of expectations and, as far as I know, an unconditional expectation is a real number.

Apparently, $X_t$ where $dX_t = Y_t dW_t$, with $Y_t$ a Brownian motion independent of $W_t$, has random variance.

Solving the SDE gives $X_t = X_0 + \int_0^t Y_s dW_s$

Computing the first and second moments gives:

$E[X_t] = E[X_0] + E[\int_0^t Y_s dW_s]$

$E[X_t^2] = E[X_0^2] + 2E[X_0 \int_0^t Y_s dW_s] + E[\int_0^t Y_s^2 ds]$ by the Itō isometry

$ = E[X_0^2] + 2E[X_0 \int_0^t Y_s dW_s] + \int_0^t E[Y_s^2] ds$ by Tonelli's theorem
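For what it's worth, a quick Monte Carlo check (a sketch only, assuming the concrete case $X_0 = 0$ and $Y$ a standard Brownian motion started at $0$, so that $E[X_t] = 0$ and $E[X_t^2] = \int_0^t s \, ds = t^2/2$) produces ordinary real numbers for both moments:

```python
import numpy as np

# Monte Carlo sanity check of the moment formulas above. Hypothetical
# concrete setup: X_0 = 0 and Y a standard Brownian motion started at 0,
# so E[X_t] = 0 and E[X_t^2] = int_0^t E[Y_s^2] ds = int_0^t s ds = t^2/2.
rng = np.random.default_rng(0)
n_paths, n_steps, t = 100_000, 100, 1.0
dt = t / n_steps

X = np.zeros(n_paths)
Y = np.zeros(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)  # increments of W
    dB = rng.normal(0.0, np.sqrt(dt), n_paths)  # independent increments driving Y
    X += Y * dW  # Euler-Maruyama step for dX_t = Y_t dW_t
    Y += dB

print(X.mean())       # ≈ E[X_1] = 0
print((X**2).mean())  # ≈ 1^2/2 = 0.5
```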

I'm guessing that $E[\int_0^t Y_s dW_s]$ or $E[X_0 \int_0^t Y_s dW_s]$ is random? Why? It seems that $\int_0^t Y_s dW_s$ is a random variable with a unique mean given by $E[\int_0^t Y_s dW_s]$.

What am I not getting here?

BCLC
  • The expectations listed above cannot be random. Conditional expectations could be random. Your description (as is) does not contain conditional expectations. – zoli Sep 14 '15 at 13:36
  • @zoli Is muaddib wrong? – BCLC Sep 14 '15 at 13:41
  • Yes, there is no such thing as a random variance or expectation (at least in this context). A variance of a random variable is always a constant. – saz Sep 14 '15 at 14:16
  • @saz How did muaddib have a random variable with random variance? I understand in the context of Bayesian inference or statistics, random variables can have random mean or variance, but in math? I can't imagine such – BCLC Sep 14 '15 at 14:23
  • @BCLC Ask him/her. The only possible explanation I can think of is the following: Suppose the underlying probability space $(\Omega,\mathcal{A},\mathbb{P})$ is a product space, i.e. $\Omega = \Omega_1 \times \Omega_2$ and $\mathbb{P} = \mathbb{P}_1 \otimes \mathbb{P}_2$. Then, if $X$ is a random variable on $\Omega$, one could call $$\int X(\omega_1,\omega_2) \, d\mathbb{P}_1(\omega_1) = \mathbb{E}_1(X(\cdot,\omega_2))$$ a random expectation. Basically, the point here is that $X$ depends on multiple parameters and we only take the expectation with respect to one parameter, not both of them. – saz Sep 14 '15 at 14:35
  • (However, I've never seen anyone use this kind of terminology in probability theory.) – saz Sep 14 '15 at 14:37
  • @BCLC - I spent some time trying to figure out the root of the confusion, and it comes down to me introducing the idea of random variance. It is a non-standard term tossed around sometimes in Math Finance and doesn't have a proper definition. – muaddib Sep 14 '15 at 21:17
  • (But what saz suggested it could be defined as is pretty on target.) – muaddib Sep 14 '15 at 21:28
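A small numerical sketch of the product-space idea from saz's comment (with the hypothetical concrete choice $X(\omega_1,\omega_2)=\omega_1\omega_2$ and $\omega_1 \sim \mathrm{Uniform}(0,1)$): integrating out $\omega_1$ alone gives $\mathbb{E}_1(X(\cdot,\omega_2)) = \omega_2/2$, which is still a function of $\omega_2$, i.e. itself a random variable.

```python
import numpy as np

# Integrating out only the first coordinate of X(w1, w2) = w1 * w2
# (hypothetical choice) leaves a "random expectation" E_1[X(., w2)] = w2/2.
rng = np.random.default_rng(0)

for w2 in rng.uniform(0.0, 1.0, 3):        # a few draws of the second coordinate
    w1 = rng.uniform(0.0, 1.0, 200_000)    # integrate out the first coordinate
    partial = (w1 * w2).mean()             # Monte Carlo estimate of E_1[X(., w2)]
    print(f"w2 = {w2:.3f}: partial expectation ≈ {partial:.3f}, w2/2 = {w2/2:.3f}")
```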

1 Answer


I'm not too familiar with Brownian motions, but I think the problem is basically the same as in the case of individual random variables as in muaddib's answer: "Take the process $X$ where $X_0$, $X_1$ are iid positive random variables and $X_2$ is a normal random variable with variance $X_0+X_1$ and mean zero." Here $X_2$ could be said to have "random variance", namely in the sense that the conditional variance $\operatorname{Var}[X_2\mid X_0+X_1]=X_0+X_1$ is a random variable; in other words, the variance of $X_2$ depends on the value of $X_0+X_1$. Nevertheless, you are of course right that $X_2$ is an ordinary random variable with an ordinary, non-random variance, which by the law of total variance is given by

\begin{align} \operatorname{Var}[X_2]&=E_{X_0+X_1}(\operatorname{Var}[X_2\mid X_0+X_1])+\operatorname{Var}_{X_0+X_1}[E(X_2\mid X_0+X_1)]\\ &=E_{X_0+X_1}(X_0+X_1)+\operatorname{Var}_{X_0+X_1}[0]\\ &=E(X_0+X_1)\;. \end{align}
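This computation can be checked numerically. A minimal sketch, assuming the (hypothetical) concrete choice $X_0, X_1 \sim$ iid $\mathrm{Exponential}(1)$, so that $\operatorname{Var}[X_2] = E(X_0+X_1) = 2$:

```python
import numpy as np

# Numerical check of the law-of-total-variance computation above, with the
# hypothetical choice X_0, X_1 ~ iid Exponential(1), so E[X_0 + X_1] = 2.
rng = np.random.default_rng(0)
n = 1_000_000

s = rng.exponential(1.0, n) + rng.exponential(1.0, n)  # samples of X_0 + X_1
x2 = rng.normal(0.0, np.sqrt(s))                       # X_2 | X_0+X_1 ~ N(0, X_0+X_1)

print(x2.var())  # ≈ E[X_0 + X_1] = 2
```

Note that `np.sqrt(s)` is the standard deviation: `rng.normal` takes the scale, not the variance, as its second argument.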

joriki