
Under what conditions does it hold that

$$E[X_{n+1}\mid X_n] = E[X_{n+1}\mid\mathscr{F}_n]$$

if we are given a stochastic process $X = (X_n)_{n \geq 0}$ on a filtered probability space $(\Omega, \mathscr{F}, (\mathscr{F}_n)_{n \in \mathbb N}, \mathbb{P})$, where $\mathscr{F}_n = \mathscr{F}_n^X := \sigma(X_0, X_1, \ldots, X_n)$ is the natural filtration of $X$?

I was under the impression that the equality holds only for Markov processes, but I suppose there may be other sufficient conditions.


The Markov property is:

$$E[f(X_{t})\mid X_s] = E[f(X_{t})\mid\mathscr{F}_s]$$

for all $0 \le s \le t$ and all bounded measurable $f : \mathbb{R} \to \mathbb{R}$.

So, if $X_0, X_1, \ldots, X_n, \ldots$ is a Markov process, then (taking $f$ to be the identity, which needs the small extension sketched below, since the identity is not bounded) we have

$$E[X_{n+1}\mid X_n] = E[X_{n+1}\mid\mathscr{F}_n],$$

but what are other sufficient conditions?
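
For reference, here is (what I take to be) the standard truncation argument that gets from bounded $f$ to $f(x) = x$, assuming each $X_n$ is integrable. Apply the Markov property to the bounded cutoffs $f_k(x) := (x \wedge k) \vee (-k)$:

$$E[f_k(X_{n+1}) \mid X_n] = E[f_k(X_{n+1}) \mid \mathscr{F}_n],$$

and since $|f_k(X_{n+1})| \le |X_{n+1}| \in L^1$, conditional dominated convergence lets $k \to \infty$ on both sides, yielding the displayed equality for the identity function.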

  • My statement was wrong. – muaddib May 31 '15 at 16:33
  • @muaddib Why is your statement wrong? So mookid is wrong too? – BCLC May 31 '15 at 16:36
  • Well, $X_n$ might just contain information about the state at time $n$, whereas $\mathcal{F}_n$ contains all info up till time $n$. So take the process where $X_0, X_1$ are iid mean zero and $X_2 = X_1 + X_0$. Then $E[X_2 \mid X_1] = X_1$ but $E[X_2 \mid \mathcal{F}_1] = X_1 + X_0$. Need something more like the information available at exactly $n$. (A numerical sketch of this example appears after these comments.) – muaddib May 31 '15 at 16:40
  • I don't know who mookid is. Regarding my previous point though, just because a process isn't Markov w.r.t. the states $X_n$ doesn't mean that particular expectation can't be correct. Though I can't think of a nice example. – muaddib May 31 '15 at 17:01
  • @muaddib mookid is the one in the link. His answer there computed

    $$E[X_{n+1} \mid X_n] = \tfrac{1}{2} \cdot 2X_n + \tfrac{1}{2} \cdot 0 = X_n,$$

    and when I asked "Aren't we supposed to show $E[X_{n+1} \mid \mathscr{F}_n] = X_n$?" mookid replied: "yes. In the case of $\mathscr{F}$ being the natural filtration of the process, this is the same, but you are right." – BCLC May 31 '15 at 18:28
  • @muaddib "just because a process isn't markov wrt the states Xn doesn't mean that particular expectation can't be correct." --> Why not? That is the definition of Markov, I think? Of course, there's no evidence so far to say that the stochastic process we have isn't Markov... – BCLC May 31 '15 at 18:28
  • 1
  • @muaddib http://en.wikipedia.org/wiki/Markov_property#Alternative_formulations – BCLC May 31 '15 at 18:29
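
Since code doesn't fit in comments, here is a small Monte Carlo sketch of muaddib's counterexample above ($X_0, X_1$ iid mean zero, $X_2 = X_1 + X_0$); the standard-normal choice for $X_0, X_1$ is mine, any iid mean-zero pair would do:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X0, X1 iid mean zero (standard normals chosen for concreteness), X2 = X0 + X1.
x0 = rng.standard_normal(n)
x1 = rng.standard_normal(n)
x2 = x0 + x1

# E[X2 | X1] = X1: estimate the conditional mean on a thin slab around X1 = 0.5.
slab = np.abs(x1 - 0.5) < 0.01
print("E[X2 | X1 = 0.5] is roughly", x2[slab].mean())  # ~0.5, i.e. X1

# E[X2 | F_1] = E[X2 | X0, X1] = X0 + X1 exactly, since X2 is F_1-measurable.
# It disagrees with E[X2 | X1] = X1 on every sample where X0 != 0:
print("largest gap |(X0 + X1) - X1| =", np.abs(x0).max())
```

The slab average converges to $X_1 = 0.5$, not to $X_0 + 0.5$, which is exactly the gap between conditioning on $X_1$ and conditioning on $\mathscr{F}_1$.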

1 Answer


This is by no means a full answer to your question, but I've at least found an example of a non-Markovian process that satisfies the equation. (We've run out of space in the comments.)

Take the process $X$ where $X_0, X_1$ are iid positive random variables and $X_2$ is a normal random variable with variance $X_0 + X_1$ and mean zero.

Then $E[X_2 \mid X_1] = 0$ and $E[X_2 \mid \mathscr{F}_1] = 0$.
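
To spell both claims out (a sketch; I additionally assume $X_0, X_1$ are integrable and $X_0$ is non-degenerate, so the expectations below exist and the dependence is genuine): since $\mathscr{F}_1 = \sigma(X_0, X_1)$ and, conditionally on $(X_0, X_1)$, $X_2 \sim N(0, X_0 + X_1)$, we get

$$E[X_2 \mid \mathscr{F}_1] = 0 \quad\text{and}\quad E[X_2 \mid X_1] = E\big[\,E[X_2 \mid \mathscr{F}_1] \mid X_1\,\big] = 0$$

by the tower property. The process is still not Markov, because the conditional law of $X_2$ given $\mathscr{F}_1$ is $N(0, X_0 + X_1)$, which genuinely depends on $X_0$: for instance, $E[X_2^2 \mid \mathscr{F}_1] = X_0 + X_1$ while $E[X_2^2 \mid X_1] = E[X_0] + X_1$.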

  • Wait. A random variable can have random variance? – BCLC May 31 '15 at 19:08
  • How is that non-Markov? – BCLC May 31 '15 at 19:09
  • Sure. Consider the SDE $dX_t = Y_t \, dW_t$, where $Y_t$ is a Brownian motion independent of $W_t$. How is it non-Markov? Because $X_2$'s density depends on $X_1$ and $X_0$; to be Markov it would have to depend on just $X_1$. (A small simulation of this SDE appears after these comments.) – muaddib May 31 '15 at 19:11
  • Thanks muaddib, but how does that have random variance? Is $E[\int_0^t Y_s dW_s]$ random? Why/why not? – BCLC Sep 14 '15 at 13:20
  • I'm defining it to have random variance. Namely, look at the outcomes of $X_0$ and $X_1$, then use that as the variance of the normally distributed $X_2$. Is $E[\int_0^t Y_s \, dW_s]$ random? No; as a commenter in your link stated, it is an unconditional expectation, i.e. no "randomness" is left for it to depend on. – muaddib Sep 14 '15 at 14:09
  • I don't understand. What is the point of your SDE? I thought it was an answer to 'a random variable can have random variance?' – BCLC Sep 14 '15 at 14:21
  • @BCLC I'm not sure how you think that question was associated to my answer. I was demonstrating that a process $X_i$ can satisfy the expectation in your question without being Markovian (In the example, $X_2$ depends not on just the previous state, but also the state before). Actually, you pointed out the "alternative formulation" in your comments above which shows what is actually required (considering a general function $f$). – muaddib Sep 14 '15 at 17:52
  • (Unconditionally,) $X_2$ does not have random variance. So I guess your example is wrong? – BCLC Sep 14 '15 at 19:23
  • I commented on your other question regarding this. I would just drop "random variance" as a concept since it seems to provide much confusion. It is one of the ways I just personally look at things. But the example still stands for the reasons I gave in the previous comment. – muaddib Sep 14 '15 at 21:20
  • @BCLC: You're reading too much into "random variance." You start with two random variables $X_0, X_1$, which will give you physical values after you draw them. You then draw a normal random variable $X_2 \sim N(0, \sigma^2)$ with mean 0 and variance $\sigma^2 = X_0 + X_1$. The actual variance of $X_2$ is not random, but the parameter $\sigma$ is random. After all, when you compute the variance of $X_2$, $\sigma$ comes with some distribution, so you'll get a fixed value for the variance of $X_2$. – Alex R. May 06 '16 at 18:03
  • @AlexR. Right $\sigma^2 = Var[X_2 | X_0, X_1]$? – BCLC May 07 '16 at 09:11
  • @BCLC: No. $\sigma^2=\mbox{Var}[X_2]$ – Alex R. May 07 '16 at 15:50
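
To illustrate the "random variance" reading of $dX_t = Y_t \, dW_t$ from the comments above, here is a rough Euler–Maruyama sketch (the horizon, step size, and seed are my own choices): conditional on the path of $Y$, $X_T$ is Gaussian with variance approximately $\int_0^T Y_s^2 \, ds$, and that integral changes from one draw of $Y$ to the next.

```python
import numpy as np

rng = np.random.default_rng(1)
T, steps = 1.0, 1000
dt = T / steps

for p in range(5):
    # W and Y are driven by independent Brownian increments.
    dW = rng.normal(0.0, np.sqrt(dt), steps)
    dB = rng.normal(0.0, np.sqrt(dt), steps)
    y = np.concatenate(([0.0], np.cumsum(dB)[:-1]))  # Y at the left endpoints

    x_T = np.sum(y * dW)           # Euler-Maruyama for dX = Y dW, X_0 = 0
    cond_var = np.sum(y**2) * dt   # approximates int_0^T Y_s^2 ds
    print(f"draw {p}: X_T = {x_T:+.3f}, conditional variance ~ {cond_var:.3f}")
```

Each draw prints a different conditional variance, while the unconditional value $E\big[\int_0^T Y_s^2 \, ds\big] = T^2/2$ is a fixed number, matching muaddib's and Alex R.'s points above.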