Let $X$ and $Y$ be random variables such that $E(X|Y)=\frac Y 2$ and $E(Y|X)=\frac X 2$. Does it follow that $X$ and $Y$ are 0? If not, is there a simple example of such random variables? Motivation: if $E(X|Y)=Y$ and $E(Y|X)=X$ then $X=Y$ necessarily. This is easy to prove: if $X>0$ and $Y>0$ we can write $E(\frac X Y +\frac Y X)=E(\frac X Y) +E(\frac Y X)=1+1=2$ (each term is 1 by the tower property, e.g. $E(\frac X Y)=E(E(\frac X Y|Y))=E(\frac{E(X|Y)}{Y})=E(\frac Y Y)=1$), and $x+\frac 1 x \geq 2$ with equality if and only if $x=1$. For the general case we can use $X^{+}+1$ and $Y^{+}+1$ in place of $X$ and $Y$ to get $X^{+}=Y^{+}$, and a similar argument for $X^{-}$ and $Y^{-}$.
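To experiment with candidates numerically, one can estimate $E[X\mid Y=y]$ by averaging $X$ over samples with $Y=y$. A minimal sketch, assuming $Y$ takes finitely many values and that we can draw samples of $(X,Y)$ (the helper name `cond_exp_given` is just illustrative):

```python
import numpy as np

# Estimate E[X | Y=y] for each observed value y by averaging X over the
# samples where Y equals y (assumes Y is discrete and the sample is large;
# `cond_exp_given` is a hypothetical helper, named here for illustration).
def cond_exp_given(x, y):
    return {v: x[y == v].mean() for v in np.unique(y)}

# Sanity check on the motivating case X = Y: E[X | Y=y] should equal y.
rng = np.random.default_rng(0)
y = rng.choice([-1.0, 0.0, 1.0], size=100_000)
x = y.copy()
print(cond_exp_given(x, y))  # ≈ {-1.0: -1.0, 0.0: 0.0, 1.0: 1.0}
```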
-
I meant a simple example of non-zero random variables. – Kavi Rama Murthy Dec 22 '17 at 08:03
-
On your motivating problem $E[X|Y]=Y, E[Y|X]=X$, I agree that if $X>0, Y>0$ then $X=Y$ with prob 1. I do not know how you conclude $X^+=Y^+$ for the general case since $E[X|Y]=Y$ does not tell us the value of $E[X^+|Y]$. However: (i) For the general case, if you assume $X$ and $Y$ have finite variances then you can show $E[(X-Y)^2]=0$, (ii) For the case $X>0, Y>0$ you can repeat your argument under the relaxed assumption $E[X|Y]\leq Y$, $E[Y|X]\leq X$ to get $X=Y$ with prob 1. – Michael Dec 22 '17 at 20:36
-
No need for finite variances. We have $X^{+} \leq E(Y^{+}|X)$ and $Y^{+} \leq E(X^{+}|Y)$. Taking expectations we see that $EX^{+}=EY^{+}$. Now $E(Y^{+}|X)-X^{+}$ is a non-negative random variable with zero mean, hence it is 0 almost surely. Do the same with the second inequality. You have now reduced the proof to the case of non-negative $X$ and $Y$; just add 1 to both to make them strictly $>0$. – Kavi Rama Murthy Dec 24 '17 at 11:48
-
How do you claim $X^+\leq E[Y^+|X]$? – Michael Dec 25 '17 at 02:27
-
Michael, this is easy: $X \leq E[Y|X] \leq E[Y^{+}|X]$ because $Y \leq Y^{+}$. Also, $0 \leq E[Y^{+}|X]$. Put these two together to get $X^{+} \leq E[Y^{+}|X]$. – Kavi Rama Murthy Dec 26 '17 at 04:56
-
I see, thanks for the explanation. – Michael Dec 26 '17 at 18:01
2 Answers
Let $A, B, C$ be independent with \begin{align*} &P[B=0]=P[B=1]=1/2\\ &P[C=0]=P[C=1]=1/2\\ &P[A=-1]=P[A=1]=1/2 \end{align*} Define: \begin{align} X &= AB\\ Y &= AC \end{align}
Then \begin{align} &E[X|Y=1] = E[AB|AC=1]=E[AB|A=1, C=1]=E[B]=1/2\\ &E[X|Y=0] = E[AB|AC=0]=E[AB|C=0] = E[A]E[B]=0 \\ &E[X|Y=-1] = E[AB|AC=-1] = E[AB|A=-1, C=1] =-E[B]=-1/2 \end{align} (Note that $AC=0$ iff $C=0$, since $A\neq 0$, while $AC=\pm 1$ pins down both $A$ and $C$.) So $E[X|Y]=Y/2$. By symmetry we also get $E[Y|X]=X/2$.
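As a sanity check (a Monte Carlo sketch with an arbitrary seed and sample size, not needed for the proof), sampling $A, B, C$ and averaging $X$ over each value of $Y$ reproduces $E[X|Y]=Y/2$:

```python
import numpy as np

# Monte Carlo check of the construction above: X = A*B, Y = A*C with
# A uniform on {-1, 1} and B, C uniform on {0, 1}, all independent.
rng = np.random.default_rng(1)
n = 1_000_000
a = rng.choice([-1, 1], size=n)
b = rng.choice([0, 1], size=n)
c = rng.choice([0, 1], size=n)
x, y = a * b, a * c
for v in (-1, 0, 1):
    print(v, x[y == v].mean())  # ≈ v/2, i.e. E[X | Y] = Y/2
```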

Here is a generalized result related to the motivating example: Suppose random variables $X$ and $Y$ satisfy $E[|X|]<\infty$, $E[|Y|]<\infty$, and $$ E[X|Y]\leq Y, E[Y|X]\leq X $$ Then $X=Y$ with probability 1.
Proof: Fix $M>0$ as a (large) integer. Define truncated random variables:
\begin{align}
A_M &= \left\{ \begin{array}{ll} X &\mbox{ if $X \geq -M$} \\ -M & \mbox{ otherwise} \end{array} \right.\\
B_M &= \left\{ \begin{array}{ll} Y &\mbox{ if $Y \geq -M$} \\ -M & \mbox{ otherwise} \end{array} \right.
\end{align}
Then
$$ \lim_{M\rightarrow\infty} P[X\neq A_M] = \lim_{M\rightarrow\infty} P[Y\neq B_M] = 0$$
Because of this, it can be shown that for any random variable $Z$ that satisfies $E[|Z|]<\infty$ we have
$$ \lim_{M\rightarrow\infty} E[Z1_{\{X \neq A_M\}}] = \lim_{M\rightarrow\infty} E[Z1_{\{Y\neq B_M\}}] = 0 \quad (**) $$

Define $c = M+1$, so $A_M+c\geq 1$ and $B_M+c \geq 1$, and we can apply an argument similar to that suggested by Kavi:
\begin{align}
E[(A_M+c)/(B_M+c)] &= E[E[(A_M+c)/(B_M+c)|Y]]\\
&= E[1/(B_M+c) E[(A_M+c)|Y]]\\
&= E[1/(B_M+c) E[(X + c) + (A_M-X)|Y]]\\
&\leq E[1/(B_M+c)(Y+c + E[A_M-X|Y])] \\
&= E[1/(B_M+c)(B_M+c + (Y-B_M) + E[A_M-X|Y])] \\
&=E\left[1 + \frac{Y-B_M}{B_M+c} + \frac{A_M-X}{B_M+c} \right]\\
&\leq 1 + E[|Y-B_M|] + E[|A_M-X|]
\end{align}
(The second equality uses that $B_M$ is a function of $Y$; the first inequality uses $E[X|Y]\leq Y$, and the last uses $B_M+c\geq 1$.) By symmetry we also get
$$ E[(B_M+c)/(A_M+c)] \leq 1 + E[|Y-B_M|] + E[|A_M-X|] $$

Define $f(M) = E[|Y- B_M|] + E[|A_M-X|]$. By fact (**), it can be shown that $f(M)\rightarrow 0$. Thus
$$ E[(B_M+c)/(A_M+c)] + E[(A_M+c)/(B_M+c)] \leq 2 + 2f(M) \rightarrow 2$$
On the other hand, for all $M$ and all realizations of the random variables we have
$$ (B_M+c)/(A_M+c) + (A_M+c)/(B_M+c)\geq 2 $$
with "near equality" only when $A_M+c \approx B_M+c$. Thus, for any $\epsilon>0$ we get:
$$ \lim_{M\rightarrow\infty} P[|B_M-A_M|>\epsilon] =0 $$
However,
$$ P[|X-Y|>\epsilon] \leq P[X \neq A_M] + P[Y\neq B_M] + P[|A_M-B_M|>\epsilon] $$
Taking a limit as $M\rightarrow\infty$ gives
$$ P[|X-Y|>\epsilon] = 0$$
This holds for all $\epsilon>0$ and so $P[X=Y]=1$. $\Box$
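The truncation step can also be illustrated numerically. A sketch, assuming an integrable $X$ with a heavy left tail (the $t$ distribution with 3 degrees of freedom is just an example choice): with $A_M=\max(X,-M)$, the quantity $E[|X-A_M|]$ shrinks as $M$ grows, which is what fact (**) delivers for $f(M)\rightarrow 0$.

```python
import numpy as np

# A_M = max(X, -M) truncates X from below; when E|X| < infinity,
# E|X - A_M| -> 0 as M -> infinity (this follows from fact (**) with
# Z = |X|, and is what f(M) -> 0 needs).
rng = np.random.default_rng(2)
x = -np.abs(rng.standard_t(df=3, size=1_000_000))  # integrable, heavy left tail
for M in (1, 10, 100, 1000):
    a_M = np.maximum(x, -M)
    print(M, np.abs(x - a_M).mean())  # decreases toward 0
```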

-
This generalized result follows immediately from the older version: taking expectations in both inequalities we see that $EX=EY$. Now $Y-E(X|Y)$ is a non-negative random variable with zero mean and hence it is almost surely 0. Similarly with the second inequality. – Kavi Rama Murthy Dec 24 '17 at 11:42
-
You are right that $Y-E[X|Y]$ is a nonnegative random variable with zero mean and so we indeed get $Y=E[X|Y]$ with prob 1. I didn't notice that; I just noticed the proof does not change if we replace $=$ with $\leq$. So if we want to prove the simplified statement assuming $E[X|Y]=Y$ and $E[Y|X]=X$, we just replace one $\leq$ with an $=$ in the above proof. So it seems the proof does not really get any easier with the simplified statement (though the simplified statement is nicer to state, and the structure of making the $\leq$ statement a corollary is also nice). – Michael Dec 25 '17 at 02:16