
Let $X$, $Y$ be random variables with finite second moments. Suppose $\mathbb{E}(X\mid\sigma(Y))=Y$ and $\mathbb{E}(Y\mid\sigma(X))=X$; show that $\Pr(X=Y)=1$.

Here is what I have done so far: I consider $\mathbb{E}((X-Y)^2)$, conditioning on $X$ and on $Y$ in turn.

$\mathbb{E}((X-Y)^2\mid X)=\mathbb{E}(X^2\mid X)-2\mathbb{E}[XY\mid X]+\mathbb{E}[Y^2\mid X]=X^2-2X\,\mathbb{E}[Y\mid X]+\mathbb{E}[Y^2\mid X]=-X^2+\mathbb{E}[Y^2\mid X]$, using $\mathbb{E}[XY\mid X]=X\,\mathbb{E}[Y\mid X]=X^2$. The same computation works when conditioning on $Y$, but I am not sure how to combine the two identities to make use of them. Thanks

In the end I have $\mathbb{E}((X-Y)^2\mid X)=-X^2+\mathbb{E}[Y^2\mid X]$;
$\mathbb{E}((X-Y)^2\mid Y)=-Y^2+\mathbb{E}[X^2\mid Y]$

Glenn M
  • $E(X^2\mid X)=X^2$ and not $E(X^2)$. – Did Oct 21 '11 at 21:41
  • Does $\sigma$ mean standard deviation or are you saying that the equalities hold for all (measurable) functions $\sigma$? – Dilip Sarwate Oct 21 '11 at 21:44
  • $\sigma(X)$ is the $\sigma$ field generated by X – Glenn M Oct 21 '11 at 21:57
  • @Dilip: I took $\sigma(Y)$ to mean the sigma-algebra generated by the random variable $Y$, i.e. the coarsest sigma-algebra that makes $Y$ measurable. – Michael Hardy Oct 21 '11 at 21:58
  • @Glenn: certainly $\mathbb{E}(X^2\mid X) = X^2$. – Michael Hardy Oct 21 '11 at 22:02
  • Yes, I have corrected that. Thanks. I am stuck on how to use those two equations in the end to show $P(X=Y)=1$. – Glenn M Oct 21 '11 at 22:04
  • Is there anyway to simplify $\mathbb{E}[Y^2|X]$? – Glenn M Oct 21 '11 at 22:05
  • I've posted an answer using a different approach. In the mean time, "anyway" is a perfectly good word---an adverb---that does not mean the same thing as "any way". – Michael Hardy Oct 21 '11 at 22:22
  • And now you are ready for the real stuff, which is to prove that the same conclusion holds without the hypothesis that $X$ and $Y$ are square integrable but with the minimal hypothesis required for the exercise to make sense, namely, that $X$ and $Y$ are integrable. – Did Oct 21 '11 at 22:30

1 Answer


\begin{align} \operatorname{cov}(X,Y) & = \operatorname{E}(XY) - (\operatorname{E}X)(\operatorname{E}Y) = \operatorname{E}(\operatorname{E}(XY \mid X)) - (\operatorname{E}X)(\operatorname{E}Y) \\[10pt] & = \operatorname{E}(X\operatorname{E}(Y\mid X)) - (\operatorname{E}X)(\operatorname{E}Y) = \operatorname{E}(X^2) - (\operatorname{E}X)(\operatorname{E}Y). \end{align}

Now use the fact that the expectations of $X$ and $Y$ are equal: $\operatorname{E}(X)= \operatorname{E}(\operatorname{E}(X\mid Y)) = \operatorname{E}(Y)$.

We get $\operatorname{cov}(X,Y) = \operatorname{E}(X^2) - (\operatorname{E}X)^2 = \operatorname{var}(X)$. By the same argument, we get $\operatorname{cov}(X,Y) = \operatorname{var}(Y)$. Hence $\operatorname{cov}(X,Y) = \operatorname{var}(X)=\operatorname{var}(Y)$.

Hence the correlation between $X$ and $Y$ is $1$, provided neither of them has variance $0$. (If one variance is $0$, then $\operatorname{var}(X)=\operatorname{var}(Y)=0$ by the displayed equalities, so $X$ and $Y$ are each almost surely equal to their common mean and the conclusion is immediate.)
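The last step can also be spelled out without invoking correlation at all; a sketch, using only the covariance identity above:

\begin{align} \operatorname{var}(X-Y) & = \operatorname{var}(X) + \operatorname{var}(Y) - 2\operatorname{cov}(X,Y) \\[4pt] & = \operatorname{var}(X) + \operatorname{var}(Y) - \big(\operatorname{var}(X) + \operatorname{var}(Y)\big) = 0, \end{align}

since $\operatorname{cov}(X,Y)=\operatorname{var}(X)=\operatorname{var}(Y)$. Combined with $\operatorname{E}(X-Y)=0$, this gives $X-Y=0$ almost surely, i.e. $\Pr(X=Y)=1$.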

  • Here's another point of view on this: The hypotheses amount to saying there is no regression towards the mean. Therefore the two random variables must be perfectly correlated. – Michael Hardy Oct 22 '11 at 01:31
  • How does a correlation of $1$ imply $X=Y$ almost surely? Thanks. – Glenn M Oct 22 '11 at 02:55
  • @GlennM: The Cauchy–Schwarz inequality says that $|\operatorname{cov}(X,Y)| \le \sqrt{\operatorname{var}(X)\operatorname{var}(Y)}$ with equality if and only if $\Pr(X=aY+b)=1$ for some constants $a$ and $b$. In the latter case, we have $\mathbb{E}(X \mid Y) = aY+b$. And the hypotheses you started with tell you that $a=1$ and $b=0$. – Michael Hardy Oct 22 '11 at 03:16
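Incidentally, the two conditional identities from the question also close the argument directly: taking unconditional expectations of each,

\begin{align} \mathbb{E}\big((X-Y)^2\big) & = \mathbb{E}\big(-X^2+\mathbb{E}[Y^2\mid X]\big) = \mathbb{E}(Y^2)-\mathbb{E}(X^2), \\[4pt] \mathbb{E}\big((X-Y)^2\big) & = \mathbb{E}\big(-Y^2+\mathbb{E}[X^2\mid Y]\big) = \mathbb{E}(X^2)-\mathbb{E}(Y^2), \end{align}

and adding the two lines gives $2\,\mathbb{E}\big((X-Y)^2\big)=0$. Hence $\mathbb{E}\big((X-Y)^2\big)=0$, so $\Pr(X=Y)=1$.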