
In my previous question, I asked why $$ E(Y |X = x) = \int_\Omega Y (\omega)P(d\omega|X = x) = \frac{\int_{X=x} Y (\omega)P(d\omega)}{P(X = x)} = \frac{E(Y \, 1_{(X=x)})}{P(X = x)} $$ when $X$ is a discrete random variable and $P(X = x) \neq 0$.

Now I would like to consider the case where $X$ is a continuous random variable whose density at $x$ is nonzero, i.e. $f_X(x) \neq 0$, so that $f_{Y\mid X}(y \mid x)$ and $E(Y\mid X=x)$ can be defined. Is there a relation similar to the one above, $$ E(Y \mid X = x) = \int_\Omega Y(\omega)\,P(d\omega\mid X = x) = \frac{\int_{\mathbb{R}} y\, f_{X,Y}(x,y)\, dy}{f_X(x)} = \;(?), $$ that comes close to representing $E(Y\mid X = x)$ in terms of some expectation?

Thanks and regards!

Tim
    While we are at it, you might make up your mind between $w$ (double-u) and $\omega$ (omega). // And you know this stuff is explained competently and with details in tons of well-written textbooks, many of them available online, don't you? – Did Nov 03 '11 at 05:59
  • Didier: Thanks! I corrected it. I guess so but fail to find where it is. – Tim Nov 03 '11 at 06:18
    Are you kidding? Maybe the ones suggested on this previous question or on that one would be a good start! Same remark already here. Note that explanations there basically answer your present question but you did not bother to follow the lead formulated as a comment. Seven months later, the result is that you seem to be still bogged down in the same elementary definition problems. Oh well... – Did Nov 03 '11 at 07:05
  • It is not "seem to be". It is "are". Sorry, I can't see how the explanations in the fourth link "basically answer your present question". PS: I edited my post. – Tim Nov 06 '11 at 07:48
  • The fourth link explains that all this is based on the definition of conditional expectations and how to proceed to prove what you want here as well as in many other questions you asked about conditional expectations and distributions. You are still making circles. // Let me try once more: take any two random variables $X$ and $Y$ with $Y$ integrable. What does it mean to say that $E(Y\mid X)=u(X)$? If I give you the distribution of $(X,Y)$, say with a density $f$, how do you prove or disprove that $E(Y\mid X)=u(X)$? – Did Nov 06 '11 at 08:57
  • I have different understanding of "What does it mean to say that $E(Y∣X)=u(X)$?" (1) If you mean proving the existence of $u$ such that $E(Y∣X)=u(X)$, then this can be proved by using problem 13.3 of Billingsley's Probability and Measure (see the first quote here http://math.stackexchange.com/questions/78501/when-can-a-measurable-mapping-be-factorized). (2) If you mean some previously-defined $u$ as $u(x):=E(Y|X=x)$ with elementary definition of conditional expectation, then $E(Y|X)=u(X)$ a.e. is proved in Section 9.6 of Williams' Probability and Martingales. Correct me if I am wrong. – Tim Nov 08 '11 at 06:40
  • "If I give you the distribution of (X,Y), say with a density f, how do you prove or disprove that E(Y∣X)=u(X)?" Do you mean the same thing as (2) in my previous comment? – Tim Nov 08 '11 at 06:41
  • The question is: if provided with the distribution of (X,Y), how does one prove or disprove that E(Y|X)=X^3+5, say? If you have no idea about that, all the rest is pointless. – Did Nov 08 '11 at 06:54
$X^3+5$ is measurable wrt $\sigma(X)$. $\forall A \in \sigma(X)$, see if $\int_A (X^3+5)\,dP=\int_A Y\,dP$. How pointless is it? [see the density-form sketch after these comments] – Tim Nov 08 '11 at 07:01
  • Great. Now, you have everything you need to answer your own question. – Did Nov 08 '11 at 12:43
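
To spell out the check from the last few comments, here is a sketch in density form (assuming $(X,Y)$ has a joint density $f$, $Y$ is integrable, and taking the candidate $u(x)=x^3+5$ from the comments as the running example): $E(Y\mid X)=X^3+5$ means that $\int_A (X^3+5)\,dP=\int_A Y\,dP$ for every $A\in\sigma(X)$, i.e. for every $A=\{X\in B\}$ with $B$ Borel. Rewriting both sides with the density, this becomes $$ \int_B (x^3+5)\,f_X(x)\,dx = \int_B\int_{\mathbb{R}} y\,f(x,y)\,dy\,dx \quad\text{for every Borel set } B, $$ which holds if and only if $(x^3+5)\,f_X(x)=\int_{\mathbb{R}} y\,f(x,y)\,dy$ for Lebesgue-almost every $x$.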

1 Answer


Note that if $X$ is continuous, then conditioning on $\{X=x\}$ in the classical sense is no longer possible, since $\mathbb{P}(X=x)=0$ for all $x \in \mathbb{R}$.

Instead, $\mathbb{E}[Y|X=x]$ is defined via conditional expectation given a $\sigma$-algebra: recall that the conditional expectation of $Y$ given $X$ is a random variable (denoted $\mathbb{E}[Y|X]$) which is measurable with respect to $\sigma(X)$ (and has certain other properties). From the measurability we can deduce the existence of a measurable function $h$ (which depends on $Y$) such that $$ \mathbb{E}[Y|X] = h(X). $$ From here, we define $$ \mathbb{E}[Y|X=x] = h(x) \quad \text{for all } x\in \mathbb{R}. $$
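
Concretely, the "certain other properties" amount to the partial-averaging property: $h$ is, up to sets of $P_X$-measure zero, the unique measurable function satisfying $$ \mathbb{E}\bigl[Y\,\mathbf{1}_{\{X\in B\}}\bigr]=\mathbb{E}\bigl[h(X)\,\mathbf{1}_{\{X\in B\}}\bigr] \quad \text{for every Borel set } B\subseteq\mathbb{R}. $$ In particular, $\mathbb{E}[Y|X=x]=h(x)$ is only determined for $P_X$-almost every $x$, which is why one restricts attention to points where $f_X(x)>0$.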

If you want to involve the probability densities $f_X$ and $f_{X,Y}$, first note that $$ f_X(x)= \int_\mathbb{R} f_{X,Y}(x,y)\; dy. $$ Then we can write: $$ \mathbb{E}[XY]= \int_\mathbb{R} \int_\mathbb{R} x\cdot y \cdot f_{X,Y}(x,y) \; dy \; dx = \int_\mathbb{R}\underbrace{\biggl( \int_\mathbb{R} y \cdot \frac{f_{X,Y}(x,y)}{f_X(x)} \; dy \biggr)}_{=h(x)} \cdot x \cdot f_X(x)\;dx. $$ Maybe this answers your question.
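
In fact, one can check that this $h$ has the partial-averaging property above; here is a sketch, assuming $Y$ is integrable and setting $h(x):=0$ on the null set where $f_X(x)=0$. Replacing the factor $x$ in the computation above by an indicator $\mathbf{1}_B(x)$ gives $$ \mathbb{E}\bigl[Y\,\mathbf{1}_{\{X\in B\}}\bigr]=\int_B\int_{\mathbb{R}} y\,f_{X,Y}(x,y)\,dy\,dx=\int_B h(x)\,f_X(x)\,dx=\mathbb{E}\bigl[h(X)\,\mathbf{1}_{\{X\in B\}}\bigr], $$ so indeed $E(Y\mid X=x)=h(x)=\frac{1}{f_X(x)}\int_{\mathbb{R}} y\,f_{X,Y}(x,y)\,dy$ whenever $f_X(x)>0$. As for representing this through an expectation in the spirit of the discrete case: under mild regularity (for instance, if $f_X$ and $x\mapsto\int_{\mathbb{R}} y\,f_{X,Y}(x,y)\,dy$ are continuous at $x$ and $f_X(x)>0$), one also has $$ E(Y\mid X=x)=\lim_{\varepsilon\downarrow 0}\frac{E\bigl(Y\,\mathbf{1}_{(x-\varepsilon<X\le x+\varepsilon)}\bigr)}{P(x-\varepsilon<X\le x+\varepsilon)}, $$ the continuous analogue of $E\bigl(Y\,\mathbf{1}_{(X=x)}\bigr)/P(X=x)$.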

Cettt