
I've gotten so caught up in measure-theoretic probability that I'm actually having trouble showing this simple result. Let $X$ be an integrable random variable. Then $$ \mathrm E[X \mid X=x] = \int_{\Omega} X(\omega)\, P^X(\mathrm d\omega \mid x) = \int_{X(\Omega)} x \, P_{X\mid X}(\mathrm dx | x) = ? $$ The first equality is the definition of the conditional expectation of a random variable w.r.t. an event with zero probability, and so $P^X(\cdot \mid \cdot)$ is the regular conditional probability of $P$ given $X$. I then tried to push forward this integral onto the range of $X$ using the conditional distribution of $X$ given $X$, $ P_{X\mid X}(\cdot | \cdot)$, but it's not clear to me that either of these integrals equals $x$.

I'm clearly missing something pretty obvious and would appreciate an extra eye!

bcf
  • Would you edit the integrals? They are blatantly wrong. – Cardinal Jul 24 '15 at 22:11
  • By the way, how can a definite value have a PDF? When $X$ is fixed, how could you explain the expectation? Expectation of which uncertainty? – Cardinal Jul 24 '15 at 22:14
  • $$X|X=x\longrightarrow E[x]=x\int_{0}^{\infty}f_X(x)dx=x$$ – Cardinal Jul 24 '15 at 22:19
  • @Cardinal Re PDF: I'm not assuming $X$ has a pdf. Re integrals: I'm just using the definitions found here: http://math.stackexchange.com/questions/496608/formal-definition-of-conditional-probability – bcf Jul 24 '15 at 23:49
  • @bcf The subject of conditional expectations (not the trivial case when one conditions by events of positive probability) is one of the most botched up in (US...) curricula and, as a consequence, on math.SE. A few users try to remedy this sorry state of afairs by providing the rigorous definitions needed for a proper approach but, since the site has next to zero memory, this is a Sisyphus task. Anyway, if you want to master the subject, I would recommend to get yourself a good textbook and to study it. .../... – Did Jul 26 '15 at 10:02
  • .../... David Williams' Probability with martingales is a good choice (small book, very clear, the first half is already enough). – Did Jul 26 '15 at 10:02
  • 2
    To answer the question... When $P(X=x)=0$, $E(Y\mid X=x)$ is defined as $a(x)$ where the function $a$ is measurable and such that $E(Y\mid X)=a(X)$ almost surely. The "almost surely" makes that $a$ is only unique up to negligible sets for $P_X$, and this has unfortunate consequences. For example, assume that $X=Y$ is unform on $(0,1)$ and let $a(y)=y+(z-y)\mathbf 1_{y=x}$ for some $z\ne x$. Then $E(X\mid X)=a(X)$ almost surely (one can check the two conditions necessary for $a(X)$ to be (a version of) $E(X\mid X)$ hold) but, obviously, $E(X\mid X=x)=z\ne x$. To sum up, the result is wrong. – Did Jul 31 '15 at 16:29
  • https://math.stackexchange.com/questions/431422/how-can-i-show-that-the-conditional-expectation-ex-mid-x-x – StubbornAtom Dec 10 '19 at 17:43
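Did's "one can check the two conditions" step above can be filled in explicitly (a sketch of the verification, not part of the original comment). Since $P(X=x)=0$,
$$a(X) = X + (z - X)\,\mathbf 1_{\{X=x\}} = X \quad \text{a.s.}$$
Hence $a(X)$ is $\sigma(X)$-measurable and, for every $A\in\sigma(X)$,
$$\int_A a(X)\,\mathrm dP = \int_A X\,\mathrm dP,$$
so $a(X)$ is a version of $E(X\mid X)$. Yet evaluating the function at the point gives $a(x)=x+(z-x)=z\ne x$, which is why "$E[X\mid X=x]=x$" can fail for a particular version.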

3 Answers


Edit: this is for a previous version of the question with no independence assumptions.

Start from the definitions. You are trying to find $E[X|\sigma(X)]$. By definition this must be $\sigma(X)$-measurable and satisfy $\int_A E[X|\sigma(X)]dP = \int_A XdP$ for all $A\in\sigma(X)$. Try $E[X|\sigma(X)]=X$ and verify that it satisfies both conditions. Since $X(\omega)=x$ is equivalent to $\omega\in X^{-1}(\{x\})$ and $X^{-1}(\{x\})\in\sigma(X)$, it follows that $E[X|X=x]=x$.
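As an informal numerical illustration of this answer (a sketch, not part of the original: it assumes $X$ uniform on $(0,1)$ and approximates the zero-probability event $\{X=x\}$ by the shrinking window $\{|X-x|<h\}$), the empirical conditional mean concentrates at $x$:

```python
import random

random.seed(0)

# X ~ Uniform(0, 1); approximate E[X | X = x] by conditioning on the
# shrinking event {|X - x| < h} and taking the empirical conditional mean.
x = 0.3
samples = [random.random() for _ in range(1_000_000)]

for h in (0.1, 0.01, 0.001):
    window = [s for s in samples if abs(s - x) < h]
    cond_mean = sum(window) / len(window)
    print(f"h={h}: empirical conditional mean = {cond_mean:.4f}")
```

This only probes one version of the conditional expectation, of course; as Did's comment explains, the pointwise value at a null set is not pinned down.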

Alex R.
  • Okay, I see the result follows when we're considering the conditional expectation wrt a sigma algebra, which is a random variable. However as I understand it, $\mathrm E[X \mid X=x]$ is a function on $X(\Omega)$, as would be $\mathrm E[Y \mid X=x]$ if we had the "usual" case of conditional expectation of another random variable given $X=x$. Have I misunderstood? – bcf Jul 24 '15 at 23:52
  • @bcf If $E[Y|X]$ is measurable function $f$, then $E[Y|X=x]=f(x) \in X(\Omega)$, and it is not function but an element of the set $X(\Omega)$. – Zoran Loncarevic Jul 25 '15 at 17:56
  • @bcf Also, a sigma algebra is a collection of sets, and certainly not a random variable. Conditioning with respect to a random variable $X$ is exactly the same as conditioning on the sigma algebra $\sigma(X)$ induced by the random variable $X$. – Zoran Loncarevic Jul 25 '15 at 18:11
  • @bcf Please correct "If E[Y|X] is measurable function f, then E[Y|X=x]=f(x)" in Zoran's comment to "If E[Y|X]=f(X) where f is a measurable function, then E[Y|X=x]=f(x) for almost every x". – Did Jul 26 '15 at 10:13
  • Yes, I was careless. The domain of the conditional expectation $E[Y|X]$ is $\Omega$ and certainly not $X(\Omega)$; but, since it is $\sigma(X)$-measurable, we can represent it as a composition $f \circ X$ where $f$ is some measurable function. – Zoran Loncarevic Jul 26 '15 at 12:08
  • To be clear: the result that the question asks for and that this answer "proves", is wrong. – Did Jul 31 '15 at 16:29

Recall that $\{X=x\}$ is shorthand for $$X^{-1}(\{x\}) = \{\omega\in\Omega : X(\omega) = x\}.$$ If $A$ is an event, recall the usual definition of conditional probability $$\mathbb P(A\mid X=x) = \frac{\mathbb P\left(A\cap\{X=x\}\right)}{\mathbb P(X=x)}, $$ provided that $\mathbb P(X=x)>0$. Similarly, we define $$\mathbb E[Y\mid X=x] = \frac{\mathbb E\left[Y1_{\{X=x\}}\right]}{\mathbb P(X=x)}. $$ In the case where $Y=X$, we have \begin{align} \mathbb E[X\mid X=x] &= \frac{\mathbb E\left[X1_{\{X=x\}}\right]}{\mathbb P(X=x)}\\ &=\frac{x\mathbb P(X=x)}{\mathbb P(X=x)}\\ &=x, \end{align} as \begin{align} \mathbb E\left[X1_{\{X=x\}}\right] &= \int_{\Omega} X1_{\{X=x\}}\mathsf d\mathbb P\\ &= \int_{\{X=x\}} X\mathsf d\mathbb P\\ &= x\mathbb P(X=x). \end{align}
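For a concrete check of the discrete-case computation above (a sketch using an arbitrary three-point distribution chosen for illustration, not taken from the answer), the ratio $\mathbb E[X 1_{\{X=x\}}]/\mathbb P(X=x)$ recovers $x$ exactly:

```python
from fractions import Fraction

# A three-point distribution: P(X=1)=1/2, P(X=2)=1/3, P(X=5)=1/6.
pmf = {1: Fraction(1, 2), 2: Fraction(1, 3), 5: Fraction(1, 6)}
assert sum(pmf.values()) == 1

for x, p in pmf.items():
    # E[X 1_{X=x}] = x * P(X=x), so E[X | X=x] = x * P(X=x) / P(X=x) = x.
    cond_exp = (x * p) / p
    print(f"E[X | X={x}] = {cond_exp}")
```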

Math1000
  • okay, I see it holds in the discrete case. I'm really interested in the case $P(X =x) = 0$, hence me using the integral definitions. – bcf Jul 25 '15 at 12:35
  • In that case the conditional expectation isn't well-defined, because the integral of any function over a set of measure zero is zero. – Math1000 Jul 25 '15 at 13:24
  • 3
    @Math1000: it is perfectly well defined as that is the entire point of the measure theoretic definition. – Alex R. Jul 25 '15 at 16:09
  • @AlexR. You are correct. I was a bit sleep-deprived when I wrote that comment :) – Math1000 Jul 25 '15 at 17:07
  • When $P(X=x)=0$, $E(Y\mid X=x)$ is defined as $a(x)$ where the function $a$ is measurable and such that $E(Y\mid X)=a(X)$ almost surely. The "almost surely" means that $a$ is only unique up to negligible sets for $P_X$, and this has unfortunate consequences. For example, assume that $X=Y$ is uniform on $(0,1)$ and let $a(y)=y+(z-y)\mathbf 1_{y=x}$ for some $z\ne x$. Then $E(X\mid X)=a(X)$ almost surely (one can check that the two conditions necessary for $a(X)$ to be (a version of) $E(X\mid X)$ hold) but, obviously, $E(X\mid X=x)=z\ne x$. To sum up, the result is wrong. – Did Jul 31 '15 at 16:28
  • 2
    @Did Is it really necessary to post this as a comment to the question AND all of the answers? Wouldn't it make more sense to post it as an answer? – Math1000 Jul 31 '15 at 18:00
  • Not all the answers, not necessary, not a problem either. – Did Jul 31 '15 at 19:07

Assuming that it exists at all, any such conditional density function of $X$ given $X=x$ must clearly be zero everywhere but at that point, and the integral of this function over the entire support of $X$ must be $1$ (by the definition of a density function).

This is, of course, not a well-behaved function, but it is the very definition of a generalised function known as the Dirac delta function, $\delta(s-x)$.

It is a property of this function that $\int_\Bbb R g(s)\;\delta(s-x)\operatorname d s = g(x)$.

So we have:

$$\begin{align} \mathsf E(X\mid X=x) & = \int_{X(\Omega)} s\;{\mathsf P}_{X}(\operatorname d s\mid X=x) \\ &= \int_{X(\Omega)} s\, \delta(s-x) \operatorname d s \\ & = x \end{align}$$


Note: To avoid confusion, the token for the bound variable of integration should not be that of the constant $x$.   So I have used $s$.
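The delta-function property used above can be imitated numerically (a sketch, not part of the original answer: $\delta(s-x)$ is replaced by a narrow Gaussian of width $\sigma$, and the integral is evaluated by the midpoint rule on a truncated interval):

```python
import math

def delta_approx(s, x, sigma):
    # Narrow Gaussian density approximating delta(s - x) as sigma -> 0.
    return math.exp(-(s - x) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate_s_delta(x, sigma, lo=-10.0, hi=10.0, n=200_000):
    # Midpoint rule for the integral of s * delta_sigma(s - x) over [lo, hi].
    ds = (hi - lo) / n
    total = 0.0
    for i in range(n):
        s = lo + (i + 0.5) * ds
        total += s * delta_approx(s, x, sigma) * ds
    return total

x = 1.7
for sigma in (1.0, 0.1, 0.01):
    print(f"sigma={sigma}: integral = {integrate_s_delta(x, sigma):.4f}")
```

The integral sits at $x$ for every width here because the approximating Gaussian is centered at $x$; the heuristic is that the limit $\sigma\to 0$ realizes $\int s\,\delta(s-x)\,\mathrm ds = x$.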

Graham Kemp
  • When $P(X=x)=0$, $E(Y\mid X=x)$ is defined as $a(x)$ where the function $a$ is measurable and such that $E(Y\mid X)=a(X)$ almost surely. The "almost surely" means that $a$ is only unique up to negligible sets for $P_X$, and this has unfortunate consequences. For example, assume that $X=Y$ is uniform on $(0,1)$ and let $a(y)=y+(z-y)\mathbf 1_{y=x}$ for some $z\ne x$. Then $E(X\mid X)=a(X)$ almost surely (one can check that the two conditions necessary for $a(X)$ to be (a version of) $E(X\mid X)$ hold) but, obviously, $E(X\mid X=x)=z\ne x$. To sum up, the result is wrong. – Did Jul 31 '15 at 16:28