
I have an elementary problem understanding conditional probability and expectation with reference to different measure spaces, and maybe the whole theory.

Let's take this problem as an example:

$X,Y$ are iid exponentially distributed random variables with parameter $\lambda$ on a probability space $(\Omega, \mathrm F, \mathrm P)$. Let $Z=\min\{X,Y\}$. Compute $\mathrm E(Z|X+Y=M)$ for a given $M$.


It is straightforward to compute the PDF $g(x, y)$ of $(X,Y)$: since they are independent, it is just $g_x \cdot g_y$, where $g_x, g_y$ are the densities of $X, Y$ respectively. Writing $Z=f(X,Y)=\min\{X, Y\}$, the expectation of $Z$ given $X+Y=M$ would then be

$$\mathrm E(Z|X+Y=M)=\mathrm E(f(X, Y)|X+Y=M)=\frac{\int_{\{(x,y): x+y=M\}}{f(x,y) \cdot g(x,y) \mathrm d(x,y)}}{ \mathrm P(X+Y=M)}$$

and $$ \mathrm P(X+Y=M) = \int_{\{\omega: X(\omega)+Y(\omega)=M\}}{\mathrm dP_{(X,Y)}} $$

where $ \mathrm dP_{(X,Y)} $ is the measure over the space on which $(X,Y)$ is defined.

But now the integral $\int_{\{(x,y): x+y=M\}}{f(x,y) \cdot g(x,y) \mathrm d(x,y)}$ should equal zero, because we are integrating over a line, which has Lebesgue measure zero, and the same goes for the denominator.

Could you please explain where I'm making a mistake when switching to a different measure space, e.g. from $P$ to (implicitly) Lebesgue, or point me to a PDF with the relevant theory? Or link some of the ideas above to an analysis course, which I understand more intuitively (at least when it comes to passing from integrals on manifolds to ordinary integrals).

RobPratt

1 Answer


One way to go about this is to find the conditional distribution first (if possible) and hence find the conditional expectation. This avoids issues like division by zero since, as you say, $P(X+Y=m)=0$ for any real $m$.

Here is a pedestrian way of solving the particular problem you have stated:

Note that $X+Y=\min\{X,Y\}+\max\{X,Y\}=Z+W$ where $W=\max\{X,Y\}$.

Assuming $\lambda$ is the rate parameter, joint density of $(Z,W)$ is (see this answer for details)

\begin{align} f_{Z,W}(z,w)&=2f_X(z)f_Y(w)\mathbf1_{z<w} \\&=2 \lambda^2 e^{-\lambda(z+w)}\mathbf1_{0<z<w} \end{align}

Suppose we change variables $(z,w)\mapsto (u,v)$ where $u=z$ and $v=z+w$.

This implies $z=u,w=v-u$ and the Jacobian of the transformation is $$J=J\left(\frac{z,w}{u,v}\right)=\det\begin{bmatrix}\frac{\partial z}{\partial u}&\frac{\partial z}{\partial v} \\ \frac{\partial w}{\partial u}& \frac{\partial w}{\partial v}\end{bmatrix}=1$$
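If you want to double-check the Jacobian, here is a minimal symbolic sketch using sympy (variable names are my own choice):

```python
import sympy as sp

u, v = sp.symbols('u v')

# Inverse transformation: z = u, w = v - u
z = u
w = v - u

# Jacobian matrix of (z, w) with respect to (u, v)
J = sp.Matrix([z, w]).jacobian([u, v])
print(J)        # Matrix([[1, 0], [-1, 1]])
print(J.det())  # 1
```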

So joint density of $(U,V)=(Z,Z+W)$ is

\begin{align} f_{U,V}(u,v)&=f_{Z,W}(u,v-u)|J| \\&=2\lambda^2 e^{-\lambda v}\mathbf1_{0<u<v-u} \\&=2\lambda^2 e^{-\lambda v}\mathbf1_{0<u<v/2} \end{align}
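A quick consistency check is that this density integrates to $1$ over the region $0<u<v/2$, and that integrating out $u$ recovers a familiar marginal for $V$. A symbolic sketch (sympy, my own setup):

```python
import sympy as sp

u, v, lam = sp.symbols('u v lam', positive=True)

# Joint density of (U, V) on the region 0 < u < v/2
f_uv = 2 * lam**2 * sp.exp(-lam * v)

# Marginal of V: integrate over 0 < u < v/2
inner = sp.integrate(f_uv, (u, 0, v / 2))
print(sp.simplify(inner))  # lam**2 * v * exp(-lam*v)

# Total mass: integrate the marginal over 0 < v < oo
total = sp.integrate(inner, (v, 0, sp.oo))
print(total)  # 1
```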

Now, integrating $f_{U,V}$ over $u\in(0,v/2)$ (or recalling that a sum of two iid $\text{Exp}(\lambda)$ variables is $\text{Gamma}(2,\lambda)$), we know that $X+Y=V$ has density

$$f_V(v)=\lambda^2 v e^{-\lambda v}\mathbf1_{v>0}$$
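This $\text{Gamma}(2,\lambda)$ density can also be sanity-checked by simulation. A small Monte Carlo sketch (the parameter values are arbitrary choices of mine) comparing the empirical CDF of $X+Y$ with the closed form $F_V(v)=1-e^{-\lambda v}(1+\lambda v)$:

```python
import math
import random

random.seed(0)
lam = 1.5
n = 200_000

# Simulate V = X + Y for X, Y iid Exponential(lam)
vs = [random.expovariate(lam) + random.expovariate(lam) for _ in range(n)]

v0 = 2.0
emp = sum(v <= v0 for v in vs) / n
# Gamma(2, lam) CDF: F_V(v) = 1 - e^{-lam v} (1 + lam v)
theo = 1 - math.exp(-lam * v0) * (1 + lam * v0)
print(abs(emp - theo))  # small (Monte Carlo error)
```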

Therefore conditional density of $U$ given $V$ is

\begin{align} f_{U\mid V}(u\mid v)&=\frac{f_{U,V}(u,v)}{f_V(v)} \\&=\frac{2}{v}\mathbf1_{0<u<v/2} \end{align}

In other words, distribution of $U=\min\{X,Y\}$ given $X+Y=V=v$ is uniform on $\left(0,\frac{v}{2}\right)$.

This yields $$E\left[U\mid V\right]=\frac{V}{4}$$
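As a final check, one can approximate the zero-probability conditioning event $\{X+Y=v\}$ by a thin window $\{|X+Y-v|<\varepsilon\}$ and verify both conclusions numerically; the window width and sample size below are ad hoc choices:

```python
import random

random.seed(1)
lam = 1.0
v0, eps = 3.0, 0.05

zs = []
for _ in range(1_000_000):
    x = random.expovariate(lam)
    y = random.expovariate(lam)
    # Keep samples where X + Y lands in a thin window around v0
    if abs(x + y - v0) < eps:
        zs.append(min(x, y))

cond_mean = sum(zs) / len(zs)
print(cond_mean)  # close to v0 / 4 = 0.75

# Uniformity check: P(Z < v0/4 | X + Y ~ v0) should be about 1/2
print(sum(z < v0 / 4 for z in zs) / len(zs))
```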

StubbornAtom