
Let $h(U, V)$ be a measurable function of the random variables $U$ and $V$. What can I call this pseudo conditional expectation: $$\mathbb{E}(h(U, V)\lVert V) = \int h(u, V)\, dF_U(u),$$ where $F_U$ is the distribution function of $U$? In other words, I would like to integrate out $U$ without taking into account that $U$ and $V$ are potentially dependent. This is exactly the classical conditional expectation when $U$ and $V$ are independent.
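To make the definition concrete, here is a minimal Monte Carlo sketch (the name `pseudo_cond_exp` and the particular joint law of $(U, V)$ are my own illustration, not standard): the operator averages $h(u, V)$ over draws from the marginal of $U$ with $V$ held fixed, which generally disagrees with the true conditional expectation when $U$ and $V$ are dependent.

```python
import numpy as np

rng = np.random.default_rng(0)

def pseudo_cond_exp(h, u_samples, v):
    # Average h(u, v) over draws from the *marginal* law of U,
    # holding v fixed -- i.e., integrate U out as if it were
    # independent of V, ignoring any actual dependence.
    return np.mean(h(u_samples, v))

# Illustration with dependent variables: V ~ N(0, 1), U = V + noise,
# so the marginal of U is N(0, 2) and E(U | V = v) = v.
n = 200_000
v_samples = rng.normal(size=n)
u_samples = v_samples + rng.normal(size=n)

h = lambda u, v: u + v
v = 1.5
print(pseudo_cond_exp(h, u_samples, v))  # ~ E(U) + v = 1.5
# The true conditional expectation E(h(U, V) | V = v) = 2v = 3.0 here.
```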

Note that $\mathbb{E}(h(U, V)\lVert V)$ is not invariant under changes of variables in $h(U, V)$. For example, let $h_1(U,V) = U + V$ and $h_2(W, V) = W + 2V$, where $W = U-V$ and $\mathbb{E}(V) = 0$. Then $h_1(U, V) = h_2(W, V)$, but $\mathbb{E}(h_1(U, V)\lVert V) = \mathbb{E}(U) + V$ differs from $\mathbb{E}(h_2(W, V)\lVert V) = \mathbb{E}(W) + 2V = \mathbb{E}(U) + 2V$ whenever $V\ne 0$.
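A quick numerical check of this non-invariance (the particular joint law of $(U, V)$ is my own choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# V ~ N(0, 1), so E(V) = 0; take U dependent on V with E(U) = 2.
v_samples = rng.normal(size=n)
u_samples = 2.0 + v_samples + rng.normal(size=n)
w_samples = u_samples - v_samples  # W = U - V, so E(W) = E(U) = 2

v = 1.0
# h1(U, V) = U + V and h2(W, V) = W + 2V describe the same random variable,
# but integrating out the "first" argument gives different answers:
est_h1 = np.mean(u_samples) + v       # E(h1(U, V) || V) at V = 1: E(U) + V  ~ 3
est_h2 = np.mean(w_samples) + 2 * v   # E(h2(W, V) || V) at V = 1: E(W) + 2V ~ 4
print(est_h1, est_h2)
```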

Is there a name for $\mathbb{E}(h(U, V)\lVert V)$ when $U$ and $V$ are not independent?

Ari.stat

1 Answer


Two things come to mind. Let $X,Y$ be real-valued random variables and let $P_X,P_Y$ be their (marginal) probability distributions on $(\mathbb{R},\mathscr{B}(\mathbb{R}))$.

In a measure-theoretic setting, if $(x,y)\mapsto h(x,y)$ is Borel and nonnegative, or if one of the three integrals of $|h|$ (the double integral or either iterated integral) is finite, then $x\mapsto \int h(x,y)\,P_Y(dy)$ and $y\mapsto \int h(x,y)\,P_X(dx)$ are both Borel functions. These facts usually appear among the conclusions of the Tonelli-Fubini theorem(s), which hold for $\sigma$-finite measures, so not only probability measures.
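As a concrete instance of the partial integral $x\mapsto \int h(x,y)\,P_Y(dy)$, here is a sketch where $P_Y$ is assumed to have a Lebesgue density (the choice of $h$ and the standard normal density are my own, purely for illustration):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# h is Borel with |h| <= 1, hence integrable against any product
# probability measure, so x -> ∫ h(x, y) P_Y(dy) is a well-defined
# Borel function by Tonelli-Fubini.
def h(x, y):
    return np.exp(-(x - y) ** 2)

def partial_integral(x, y_density=norm.pdf):
    # ∫ h(x, y) P_Y(dy), assuming P_Y has Lebesgue density y_density
    # (standard normal here).
    val, _ = quad(lambda y: h(x, y) * y_density(y), -np.inf, np.inf)
    return val

print(partial_integral(0.0))  # evaluates the Borel function of x at x = 0
```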

The most straightforward purely probabilistic interpretation is indeed the conditional expectation of $h(\tilde{X},\tilde{Y})$ given $\tilde{Y}$, where $\tilde{X},\tilde{Y}$ are copies in distribution of $X,Y$ that are mutually independent. Copies in distribution appear, for example, when discussing convergence in distribution: if $X_n\to^d X$ and $Y_n\to^d Y$, it is not true in general that $(X_n,Y_n)\to^d(X,Y)$; but if $X_n$ is independent of $Y_n$ for every $n$, then $(X_n,Y_n)\to^d(\tilde{X},\tilde{Y})$, where $\tilde{X},\tilde{Y}$ are mutually independent copies of $X$ and $Y$ respectively. Copies in distribution also appear when discussing estimation methods.
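A small simulation of this last point, under an assumed sequence of my own choosing: each pair $(X_n, Y_n)$ has independent coordinates, and the joint law converges to that of the independent copies $(\tilde{X},\tilde{Y})$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# X_n -> X and Y_n -> Y in distribution, with X_n independent of Y_n
# for each n. Here X_n = X' + 1/n and Y_n = Y' - 1/n with X', Y'
# independent standard normals.
for k in (1, 10, 100):
    x_n = rng.normal(size=n) + 1.0 / k
    y_n = rng.normal(size=n) - 1.0 / k
    # Joint probability P(X_n <= 0, Y_n <= 0); the limit for the pair of
    # independent copies is P(X~ <= 0) * P(Y~ <= 0) = 0.5 * 0.5 = 0.25.
    print(k, np.mean((x_n <= 0) & (y_n <= 0)))
```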

Snoop