Suppose $X$ and $Y$ are two random variables on a probability space $(\Omega, \mathscr{F}, P)$, where $\mathscr{G} \subseteq \mathscr{F}$ is a sub-$\sigma$-algebra, $X$ is $\mathscr{G}$-measurable, and $Y$ is independent of $\mathscr{G}$.
Consider $E[f(X,Y)|\mathscr{G}]$, where $f$ is a Borel-measurable function such that $f(X,Y)$ is integrable. Do we have the following?
\begin{align*} E[f(X,Y)|\mathscr{G}] &= E[f(x,Y)]\big|_{x=X}\\ &=\int_{-\infty}^{\infty} f(X,y)\,\mathrm{d}F_{Y}(y) \end{align*}

In the case when $f(x,y)$ can be written as $g(x)h(y)$, I know the result is right: pulling out the $\mathscr{G}$-measurable factor and then using independence,
$$ E[f(X,Y)|\mathscr{G}]=E[g(X)h(Y)|\mathscr{G}]=g(X)E[h(Y)|\mathscr{G}]=g(X)E[h(Y)], $$
and by linearity the same holds for finite sums of such products. In the general case, I guess it is still right. I have checked the following case.
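Just to double-check the product case against the conjectured right-hand side: evaluating $E[f(x,Y)]$ at $x=X$ gives
$$ E[f(x,Y)]\big|_{x=X} = \big(g(x)\,E[h(Y)]\big)\big|_{x=X} = g(X)\,E[h(Y)], $$
which agrees with the computation above.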
$Y$ has the distribution $P(Y=1)=p$ and $P(Y=-1)=1-p$, and $|X|<\epsilon<1$ with probability $1$, where $\epsilon$ is a constant (so $\ln(1+XY)$ is well defined and bounded). I computed $E[\ln(1+XY)|\mathscr{G}]$ in two ways: first by Taylor-expanding the logarithm and interchanging the conditional expectation with the sum, and second by the conjectured formula above, which here reads $p\ln(1+X)+(1-p)\ln(1-X)$; the two answers agree. But I cannot prove the formula in general. How can one prove it when $f$ cannot be decomposed as a product of two functions?
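For what it's worth, here is a minimal Monte Carlo sketch of that check (evidence, not a proof). Purely for the simulation I assume $\mathscr{G}=\sigma(X)$ and $X$ uniform on $(-\epsilon,\epsilon)$; neither assumption is part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)
p, eps, n = 0.3, 0.5, 10**6

# Assumptions made only for this check: G = sigma(X) with X ~ Uniform(-eps, eps);
# Y is independent of X with P(Y=1) = p, P(Y=-1) = 1 - p.
X = rng.uniform(-eps, eps, size=n)
Y = rng.choice([1.0, -1.0], size=n, p=[p, 1 - p])

# Conjectured formula, evaluated pathwise:
# E[ln(1+XY) | G] = E[ln(1+xY)] |_{x=X} = p*ln(1+X) + (1-p)*ln(1-X).
rhs = p * np.log(1 + X) + (1 - p) * np.log(1 - X)

# Tower-property consistency: E[ln(1+XY)] should equal E[rhs].
print(np.log(1 + X * Y).mean(), rhs.mean())

# Stronger check: within narrow bins of X, the sample mean of ln(1+XY)
# should match the sample mean of the conjectured conditional expectation.
bins = np.linspace(-eps, eps, 21)
idx = np.digitize(X, bins)
for k in (5, 10, 15):
    sel = idx == k
    print(k, np.log(1 + X[sel] * Y[sel]).mean(), rhs[sel].mean())
```

If the formula is right, each pair of printed numbers should agree up to Monte Carlo error, which is what I observe; but again, this is only evidence.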
Thank you!