
Let $X,Y,Z$ be integrable random variables on the same probability space.

If $X$ is independent of $Y$ conditionally on $Z$, i.e., $X|Z$ is independent of $Y|Z$, we know that for a measurable function $f$, $f(X)|Z$ is also independent of $f(Y)|Z$.

Under what extra conditions do we have that $f(X)|f(Z)$ is independent of $f(Y)|f(Z)$?

I imagine that it holds if $f$ is injective, since $\sigma(Z)=\sigma(f(Z))$ in that case. Is there a weaker sufficient condition that exploits the fact that we apply $f$ to all the random variables?
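For the injective direction, here is a sketch of why $\sigma(Z)=\sigma(f(Z))$, assuming the variables are real-valued and $f$ is Borel (the key fact is the Lusin–Suslin theorem: an injective Borel map sends Borel sets to Borel sets):

```latex
% Sketch, assuming Z real-valued and f Borel and injective.
% One inclusion is just measurability of f; for the converse,
% f(B) is Borel by Lusin--Suslin, and injectivity gives
% f^{-1}(f(B)) = B, hence:
\sigma(f(Z)) \subseteq \sigma(Z), \quad\text{and for every Borel } B:\quad
Z^{-1}(B) = (f \circ Z)^{-1}\bigl(f(B)\bigr) \in \sigma(f(Z)),
\quad\text{so } \sigma(Z) \subseteq \sigma(f(Z)).
```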

EDIT 4: Let $W$ be another integrable random variable on the same probability space. I am particularly interested in the case where $E[W|X]$ is conditionally independent of $E[W|Y]$ given $E[W|Z]$, so $f$ would be the conditional expectation here. I am hoping that whatever conditions we find on $f$ would translate into conditions on $W$ under which $E[W|X]$ is conditionally independent of $E[W|Y]$ given $E[W|Z]$ whenever $X$ is conditionally independent of $Y$ given $Z$.

See my other related questions: When is $\sigma(E[X|\mathcal F]) \subset \sigma(E[Y|\mathcal F])$? and A martingale and measurability problem

EDIT 3: What follows is false (thanks to Nate's comments and to the question Is "conditional independence" of $\sigma$-algebras implied by "set-wise conditional independence" of $\sigma$-algebras?) and is not worth reading. The reader can focus on the above.

EDIT 1: $X$ is independent of $Y$ conditionally on $Z$ iff for all $A \in \sigma(X), B\in \sigma(Y), C \in \sigma(Z)$, $P(A\cap B | C)= P(A|C) P(B|C)$. [NOTE: THIS IS FALSE; the LHS is actually stronger than the RHS, see https://math.stackexchange.com/questions/3410023/is-conditional-independence-of-sigma-algebras-implied-by-set-wise-conditio?noredirect=1&lq=1 ] And since $f$ is measurable, $\sigma(f(\cdot)) \subseteq \sigma(\cdot)$.

Hence, for all $A \in \sigma(f(X)), B\in \sigma(f(Y)), C \in \sigma(f(Z))$, $P(A\cap B | C)= P(A|C) P(B|C)$, so $f(X)|f(Z)$ is independent of $f(Y)|f(Z)$. Can someone confirm?

EDIT 2: I think the above equivalence is wrong [Yes, it was, see the above link]. The correct one should be $X$ is independent of $Y$ conditionally on $Z$ iff for all $A \in \sigma(X), B\in \sigma(Y), \exists C \in \sigma(Z)$ such that $P(A\cap B | C)= P(A|C) P(B|C)$. [NOTE: THIS IS ALSO FALSE, see Nate's comment below]

So what needs to be shown is that if $A \in \sigma(f(X)), B\in \sigma(f(Y))$, there is a $C$ in $\sigma(f(Z))$ such that $P(A\cap B | C)= P(A|C) P(B|C)$.

W. Volante
  • I'm pretty sure no such statement can be true. Without loss of generality, suppose that $X, Y, Z$ have disjoint ranges. Take $f$ to be injective on the ranges of $X$ and $Y$, and constant on the range of $Z$, so that $\sigma(f(X)) = \sigma(X)$, $\sigma(f(Y))=\sigma(Y)$, and $\sigma(f(Z))$ is trivial. Then either of your statements would "prove" that conditional independence given $Z$ implies unconditional independence, which is just false. – Nate Eldredge Feb 08 '22 at 06:52
  • In general, conditional independence is not preserved when making the conditioning $\sigma$-field either larger or smaller. In particular making all three $\sigma$-fields smaller will not improve matters. The requirement that all three be reduced using the same function $f$ is no restriction at all, because as noted, $f$ can effectively act as any three different functions on the different random variables. – Nate Eldredge Feb 08 '22 at 07:17
  • @NateEldredge So if I understand you correctly, you claim the following. Let $X$ be independent of $Y$ conditionally on $Z$. Let $f$ be a real measurable function. If $f$ is not injective, then we cannot have $f(X)|f(Z)$ is independent of $f(Y)|f(Z)$. Do you have a proof? – W. Volante Feb 08 '22 at 15:08
  • @NateEldredge Also, what are your thoughts on the equivalence: $X$ is independent of $Y$ conditionally on $Z$ iff for all $A \in \sigma(X), B\in \sigma(Y), \exists C \in \sigma(Z)$ such that $P(A\cap B | C)= P(A|C) P(B|C)$. The RHS is a weaker version of https://math.stackexchange.com/questions/3410023/is-conditional-independence-of-sigma-algebras-implied-by-set-wise-conditio?noredirect=1&lq=1 – W. Volante Feb 08 '22 at 15:10
  • I'm not saying that we cannot have independence, but that we need not. You can basically adapt the example in the second half of this answer. (By the way, I think your notation "$f(X)|f(Z)$ independent of $f(Y)|f(Z)$" is confusing, as "$f(X)|f(Z)$" is not really a mathematical object in itself. The normal wording would just be "$f(X)$ and $f(Y)$ are conditionally independent given $f(Z)$.) – Nate Eldredge Feb 08 '22 at 15:11
  • As I mentioned in my previous comment, that "equivalence" is not correct. Take any example where $X,Y$ are (unconditionally) independent, but are not conditionally independent given $Z$ (like the first example in my previously linked answer). Then taking $C = \Omega$, which is certainly in $\sigma(Z)$, would cause the right side of your "equivalence" to hold. – Nate Eldredge Feb 08 '22 at 15:14
  • @NateEldredge I see, thank you! So the question of finding conditions on $f$ weaker than injective/bijective remains open? Since need not and cannot are not the same, I guess there must be some set of conditions under which we have conditional independence? – W. Volante Feb 08 '22 at 15:26
  • My feeling is that looking for conditions on $f$ is hopeless. If $f$ is injective or constant then the claim is trivial, and in all other cases, where $f$ separates at least two values and merges two others, you can construct $X,Y,Z$ so as to form a version of my previous counterexample. – Nate Eldredge Feb 08 '22 at 15:29
  • @NateEldredge I see, I added a particular case to maybe simplify the problem. – W. Volante Feb 08 '22 at 15:48
  • I feel like we are going around and around some deeper misconception about conditional expectation and conditional independence, but I can't tell what it might be. – Nate Eldredge Feb 08 '22 at 16:25
  • @NateEldredge $E[W|Z],E[W|X],E[W|Y]$ are functions of $Z,X,Y$ respectively. The mapping associating a random variable $U$ to $E[W|U]$ where $W$ is fixed is the $f$ I am interested in. – W. Volante Feb 08 '22 at 16:54
  • I see, I misread. But still, it's not the same $f$. That is, if $E[W|U] = f(U)$, it is not usually true that $E[W|X] = f(X)$ for the same function $f$. – Nate Eldredge Feb 08 '22 at 16:56
  • @NateEldredge Interesting, so this means that even solving this question about $f$ would not help me with my conditional expectations. My last hope is then the 2 other questions I linked, especially the one about the martingale. – W. Volante Feb 08 '22 at 17:03
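Nate's counterexample can be made concrete with a toy distribution (the specific numbers below are my own choice, not from the thread): take $Z$ fair on $\{2,3\}$, and given $Z$ let $X,Y$ be i.i.d. Bernoulli with success probability $0.1$ if $Z=2$ and $0.9$ if $Z=3$, so $X$ and $Y$ are conditionally independent given $Z$ by construction. With $f$ the identity on $\{0,1\}$ and constant on $\{2,3\}$, $\sigma(f(Z))$ is trivial, so conditional independence of $f(X)$ and $f(Y)$ given $f(Z)$ would mean plain independence of $X$ and $Y$, which fails:

```python
# Toy version of the counterexample: X ⊥ Y | Z by construction,
# but X and Y are not unconditionally independent.
p_z = {2: 0.5, 3: 0.5}   # distribution of Z (range disjoint from {0, 1})
p1 = {2: 0.1, 3: 0.9}    # P(X = 1 | Z = z) = P(Y = 1 | Z = z)

# P(X = 1, Y = 1), computed using conditional independence given Z
p_xy = sum(p_z[z] * p1[z] ** 2 for z in p_z)  # 0.5*0.01 + 0.5*0.81 = 0.41
p_x = sum(p_z[z] * p1[z] for z in p_z)        # 0.5, and same for Y

print(p_xy, p_x * p_x)  # roughly 0.41 vs 0.25: X and Y are not independent
```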

1 Answer


Too long for a comment.

By definition, $\mathscr{F}$ and $\mathscr{F}'$ are independent given $\mathscr{H}$ if for every pair of events $E, E'$ in $\mathscr{F}, \mathscr{F}',$ respectively, we have $$ P(E \cap E' \mid \mathscr{H}) = P(E \mid \mathscr{H}) P(E' \mid \mathscr{H}). $$ This means that for every event $H \in \mathscr{H},$ we have $$ P(E \cap E' \cap H) = \int\limits_H P(E \mid \mathscr{H}) P(E' \mid \mathscr{H}) \, dP. $$ This is quite different from what you have as the "definition" in your EDIT 1.

Suppose $\mathscr{H} = \sigma(Z)$ and let $Z' = f(Z),$ $\mathscr{H}' = \sigma(Z').$ Clearly, $\mathscr{H}' \subset \mathscr{H}.$ You want to prove that for every $E, E'$ as before and $H'$ in $\mathscr{H}',$ the following equality holds: $$ P(E \cap E' \cap H') = \int\limits_{H'} P(E \mid \mathscr{H}') P(E' \mid \mathscr{H}') \, dP. $$ There does not seem to be a direct connection between this and what you want. (Note that when you actually have $\mathscr{F} = \sigma(X)$ and $\mathscr{F}' = \sigma(X'),$ you can prove easily that $f(X)$ and $f'(X')$ are independent given $\mathscr{H},$ since $\{f(X) \in A\}$ and $\{f'(X') \in A'\}$ belong to $\mathscr{F}$ and $\mathscr{F}',$ respectively, so your argument of restriction works smoothly.)
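For discrete $Z$ the displayed definition can be checked mechanically: every $\sigma(Z)$-event $H$ is a union of level sets $\{Z = z\},$ and the integral becomes a sum over those $z.$ A minimal sketch with a made-up joint pmf (my own toy numbers, not from the answer) in which $X$ and $Y$ are i.i.d. given $Z$:

```python
from itertools import product

p_z = {2: 0.5, 3: 0.5}  # P(Z = z)
p1 = {2: 0.1, 3: 0.9}   # P(X = 1 | Z = z) = P(Y = 1 | Z = z)

# Joint pmf of (X, Y, Z), built so that X and Y are i.i.d. given Z.
joint = {(x, y, z): p_z[z]
         * (p1[z] if x else 1 - p1[z])
         * (p1[z] if y else 1 - p1[z])
         for x, y, z in product((0, 1), (0, 1), p_z)}

def p(event):
    """Probability of the set of outcomes (x, y, z) satisfying event."""
    return sum(q for xyz, q in joint.items() if event(*xyz))

# Check P(E ∩ E' ∩ H) = ∫_H P(E | Z) P(E' | Z) dP with E = {X=1}, E' = {Y=1},
# where the integral over H is a sum of conditional products weighted by P(Z=z).
for H in ({2}, {3}, {2, 3}):
    lhs = p(lambda x, y, z: x == 1 and y == 1 and z in H)
    rhs = sum(p(lambda x, y, zz, z=z: x == 1 and zz == z) / p_z[z]    # P(E | Z=z)
              * p(lambda x, y, zz, z=z: y == 1 and zz == z) / p_z[z]  # P(E' | Z=z)
              * p_z[z]                                                # weight P(Z=z)
              for z in H)
    assert abs(lhs - rhs) < 1e-12
print("definition verified on all sigma(Z)-atoms and their union")
```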

William M.