The claim in the title seems very plausible, since the characteristic function "characterizes", i.e. determines, the distribution of $X$, but I don't know how to derive it. There is a similar result for the characteristic functions of two random variables, e.g. here, but I'm not sure whether it can be deduced from that.

Any help would be appreciated!

2 Answers

If $Z$ is any bounded function measurable with respect to $\mathcal F$, then $Z$ is a uniform limit of simple functions measurable with respect to $\mathcal F$. From this it follows that $Ee^{i\langle x, X \rangle } e^{i\langle y, Y \rangle } =Ee^{i\langle x, X \rangle } Ee^{i\langle y, Y \rangle}$ for all $x,y$ and for every $\mathcal F$-measurable random variable $Y$. It follows that $X$ is independent of every $\mathcal F$-measurable random variable $Y$, and hence of $\mathcal F$.
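To spell out the first step (a sketch, assuming the condition in the title reads $E[e^{i\langle x, X \rangle}1_F]=E[e^{i\langle x, X \rangle}]\,P(F)$ for all $F\in\mathcal F$): for a simple $\mathcal F$-measurable function $Z=\sum_k c_k 1_{F_k}$ with $F_k\in\mathcal F$, linearity of expectation gives

$$E\big[e^{i\langle x, X \rangle}Z\big]=\sum_k c_k\,E\big[e^{i\langle x, X \rangle}1_{F_k}\big]=\sum_k c_k\,E\big[e^{i\langle x, X \rangle}\big]P(F_k)=E\big[e^{i\langle x, X \rangle}\big]\,E[Z].$$

A uniform limit (applied to the real and imaginary parts of $Z$ separately) then extends this identity to every bounded $\mathcal F$-measurable $Z$, in particular to $Z=e^{i\langle y, Y \rangle}$.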

  • If $Z_n$ is $\frac {i-1} {2^{n}}$ when $Z \in [\frac {i-1} {2^{n}},\frac i {2^{n}})$ then $|Z_n-Z|\leq \frac 1 {2^{n}}$, so $Z$ is a uniform limit of the $Z_n$'s. Since $Z$ is bounded, $Z \in [\frac {i-1} {2^{n}},\frac i {2^{n}})$ is possible for only finitely many values of $i$, so each $Z_n$ is a simple function (see the sketch after these comments). – Kavi Rama Murthy Mar 02 '19 at 11:49
  • In your answer, do you mean that since $e^{is 1_F}=1_{F^C}+1_F e^{is}$ we have your equation for indicator functions, and then this generalizes to simple functions etc.? – MrFranzén Mar 02 '19 at 12:24
  • @MrFranzén Yes, exactly. First take linear combinations and then take limits. – Kavi Rama Murthy Mar 02 '19 at 12:26
  • Sorry for my previous comment (it seems you have responded to it); I overlooked "bounded" somehow. – zhoraster Mar 02 '19 at 12:27
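A brief sketch of the limit step discussed in the comments (with $W$ introduced here as shorthand for $e^{i\langle x, X \rangle}$, so $|W|=1$): if $E[WZ_n]=E[W]\,E[Z_n]$ holds for each simple $Z_n$ and $Z_n\to Z$ uniformly, then

$$\big|E[WZ_n]-E[WZ]\big|\leq E\big[|W|\,|Z_n-Z|\big]\leq \sup_\omega |Z_n(\omega)-Z(\omega)|\to 0,$$

and likewise $E[Z_n]\to E[Z]$, so the factorization $E[WZ]=E[W]\,E[Z]$ passes to the limit.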
Reading my own post one year later, I would supply the following answer. By Kac's theorem, to prove independence of $X$ and $1_F$ it suffices to prove that for all $s,t \in \mathbb R^d$ (where $1_F$ is read as the constant vector $(1_F,\dots,1_F)\in\mathbb R^d$, so that $\langle 1_F , t \rangle = 1_F\sum_{i=1}^d t_i$) we have that

$$E[e^{i \langle (X, 1_F), (s,t) \rangle}]= E[e^{i \langle X , s \rangle } ]E[e^{i \langle 1_F , t \rangle } ] $$

But since $e^{i \langle 1_F , t \rangle } = 1_{F^C } + 1_F e^{i\sum_{i=1} ^d t_i} $ and $e^{i \langle (X, 1_F), (s,t) \rangle}= 1_{F^C } e^{i \langle X, s \rangle} + 1_F e^{i \langle X, s \rangle}e^{i\sum_{i=1} ^d t_i} $, and since expectation is additive and constants can be moved outside of it, proving the condition in the title of the question is sufficient.
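To make the reduction explicit (again assuming the title condition is $E[1_F e^{i \langle X, s \rangle}]=P(F)\,E[e^{i \langle X, s \rangle}]$ for all $F\in\mathcal F$): taking expectations of the two identities above gives

$$E\big[e^{i \langle (X, 1_F), (s,t) \rangle}\big]= E\big[1_{F^C } e^{i \langle X, s \rangle}\big] + e^{i\sum_{i=1} ^d t_i}\,E\big[1_F e^{i \langle X, s \rangle}\big]$$

and

$$E\big[e^{i \langle X , s \rangle }\big]E\big[e^{i \langle 1_F , t \rangle }\big]= P(F^C)\,E\big[e^{i \langle X, s \rangle}\big] + P(F)\,e^{i\sum_{i=1} ^d t_i}\,E\big[e^{i \langle X, s \rangle}\big].$$

The title condition applied to $F$, together with $1_{F^C}=1-1_F$ (which gives it for $F^C$ as well), matches the two right-hand sides term by term, so the required equality of characteristic functions holds and Kac's theorem yields the independence of $X$ and $1_F$.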