
Let $(\Omega, \mathcal{H},\mathbb{P})$ be a probability space and let $X : \Omega \to \mathbb{R}$ be a random variable. Is it always possible to introduce an additional RV $Y: \Omega \to \mathbb{R}$ such that $X$ and $Y$ are independent and identically distributed?

This issue comes up in a proof in Cinlar's Probability and Stochastics, where, essentially, we want to show that $\operatorname{Var}(X)< \infty$ provided a certain property $P$ of $X$ holds. We first show it assuming $\mathbb{E}(X) = 0$, then extend it to all integrable $X$: supposing $Y$ is some RV i.i.d. with $X$, we show that if $P(X)$ holds then so does $P(X-Y)$ (this is a consequence of how $P$ is defined and relies on the independence of $X$ and $Y$); since $\mathbb{E}(X-Y) = 0$, the first case gives $2 \operatorname{Var}(X) = \operatorname{Var}(X-Y) < \infty$.
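(The last equality is just additivity of variance for independent variables:
$$\operatorname{Var}(X-Y) = \operatorname{Var}(X) + \operatorname{Var}(-Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) = 2\operatorname{Var}(X),$$
using that $X$ and $Y$ are independent and identically distributed.)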

But there seems to be an issue with doing this: how do we know such a $Y$ exists? E.g., suppose our probability space is $\Omega = (0,1)$, $\mathcal{H} = \mathcal{B}_{(0,1)}$, $\mathbb{P} = \lambda$, where $\lambda$ is Lebesgue measure on $(0,1)$, and let $X : \Omega \to \mathbb{R}$ be the identity. Then $\sigma(X)$, the $\sigma$-algebra generated by $X$, is all of $\mathcal{B}_{(0,1)}$. Thus the only $\sigma$-algebras independent of it are trivial, i.e., they can only contain events with probability $0$ or $1$. So the question is, how do we justify the above procedure, i.e., how can we introduce new i.i.d. RVs?

Note that what Cinlar does is even worse, since he has some sequence $\{X_n\}$ of independent RVs, and he wants to introduce a new sequence $\{Y_n\}$, independent of $\{X_n\}$, such that each $Y_j$ has the same distribution as $X_j$.

  • A counter-example is below. However, the idea is that if you have a system with a random variable $X$, and you want to consider another independent random variable $Y$, you just extend the probability space to allow that independent source of randomness. – Michael Dec 24 '20 at 18:43
  • As long as your probability space allows a uniformly distributed variable $U \sim \mathrm{Uniform}([0,1])$ that is independent of some other collection of variables $\{X_n\}$ that you have, as deterministic functions of $U$ you can construct an infinite sequence of i.i.d. variables $\{U_i\}_{i=1}^{\infty}$, all with whatever distribution you want, all independent of $\{X_n\}$. – Michael Dec 24 '20 at 18:45
  • I will add that extension techniques are standard method. See, e.g., Hermann Thorisson, "Coupling, Stationarity, and Regeneration", chapter 3. – Botnakov N. Dec 24 '20 at 18:49
  • I don't know the reference, but I suppose what was meant is that it is possible to construct a probability space on which the RVs are i.i.d.; see https://math.stackexchange.com/questions/250145/existence-of-iid-random-variables – Syd Dec 26 '20 at 00:24
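Michael's second comment can be made concrete. The sketch below (function names are mine, purely illustrative) uses the fact that the binary digits of a single $U \sim \mathrm{Uniform}([0,1])$ are i.i.d. fair coin flips, so routing the even-indexed digits to one number and the odd-indexed digits to another yields two independent $\mathrm{Uniform}([0,1])$ variables, each a deterministic function of $U$:

```python
def bits_of(u, n):
    """Return the first n binary digits of u in [0, 1)."""
    out = []
    for _ in range(n):
        u *= 2
        b = int(u)
        out.append(b)
        u -= b
    return out

def from_bits(bits):
    """Reassemble a number in [0, 1) from a list of binary digits."""
    x = 0.0
    for b in reversed(bits):
        x = (x + b) / 2
    return x

def split_uniform(u, n=52):
    """Split one uniform into two independent uniforms via alternating digits."""
    b = bits_of(u, 2 * n)
    return from_bits(b[0::2]), from_bits(b[1::2])
```

The same digit-splitting trick, applied with countably many disjoint digit subsequences, produces the whole i.i.d. sequence $\{U_i\}_{i=1}^{\infty}$ from the comment.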

1 Answer


Suppose $\Omega = \{1, 2 \}$, $\mathcal{F} = 2^{\Omega}$, $P(1) = P(2) = \frac12$.

Put $X(\omega) = \omega$. It is easy to see that this is a counterexample.

Indeed, suppose that $Y$ has the same distribution as $X$ and that $X$ and $Y$ are independent. Then $P(X = 1, Y =1)$ must equal $P(X=1)P(Y=1) = \frac{1}{4}$, but there is no event in $\mathcal{F}$ with that probability: every event has probability $0$, $\frac12$, or $1$.
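As a sanity check (not part of the original answer), one can enumerate every candidate $Y$ on this two-point space and verify that none is an independent copy of $X$:

```python
from itertools import product

# The two-point space from the answer: Omega = {1, 2}, P(1) = P(2) = 1/2.
omega = (1, 2)
P = {1: 0.5, 2: 0.5}
X = {w: w for w in omega}  # X(omega) = omega

def prob(event):
    """Probability of the set {omega : event(omega) is true}."""
    return sum(P[w] for w in omega if event(w))

found_independent_copy = False
for y1, y2 in product((1, 2), repeat=2):
    Y = {1: y1, 2: y2}
    # Y must have the same distribution as X: P(Y = 1) = P(Y = 2) = 1/2.
    if any(prob(lambda w: Y[w] == v) != 0.5 for v in (1, 2)):
        continue
    # Independence: joint probabilities must factor for every pair of values.
    if all(prob(lambda w: (X[w], Y[w]) == (a, b))
           == prob(lambda w: X[w] == a) * prob(lambda w: Y[w] == b)
           for a, b in product((1, 2), repeat=2)):
        found_independent_copy = True

print(found_independent_copy)  # False: no independent copy of X exists here
```

The only candidates with the right distribution are $Y = X$ and $Y = 3 - X$, and both fail the factorization test, exactly as the probability-$\frac14$ argument predicts.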

Botnakov N.