Let $(\Omega, \mathcal{H},\mathbb{P})$ be a probability space and let $X : \Omega \to \mathbb{R}$ be a random variable. Is it always possible to introduce an additional random variable $Y: \Omega \to \mathbb{R}$ such that $X$ and $Y$ are independent and identically distributed?
This issue comes up in a proof in Cinlar's Probability and Stochastics, where, essentially, we want to show that $Var(X)< \infty$ provided a certain property $P$ of $X$ holds. We first prove this assuming $\mathbb{E}(X) = 0$, and then extend it to all integrable $X$ as follows: suppose $Y$ is a random variable independent of $X$ with the same distribution, and show that if $P(X)$ holds, then $P(X-Y)$ holds as well (this is a consequence of how $P$ is defined and relies on the independence of $X$ and $Y$). Since $\mathbb{E}(X-Y) = 0$, the zero-mean case applies to $X-Y$, and then $2\, Var(X) = Var(X-Y) < \infty$.
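To spell out the last equality (it uses only the independence of $X$ and $Y$ and the fact that they share the same distribution, and both sides are allowed to be $+\infty$): since $\mathbb{E}(X-Y)=0$,
$$Var(X-Y) = \mathbb{E}\big[(X-Y)^2\big] = \mathbb{E}(X^2) - 2\,\mathbb{E}(X)\,\mathbb{E}(Y) + \mathbb{E}(Y^2) = 2\big(\mathbb{E}(X^2) - \mathbb{E}(X)^2\big) = 2\, Var(X).$$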
But there seems to be an issue with this: how do we know such a $Y$ exists? For example, suppose our probability space is $\Omega = (0,1)$, $\mathcal{H} = \mathcal{B}_{(0,1)}$, $\mathbb{P} = \lambda$, where $\lambda$ is Lebesgue measure on $(0,1)$, and let $X : \Omega \to \mathbb{R}$ be the identity map. Then $\sigma(X)$, the $\sigma$-algebra generated by $X$, is all of $\mathcal{B}_{(0,1)}$. Thus the only $\sigma$-algebras independent of it are trivial, i.e., every event in them has probability $0$ or $1$. So the question is: how do we justify the above procedure, i.e., how can we introduce new iid random variables?
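To make the triviality claim explicit: any event $A$ lying in a $\sigma$-algebra independent of $\sigma(X) = \mathcal{B}_{(0,1)}$ is in particular independent of itself, so
$$\mathbb{P}(A) = \mathbb{P}(A \cap A) = \mathbb{P}(A)^2,$$
which forces $\mathbb{P}(A) \in \{0,1\}$. Hence any random variable on this space that is independent of $X$ is a.s. constant, and in particular cannot have the same (uniform) distribution as $X$.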
Note that what Cinlar does is even more demanding, since he has a sequence $\{X_n\}$ of independent random variables and wants to introduce a new sequence $\{Y_n\}$, independent of $\{X_n\}$, such that each $Y_j$ has the same distribution as $X_j$.
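As I understand it, the requirement here is that the whole sequence $\{Y_n\}$ be independent of the whole sequence $\{X_n\}$, i.e., for every $n$ and all Borel sets $A, B \subseteq \mathbb{R}^n$,
$$\mathbb{P}\big((X_1,\dots,X_n) \in A,\ (Y_1,\dots,Y_n) \in B\big) = \mathbb{P}\big((X_1,\dots,X_n) \in A\big)\,\mathbb{P}\big((Y_1,\dots,Y_n) \in B\big),$$
together with $Y_j$ having the same distribution as $X_j$ for every $j$, so the same existence question arises for the entire sequence at once.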