
Let $X_1, X_2, ...$ and $Y_1, Y_2, ...$ be sequences of random variables such that

$$ X_n \stackrel{d}{\rightarrow} X \quad\text{and}\quad Y_n \stackrel{d}{\rightarrow} Y \quad\text{as}\quad n\rightarrow \infty. $$

Suppose further that $X_n$ and $Y_n$ are independent for all $n$. Is it possible to find such sequences so that $X$ and $Y$ are dependent?

Context: Up to the last sentence, this is Theorem 6.6.6 in Allan Gut's "An Intermediate Course in Probability". He makes a point of also requiring that $X$ and $Y$ be independent before stating the conclusion of the theorem, namely that $X_n + Y_n \stackrel{d}{\rightarrow} X+Y$ as $n\rightarrow\infty$. As far as I can see, this assumption is not explicitly used in the proof (which is via characteristic functions). Anyhow, I was mostly curious to see an example where dependence is introduced only in the limit.


2 Answers


Yes, this is possible.

Since you are only interested in convergence in distribution, $X_n\stackrel{d}{\rightarrow} X$ implies that $X_n\stackrel{d}{\rightarrow} \tilde X$ whenever $\mathcal L(\tilde X)=\mathcal L(X)$: the limit is determined only up to its distribution. Hence, independence of the limits is not an issue at all.

For a specific example, consider $X_n, Y_n \sim \mathcal N(0,1)$ iid and set $X=X_1$ and $Y=-X_1$. Then $Y\sim\mathcal N(0,1)$ as well, so $X_n\stackrel{d}{\rightarrow}X$ and $Y_n\stackrel{d}{\rightarrow}Y$ trivially, while $X$ and $Y$ are perfectly dependent.

Note, however, that the theorem you quote is about a more specific situation. There, the assumption that $X$ and $Y$ are independent cannot be dropped: in the example above, $X_n+Y_n\sim\mathcal N(0,2)$ for every $n$, whereas $X+Y=0$, so $X_n+Y_n\stackrel{d}{\rightarrow} X+Y$ is clearly false.
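For what it's worth, here is a minimal NumPy sketch of this counterexample (the sample size and variable names are my own choices): all four marginals look standard normal, the limits are perfectly negatively dependent, and $X_n+Y_n$ has variance about $2$ while $X+Y$ is identically $0$.

```python
# Monte Carlo sketch of the counterexample: X_n, Y_n iid N(0,1),
# with limits X = X_1 and Y = -X_1. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo sample size

# One pair (X_n, Y_n): independent standard normals.
xn = rng.standard_normal(N)
yn = rng.standard_normal(N)

# The chosen limits: X = X_1 and Y = -X_1.
x = rng.standard_normal(N)
y = -x

# All four marginals look N(0,1)...
print("variances:", xn.var(), yn.var(), x.var(), y.var())  # all ~ 1

# ...but the joint behaviour differs drastically.
print("corr(X_n, Y_n):", np.corrcoef(xn, yn)[0, 1])  # ~ 0
print("corr(X, Y):    ", np.corrcoef(x, y)[0, 1])    # exactly -1

# Hence the sums differ: X_n + Y_n ~ N(0, 2), but X + Y = 0.
print("var(X_n + Y_n):", (xn + yn).var())  # ~ 2
print("var(X + Y):    ", (x + y).var())    # exactly 0
```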

– Mars Plastic

The confusion is probably due to the difference between joint and marginal convergence. If we knew that
$$ (X_n,Y_n)\xrightarrow{d}(X,Y), $$
where each $(X_n,Y_n)$ is a vector of independent random variables, then $X$ and $Y$ would also be independent. Indeed, considering characteristic functions, we are given that
$$ \varphi_{(X_n,Y_n)}(t,s)\to\varphi_{(X,Y)}(t,s)\label{1}\tag{1} $$
for all $t,s\in\mathbb{R}$. On the other hand, by the independence of $X_n$ and $Y_n$ and the convergence of the marginals,
$$ \varphi_{(X_n,Y_n)}(t,s)=\mathsf{E}e^{itX_n}\,\mathsf{E}e^{isY_n}\to \mathsf{E}e^{itX}\,\mathsf{E}e^{isY}=\varphi_{X}(t)\varphi_Y(s).\label{2}\tag{2} $$
Comparing $\eqref{1}$ and $\eqref{2}$ gives $\varphi_{(X,Y)}(t,s)=\varphi_X(t)\varphi_Y(s)$ for all $t,s$, so $X$ and $Y$ are independent (see, e.g., this question).
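As a sanity check on $\eqref{1}$ versus $\eqref{2}$, one can estimate both sides by Monte Carlo. The sketch below (NumPy assumed; the test point $(t,s)$ and helper names are illustrative) shows the joint characteristic function factoring for an independent pair and failing to factor for the dependent pair $(X,-X)$.

```python
# Monte Carlo estimates of characteristic functions at a fixed point (t, s).
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
t, s = 1.0, 0.7  # arbitrary test point

def cf_joint(x, y):
    """Estimate of the joint CF E[exp(i(tX + sY))]."""
    return np.mean(np.exp(1j * (t * x + s * y)))

def cf(z, u):
    """Estimate of the marginal CF E[exp(iuZ)]."""
    return np.mean(np.exp(1j * u * z))

# Independent pair: the joint CF factors, as in (2).
xn, yn = rng.standard_normal(N), rng.standard_normal(N)
print(cf_joint(xn, yn), cf(xn, t) * cf(yn, s))  # approximately equal

# Dependent pair (X, -X): the joint CF does not factor.
x = rng.standard_normal(N)
print(cf_joint(x, -x), cf(x, t) * cf(-x, s))
# joint ~ exp(-(t-s)^2/2) ≈ 0.956, product ~ exp(-(t^2+s^2)/2) ≈ 0.475
```

Since the joint characteristic function of $(X,-X)$ at $(t,s)$ equals $e^{-(t-s)^2/2}$ while the product of the marginal characteristic functions is $e^{-(t^2+s^2)/2}$, the mismatch shows that $(X_n,Y_n)$ does not converge jointly to the dependent pair, even though the marginals converge.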

If we know only that the marginal distributions converge, i.e., $X_n\xrightarrow{d}X$ and $Y_n\xrightarrow{d}Y$, then we have $\eqref{2}$ but not $\eqref{1}$. This means that we are free to pick independent $X$ and $Y$ as the limiting random variables. However, as the example in the other answer shows, the chosen limits need not be independent.

  • Thanks. I was thinking in terms of the limit of $X_n$ and $Y_n$ becoming dependent, rather than having the choice to pick $X$ and $Y$ as the limits. That got me confused. – Daniel May 26 '21 at 14:43