Let $X_1, X_2, ...$ and $Y_1, Y_2, ...$ be sequences of random variables such that
$$ X_n \stackrel{d}{\rightarrow} X \quad\text{and}\quad Y_n \stackrel{d}{\rightarrow} Y \quad\text{as}\quad n\rightarrow \infty. $$
Suppose further that $X_n$ and $Y_n$ are independent for all $n$. Is it possible to find such sequences so that $X$ and $Y$ are dependent?
Context: Up to the last sentence, this is Theorem 6.6.6 in Allan Gut's *An Intermediate Course in Probability*. He makes a point of also requiring that $X$ and $Y$ be independent before concluding that $X_n + Y_n \stackrel{d}{\rightarrow} X+Y$ as $n\rightarrow\infty$. As far as I can see, the independence of $X$ and $Y$ is not explicitly used in the proof (which uses characteristic functions). In any case, I was mostly curious to see an example where independence is introduced only in the limit.
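My attempt, for what it's worth (a sketch, relying only on the fact that convergence in distribution constrains the *marginal* laws of the limits, not their joint law): let $Z$ and $W$ be independent standard normal random variables, and set
$$ X_n = Z \quad\text{and}\quad Y_n = W \quad\text{for all } n. $$
Then $X_n$ and $Y_n$ are independent for every $n$, and trivially $X_n \stackrel{d}{\rightarrow} Z$ and $Y_n \stackrel{d}{\rightarrow} W$. But since $W \stackrel{d}{=} Z$, we also have $Y_n \stackrel{d}{\rightarrow} Z$, so it seems we may take $X = Y = Z$ as the limits, which are as dependent as possible (equal almost surely). If this is right, it suggests the point of Gut's extra hypothesis is not that dependent limits are impossible, but that without it the distribution of $X + Y$ is not determined by the marginal limits alone.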