I just encountered the following counterexample, which illustrates that marginal convergence in distribution does not imply joint convergence in distribution:
Let $U$ and $V$ be independent standard normal random variables and define $$X_n = U$$ and $$Y_n = (-1)^n \cdot \frac{1}{2} \cdot U + \sqrt{\frac{3}{4}} \cdot V.$$ Then $X_n$ and $Y_n$ are both standard normal for all $n$, and hence trivially converge in law marginally. But $$\text{cov}(X_n, Y_n) = \frac{(-1)^n}{2}$$ for all $n$, and so the sequence $(X_n, Y_n)$ of random vectors cannot converge in law.
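For reference, the covariance value follows from bilinearity together with the independence of $U$ and $V$ (so $\text{cov}(U, V) = 0$) and $\text{var}(U) = 1$:
$$\text{cov}(X_n, Y_n) = \text{cov}\!\left(U,\ (-1)^n \cdot \frac{1}{2} \cdot U + \sqrt{\frac{3}{4}} \cdot V\right) = (-1)^n \cdot \frac{1}{2} \cdot \text{var}(U) + \sqrt{\frac{3}{4}} \cdot \text{cov}(U, V) = \frac{(-1)^n}{2}.$$
(Similarly, $\text{var}(Y_n) = \frac{1}{4} + \frac{3}{4} = 1$, which is why each $Y_n$ is standard normal.)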
Here is my question (and I feel the answer is straightforward, I just can't see it): why does the alternating covariance rule out convergence in distribution of the vector $(X_n, Y_n)$?
Many thanks for any help, it is much appreciated. And sorry again if the solution is straightforward...