Assume we have two sequences $X_1, X_2, \ldots$ and $Y_1, Y_2, \ldots$ such that all of these random variables together form an independent family. Assume further that the partial sums $\sum\limits_{k=1}^N X_k$ and $\sum\limits_{k=1}^N Y_k$ converge in probability, as $N \to \infty$, to some random variables $X$ and $Y$. Are $X$ and $Y$ then also independent?
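In symbols (writing $S_N$ and $T_N$ for the partial sums, names introduced here only for reference), the hypothesis is
$$S_N := \sum_{k=1}^{N} X_k \xrightarrow{\,\mathbb{P}\,} X \quad\text{and}\quad T_N := \sum_{k=1}^{N} Y_k \xrightarrow{\,\mathbb{P}\,} Y \qquad (N \to \infty).$$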
Is there a proof of this that relies purely on the definition of independence of $\sigma$-algebras, without using characteristic functions etc.?
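To make the request precise: with independence of $\sigma$-algebras defined in the usual way, such a proof would verify directly that
$$\mathbb{P}(A \cap B) = \mathbb{P}(A)\,\mathbb{P}(B) \qquad \text{for all } A \in \sigma(X),\ B \in \sigma(Y),$$
presumably by relating $\sigma(X)$ and $\sigma(Y)$ to the independent $\sigma$-algebras $\sigma(X_1, X_2, \ldots)$ and $\sigma(Y_1, Y_2, \ldots)$.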