
Suppose $X_1,\ldots, X_n$ are (independent) RVs. What does it mean to say that $X_1',\ldots, X_n'$ is an independent copy of $X_1,\ldots, X_n$?

Does it mean that each $X_i'$ is independent of $X_i$ or does it mean that the joint distribution of $(X_1,\ldots, X_n)$ is the same as the joint distribution of $(X_1',\ldots, X_n')$? Or does it mean something else entirely?

I find the term a bit confusing, since I am not sure how something can be both independent and a copy (being a copy would seem to imply dependence).

  • Joint distribution is the same and $(X_1',\dots,X_n')$ is independent of $(X_1,\dots,X_n)$. – zhoraster Oct 07 '16 at 05:34
  • What does it mean for a distribution (function) to be independent of another? – The Substitute Oct 07 '16 at 06:37
  • OK, let me put it the other way around: $(X_1',\dots,X_n')$ is independent of $(X_1,\dots,X_n)$, and the joint distribution is the same. Can you see which "is" corresponds to which word now? – zhoraster Oct 07 '16 at 07:12

1 Answer


Consider the case $n=1$. When we say that $X'$ is an independent copy of $X$, we mean that the distribution of $X'$ is the same as the distribution of $X$ AND that $X$ and $X'$ are independent. (You can use any other symbol in place of $X'$.) There is no contradiction in the terminology: "copy" refers to the distribution, not to the realized values. The expression is usually used when you want to repeat the exact same statistical experiment a number of times, independently, which is why people tend to reuse the same symbol with a prime or a tilde.
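
As a concrete illustration (a minimal sketch in Python/NumPy; the standard normal distribution and the sample size are arbitrary choices, not part of the question), an independent copy is obtained simply by drawing again, independently, from the same distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# X and X' are drawn from the same distribution (standard normal here),
# using independent draws of the generator, so X' is an independent copy of X.
x = rng.standard_normal(n_samples)        # realizations of X
x_prime = rng.standard_normal(n_samples)  # realizations of X'

# Same distribution: sample means and variances agree up to sampling error.
print(np.mean(x), np.mean(x_prime))  # both close to 0
print(np.var(x), np.var(x_prime))    # both close to 1

# Independence (checked here only through correlation): close to 0.
print(np.corrcoef(x, x_prime)[0, 1])
```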

For example, one use of Monte Carlo simulation is to approximate the expectation of a function $\varphi(X_1, X_2, \dots, X_n)$ of a random vector \begin{equation} (X_1, X_2, \dots, X_n) \sim P_X. \end{equation}

To do this, we need realizations of a number of independent copies of $(X_1, X_2, \dots, X_n)$. In your question there are only two: the original vector and one copy. In Monte Carlo methods, we use as many copies as possible. The estimator of the expectation is then given by

\begin{equation} \hat{\mathbb{E}}[\varphi(X_1, X_2, \dots, X_n)] := \frac{1}{M} \sum_{m=1}^M \varphi(X^{(m)}_1, X^{(m)}_2, \dots, X^{(m)}_n), \end{equation} in which the vectors \begin{equation} (X^{(m)}_1, X^{(m)}_2, \dots, X^{(m)}_n), \quad m \in \{1, \dots, M\}, \end{equation} are independent copies of \begin{equation} (X_1, X_2, \dots, X_n) \sim P_X. \end{equation}

Observe that $\hat{\mathbb{E}}[\varphi(X_1, X_2, \dots, X_n)]$ is itself a random variable that depends on the $M$ copies. Once you realize all the copies, you get a numerical estimate of the expectation.
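
For concreteness, here is a minimal sketch of this estimator in Python/NumPy. The choices of $P_X$ (independent standard normal coordinates) and of $\varphi$ (the coordinate-wise maximum) are hypothetical, made only so the code runs:

```python
import numpy as np

rng = np.random.default_rng(42)

M, n = 10_000, 3  # M independent copies of an n-dimensional random vector

# Hypothetical choice of phi: the maximum of the coordinates.
def phi(x):
    return np.max(x, axis=-1)

# Each row is one independent copy (X_1^{(m)}, ..., X_n^{(m)}) ~ P_X.
samples = rng.standard_normal((M, n))

# Monte Carlo estimator: average phi over the M independent copies.
estimate = np.mean(phi(samples))
print(estimate)  # estimates E[phi(X_1, ..., X_n)]
```

Rerunning with a different seed gives a different estimate, which is exactly the sense in which the estimator is a random variable; increasing $M$ shrinks that sampling variability.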

user144410
  • According to this terminology, is $X'$ being an independent copy of $X$ the same as $X$ and $X'$ being i.i.d.? I think so, from your definition. – xyz Nov 03 '22 at 16:17