Consider the case when $n=1$. When we say that $X'$ is an independent copy of $X$, we mean that the distribution of $X'$ is the same as the distribution of $X$ AND that $X$ and $X'$ are independent. (You can use any other symbol in place of $X'$.) This terminology is typically used when you want to repeat the exact same statistical experiment a number of times independently, which is why people tend to reuse the same symbol with a prime or a tilde.
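As a small illustrative sketch (assuming, purely for the example, that $X$ is standard normal and using NumPy), an independent copy is obtained by simply drawing again from the same distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# X and X_prime have the same distribution (standard normal, as an example)
# and come from separate draws, so they are independent.
X = rng.standard_normal()
X_prime = rng.standard_normal()
```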
For example, one use of Monte Carlo simulation is to approximate the expectation of a function $\varphi(X_1, X_2, \dots, X_n)$ of a joint random variable
\begin{equation}
(X_1, X_2, \dots, X_n) \sim P_X.
\end{equation}
To do this, we need realizations of a number of independent copies of $(X_1, X_2, \dots, X_n)$. In your question you only have two copies; in Monte Carlo methods, we should use as many copies as possible. The estimator of the expectation is then given by
\begin{equation}
\hat{\mathbb{E}}[\varphi(X_1, X_2, \dots, X_n)] := \frac{1}{M} \sum_{m=1}^M \varphi(X^{(m)}_1, X^{(m)}_2, \dots, X^{(m)}_n),
\end{equation}
in which the vectors
\begin{equation}
(X^{(m)}_1, X^{(m)}_2, \dots, X^{(m)}_n), \quad m \in \{1, \dots, M\},
\end{equation}
are independent copies of the joint random variable
\begin{equation}
(X_1, X_2, \dots, X_n) \sim P_X.
\end{equation}
Observe that
\begin{equation}
\hat{\mathbb{E}}[\varphi(X_1, X_2, \dots, X_n)]
\end{equation}
is a random variable that depends on the $M$ copies. Once you realize all the copies, you get an estimate of the expectation.
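Here is a minimal sketch of this estimator in Python/NumPy, assuming (purely for illustration) that $P_X$ is a product of $n$ independent standard normals and that $\varphi$ is the maximum of the components; the names `monte_carlo_expectation`, `sample_joint`, and `phi` are hypothetical choices, not anything fixed by the problem:

```python
import numpy as np

def monte_carlo_expectation(phi, sample_joint, M):
    """Average phi over M independent copies drawn by sample_joint."""
    return np.mean([phi(sample_joint()) for _ in range(M)])

rng = np.random.default_rng(42)
n = 5          # dimension of the joint random variable (X_1, ..., X_n)
M = 100_000    # number of independent copies

# Hypothetical choice of P_X: n i.i.d. standard normals.
sample_joint = lambda: rng.standard_normal(n)

# Hypothetical choice of phi: the maximum of the components.
phi = np.max

estimate = monte_carlo_expectation(phi, sample_joint, M)
print(estimate)  # approximates E[max(X_1, ..., X_n)]
```

Running the script twice with different seeds gives two different estimates, which is exactly the point made above: the estimator is itself a random variable, and each set of realized copies yields one estimate.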