
If a sequence of random variables $X_1, X_2, \ldots$ converges in distribution to $X$, i.e. $X_n \rightarrow_d X$, does it follow that $$ \lim_{n \to \infty} E(X_n) = E(X)? $$

I know that convergence in distribution implies $E(g(X_n)) \to E(g(X))$ when $g$ is a bounded continuous function. Can we apply this property here?

Davide Giraudo
  • "Can we apply this property here?" No, because $g(\cdot)$ would be the identity function, which is not bounded. – leonbloy Jun 03 '12 at 15:59

2 Answers


With your assumptions, the best you can get is via Fatou's lemma: $$\mathbb{E}[|X|]\leq \liminf_{n\to\infty}\mathbb{E}[|X_n|]$$ (using the continuous mapping theorem to get $|X_n|\Rightarrow |X|$).
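(A step worth making explicit, since the $X_n$ need not live on a common probability space: Fatou's lemma requires almost sure convergence, which one can arrange via Skorokhod's representation theorem. There exist random variables $Y_n \stackrel{d}{=} X_n$ and $Y \stackrel{d}{=} X$, all on one probability space, with $Y_n \to Y$ almost surely, and then $$\mathbb{E}[|X|]=\mathbb{E}[|Y|]\leq \liminf_{n\to\infty}\mathbb{E}[|Y_n|]=\liminf_{n\to\infty}\mathbb{E}[|X_n|].$$ This anticipates the clarification in the comment below.)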

For a "positive" answer to your question: you need the sequence $(X_n)$ to be uniformly integrable: $$\lim_{\alpha\to\infty} \sup_n \int_{|X_n|>\alpha}|X_n|d\mathbb{P}= \lim_{\alpha\to\infty} \sup_n \mathbb{E} [|X_n|1_{|X_n|>\alpha}]=0.$$ Then, one gets that $X$ is integrable and $\lim_{n\to\infty}\mathbb{E}[X_n]=\mathbb{E}[X]$.

As a remark, to get uniform integrability of $(X_n)_n$ it suffices to have, for example, $$\sup_n \mathbb{E}[|X_n|^{1+\varepsilon}]<\infty\quad \text{for some }\varepsilon>0.$$
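A one-line check of this remark (added for completeness): on the event $\{|X_n|>\alpha\}$ one has $|X_n|\leq |X_n|^{1+\varepsilon}/\alpha^{\varepsilon}$, so $$\sup_n \mathbb{E}[|X_n|1_{|X_n|>\alpha}]\leq \frac{\sup_n\mathbb{E}[|X_n|^{1+\varepsilon}]}{\alpha^{\varepsilon}}\xrightarrow[\alpha\to\infty]{}0.$$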

G dos Reis
  • Clarification: my comment doesn't really answer the original question since Fatou is employed -- I've additionally assumed convergence of random variables. The original question only assumes convergence in distribution for which the random variables need not be defined on a common probability space. Thank you, Andrei V. Zorine (Lobachevsky University of Nizhni Novgorod, Russia) for pointing this out. – G dos Reis Jan 31 '23 at 17:25

Try $\mathrm P(X_n=2^n)=1/n$, $\mathrm P(X_n=0)=1-1/n$.
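A quick numerical illustration of this counterexample (an editorial sketch in Python/NumPy; the sample size and seed are arbitrary): the samples pile up at $0$, so $X_n\Rightarrow 0$ and $E(X)=0$, while the exact mean $2^n/n$ blows up.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

for n in (10, 20, 30):
    # X_n equals 2^n with probability 1/n, and 0 otherwise
    samples = rng.choice([2.0**n, 0.0], size=100_000, p=[1 / n, 1 - 1 / n])
    frac_zero = np.mean(samples == 0.0)   # P(X_n = 0) = 1 - 1/n -> 1
    exact_mean = 2.0**n / n               # E[X_n] = (1/n) * 2^n -> infinity
    print(f"n={n:2d}: P(X_n = 0) ~ {frac_zero:.4f}, E[X_n] = {exact_mean:.3e}")
```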

Did
  • Could you please give a bit more explanation? – wij Oct 10 '15 at 14:25
  • @WittawatJ. About what? Please explain your problem. – Did Oct 11 '15 at 23:57
  • So in the limit $X_n$ becomes a point mass at 0, so $\lim_{n\to\infty} E(X_n) = 0$. Then $E(X) = 0$. I don't see a problem? – Joseph Garvin Nov 18 '18 at 02:46
  • Answering my own question: $E(X_n) = (1/n)2^n + (1-1/n)\cdot 0 = 2^n/n$. Taking the limit, the numerator clearly grows faster, so $\lim_{n\to\infty} E(X_n)$ diverges to infinity. This raises the question, though, of whether there is an example where the limit does exist but still isn't equal to $E(X)$? – Joseph Garvin Nov 18 '18 at 02:53
  • @JosephGarvin Of course there is, replace $2^n$ by $7n$ in the example of this answer. – Did Nov 18 '18 at 10:13 (worked out below, after this thread)
  • I see. Am I right to think $E(\lim_{n\to\infty}X_n) = E(X)$ though? Since two variables with the same CDF must have the same expectation. – Joseph Garvin Nov 18 '18 at 20:53
  • @JosephGarvin Sorry but what is your assumption? That $X_n\to X$ in distribution or that $X_n\to X$ almost surely? 'Cause nobody assumed the latter while you seem to do... – Did Nov 18 '18 at 21:12
  • @Did the former. Expectation only depends on the CDF, which if $X_n \to X$ in distribution by definition means $X_n$ in the limit becomes a random variable with an equal CDF, right? – Joseph Garvin Nov 18 '18 at 21:14
  • @JosephGarvin You say you only assume the former -- but then you introduce $\lim\limits_{n\to\infty}X_n$, which assumes de facto the existence of the pointwise limit. So, no, not right at all, precisely for the reason explained in my previous comment. – Did Nov 18 '18 at 21:16
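To spell out the $7n$ variant from the comments (an editorial addition): take $\mathrm P(X_n=7n)=1/n$ and $\mathrm P(X_n=0)=1-1/n$. Then $$\mathbb{E}[X_n]=\frac{1}{n}\cdot 7n=7\quad\text{for every }n,$$ so $\lim_{n\to\infty}\mathbb{E}[X_n]=7$ exists, yet $\mathrm P(X_n=0)=1-1/n\to 1$ gives $X_n\Rightarrow 0$, hence $\mathbb{E}[X]=0\neq 7$.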