I'm trying to show that if $\lim_{n\rightarrow\infty}E(|X_n-X|^p)=0$ for some $p>0$ then $X_n\rightarrow X$ in probability.

I tried a few directions.

  1. By Jensen's inequality, $E(|X_n-X|^p)\geq (E|X_n-X|)^p$, because $x\mapsto x^p$ is convex. But this only holds for $p\geq 1$, which is why I think this approach is not helpful.

  2. By Markov's inequality, $P\{|X_n-X|\geq\epsilon\}\leq\frac{E|X_n-X|}{\epsilon}$. If $p=1$, I get what I need, since the right-hand side goes to zero as $n\to\infty$.

Any ideas on how to proceed?

Whyka

2 Answers


Markov's inequality is a good idea. A more general version of Markov's inequality states that

$$\mathbb{P}(|Y| \geq \epsilon) \leq \frac{\mathbb{E}(|Y|^p)}{\epsilon^p}$$ for any $p > 0$ and any random variable $Y$.

To prove this inequality, note that $\frac{|Y|^p}{\epsilon^p} \geq 1$ on the set $\{|Y| \geq \epsilon\}$ for any $p > 0$, so

$$\begin{align*} \mathbb{P}(|Y| \geq \epsilon) = \int 1_{\{|Y| \geq \epsilon\}} \, d\mathbb{P} &\leq \int \frac{|Y|^p}{\epsilon^p} 1_{\{|Y| \geq \epsilon\}} \, d\mathbb{P} \\ &\leq \frac{1}{\epsilon^p} \int |Y|^p \, d\mathbb{P}. \end{align*}$$
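Applying this with $Y = X_n - X$ then finishes the problem (a one-line sketch):

$$\mathbb{P}(|X_n - X| \geq \epsilon) \leq \frac{\mathbb{E}(|X_n - X|^p)}{\epsilon^p} \xrightarrow{n \to \infty} 0 \quad \text{for every } \epsilon > 0,$$

which is exactly the definition of $X_n \to X$ in probability.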

saz

You can actually use Markov's inequality directly:

$$P(|X_n-X| \ge \epsilon) = P(|X_n-X|^p \ge \epsilon^p) \le \frac{E|X_n-X|^p}{\epsilon^p} \to 0$$

as $n \to \infty$. (The first equality uses that $x \mapsto x^p$ is strictly increasing on $[0,\infty)$ for $p > 0$, so no restriction to $p \geq 1$ is needed.)
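To see the bound in action numerically, here is a short simulation. This is only an illustrative sketch: the noise model $X_n - X \sim \mathrm{Unif}[-1/n,\,1/n]$ and the values of $p$, $\epsilon$, and the sample size are assumptions chosen for the demo, not part of the answers above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: X_n - X ~ Uniform[-1/n, 1/n], so
# E|X_n - X|^p = (1/n)^p / (p + 1) -> 0 for any fixed p > 0.
p = 0.5            # note: p < 1 works fine; Jensen is not needed
eps = 0.1
num_samples = 200_000

def tail_prob_and_bound(n):
    """Estimate P(|X_n - X| >= eps) and the Markov bound E|X_n - X|^p / eps^p."""
    z = rng.uniform(-1.0 / n, 1.0 / n, size=num_samples)   # samples of X_n - X
    tail = np.mean(np.abs(z) >= eps)
    bound = np.mean(np.abs(z) ** p) / eps ** p
    return tail, bound

for n in (1, 10, 100):
    tail, bound = tail_prob_and_bound(n)
    # Markov's inequality also holds exactly for the empirical distribution,
    # since 1{|z| >= eps} <= |z|^p / eps^p pointwise.
    assert tail <= bound + 1e-12
    print(f"n={n:3d}  P(|X_n-X|>=eps) ~ {tail:.4f}  Markov bound ~ {bound:.4f}")
```

Both the empirical tail probability and the Markov bound shrink as $n$ grows, matching the argument above.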

angryavian