27

Let $X$ and $Y$ be two independent identically distributed random variables with finite expectation $\Bbb{E}(X) = \Bbb{E}(Y) < \infty$. Prove that

$$\Bbb{E}(|X-Y|) \le \Bbb{E}(|X+Y|)$$

I think that this inequality may somehow follow from Jensen's inequality, but I have not managed to apply it here. Or maybe it is worth considering the expression $|x+y|-|x-y|$ and making use of some of its properties?

I would be interested to see a proof of this fact or any ideas that may help here. Any suggestions would be greatly appreciated.

Ramil
  • 1,882

3 Answers

36

By a simple $u$-substitution, we find that

$$ \int_{-\infty}^{\infty} \frac{1-\cos(at)}{t^2} \, \mathrm{d}t = C|a|, \tag{1}$$

where $C = \int_{-\infty}^{\infty} \frac{1-\cos t}{t^2} \, \mathrm{d}t$ is a positive, finite constant. (It can be shown that $C = \pi$, although the exact value of $C$ plays no role in this solution.)
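To spell out the substitution: for $a \neq 0$, putting $u = at$ (so that $\mathrm{d}t = \mathrm{d}u/|a|$ after accounting for the orientation of the limits when $a < 0$) gives

$$ \int_{-\infty}^{\infty} \frac{1-\cos(at)}{t^2} \, \mathrm{d}t = |a| \int_{-\infty}^{\infty} \frac{1-\cos u}{u^2} \, \mathrm{d}u = C|a|, $$

while both sides of $\text{(1)}$ vanish when $a = 0$.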

Note that the integrand of $\text{(1)}$ is non-negative. Taking advantage of this fact, we can apply Tonelli's theorem to find that, for any real-valued random variable $Z$, the following identity holds:

$$ \Bbb{E}[|Z|] = \frac{1}{C} \Bbb{E}\left[ \int_{-\infty}^{\infty} \frac{1-\cos(Zt)}{t^2} \, \mathrm{d}t \right] = \frac{1}{C} \int_{-\infty}^{\infty} \frac{1-\Bbb{E}[\cos(Zt)]}{t^2} \, \mathrm{d}t $$

Therefore

\begin{align*} \Bbb{E}[|X+Y| - |X-Y|] &= \frac{1}{C} \int_{-\infty}^{\infty} \frac{\Bbb{E}[\cos((X-Y)t)-\cos((X+Y)t)]}{t^2} \, \mathrm{d}t \\ &= \frac{1}{C} \int_{-\infty}^{\infty} \frac{\Bbb{E}[2\sin(Xt)\sin(Yt)]}{t^2} \, \mathrm{d}t \\ &= \frac{1}{C} \int_{-\infty}^{\infty} \frac{2\Bbb{E}[\sin(Xt)]^2}{t^2} \, \mathrm{d}t \\ &\geq 0. \tag{2} \end{align*}
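Here the second equality uses the product-to-sum identity, and the third uses independence together with $X \stackrel{d}{=} Y$:

$$ \cos((X-Y)t)-\cos((X+Y)t) = 2\sin(Xt)\sin(Yt), \qquad \Bbb{E}[\sin(Xt)\sin(Yt)] = \Bbb{E}[\sin(Xt)]\,\Bbb{E}[\sin(Yt)] = \Bbb{E}[\sin(Xt)]^2. $$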

Moreover, equality holds in $\text{(2)}$ if and only if $\Bbb{E}[\sin(Xt)] = 0$ for all $t$. This means that the characteristic function $\varphi_X(t) = \Bbb{E}[e^{itX}]$ is real-valued, which in turn is equivalent to the symmetry condition $X \stackrel{d}{=} -X$.


Remark. Using a similar argument, we can show that:

Theorem. Let $p \in (0, 2]$, and let $X$ and $Y$ be i.i.d. $L^p$-random variables. Then

$$\mathbb{E}[|X+Y|^p] \geq \mathbb{E}[|X-Y|^p]. $$

Moreover, the equality holds if and only if $X \stackrel{d}{=} -X$.
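For readers who want to see the inequality numerically, here is a minimal Monte Carlo sketch; NumPy, the shifted exponential distribution, the seed, and the sample size are arbitrary choices made purely for illustration, not part of the argument above.

```python
import numpy as np

# Minimal Monte Carlo sanity check of E|X+Y|^p >= E|X-Y|^p for i.i.d. X, Y
# and p in (0, 2].  The shifted exponential below is just an arbitrary
# asymmetric distribution chosen for illustration.
rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(scale=1.0, size=n) - 0.3  # i.i.d. sample of X
y = rng.exponential(scale=1.0, size=n) - 0.3  # independent i.i.d. sample of Y

for p in (0.5, 1.0, 1.5, 2.0):
    lhs = np.mean(np.abs(x + y) ** p)  # estimate of E|X+Y|^p
    rhs = np.mean(np.abs(x - y) ** p)  # estimate of E|X-Y|^p
    print(f"p={p}:  E|X+Y|^p ~ {lhs:.4f}  >=  E|X-Y|^p ~ {rhs:.4f}")
```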

Sangchul Lee
  • 167,468
  • 1
    Wow! You did it again! Could you please reveal how on earth you came up with such a solution? It doesn't seem to be quite natural. What is the intuition behind such solutions? – Ramil Apr 25 '17 at 13:26
  • 2
    Can this argument be extended to $E[|X+Y|^p] -E[|X-Y|^p] $ ? – Boby Apr 25 '17 at 13:31
  • 2
    @Ramil, As in the previous problem, I thought that it would be nice to have a representation that allows one to split $|X-Y|$. Among integrals that I know, $\text{(1)}$ seemed useful for my purpose, so I tried it and luckily it worked :) – Sangchul Lee Apr 25 '17 at 13:32
  • 4
    @Boby, The argument readily extends to $p \in (0, 2)$ from the identity $$\int_{0}^{\infty} \frac{1-\cos(at)}{t^{1+p}} \, dt = \frac{\pi}{2 \Gamma(1+p)\sin(\pi p/2)} |a|^p. $$ The cases $p = 0, 2$ are just algebra. For $p > 2$, I am not sure whether the inequality even remains true. – Sangchul Lee Apr 25 '17 at 13:39
  • 3
    Great. Thank you. It is interesting that it holds for $p<1$, since $|a|^p$ for $p<1$ no longer induces a norm. – Boby Apr 25 '17 at 13:46
  • 2
    @Boby It is not true in some cases. Say, for $p=4$ we have $(X+Y)^4 - (X-Y)^4 = 8X^3Y + 8XY^3$, so $\Bbb{E}((X+Y)^4 - (X-Y)^4) = 16\Bbb{E}(X^3)\Bbb{E}(X)$ and, for example, for $X, Y$ that are always negative the expectation will be negative. – Ramil Apr 25 '17 at 13:50
  • 1
    @Boby, You're right. It is not even entirely clear to me what kind of 'balancing' in the joint distribution of $X-Y$ and $X+Y$ makes this happen. – Sangchul Lee Apr 25 '17 at 13:59
  • I suggest we put this as a question. What do you think? – Boby Apr 25 '17 at 14:00
  • @Boby, Sure, it is always nice to see different perspectives. – Sangchul Lee Apr 25 '17 at 14:06
  • @Ramil If $P(X\leqslant0)=1$ then $E(X^3)E(X)\geqslant0$, hence this is not a counterexample. But it suffices that $E(X)$ and $E(X^3)$ have opposite signs, for example if $P(X=-1)=\frac34$, $P(X=2)=\frac14$. – Did May 08 '17 at 19:57
  • @Did Yes, you are right, thanks. – Ramil May 08 '17 at 20:53
10

Here's another argument. It doesn't seem to generalize to $p$ norms, but perhaps it is instructive anyway.

For independent variables $X,Y$ with finite expectation we can write

\begin{align*} \mathbb E|X-Y| &=\int_{-\infty}^\infty \mathbb P[X\leq t< Y]+\mathbb P[Y\leq t<X] dt\\ &=\int_{-\infty}^\infty F_X(t)(1-F_Y(t))+F_Y(t)(1-F_X(t)) dt\tag{1} \end{align*}

where $F_Z$ denotes the cumulative distribution function $F_Z(t)=\mathbb P[Z\leq t]$ of a random variable $Z.$
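To see where (1) comes from: for any real numbers $x$ and $y$, $$|x-y|=\int_{-\infty}^\infty \mathbf 1[x\leq t<y]+\mathbf 1[y\leq t<x]\, dt,$$ since for $x\neq y$ exactly one of the indicators equals $1$ on an interval of length $|x-y|$ (and both vanish when $x=y$). Taking expectations with Tonelli's theorem (the integrand is nonnegative) and using independence, $\mathbb P[X\leq t<Y]=\mathbb P[X\leq t]\,\mathbb P[Y>t]=F_X(t)(1-F_Y(t))$, yields both lines of (1).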

In the case at hand, $X$ and $Y$ are i.i.d., so $F_{-Y}(t)=1-F_X(-t)$ for all but countably many $t$ (namely, wherever $-t$ is not an atom of $X$), which is enough for the integrals below. Using $-Y$ instead of $Y$ in (1) gives

$$\mathbb E|X+Y|=\int_{-\infty}^\infty F_X(t)F_X(-t)+(1-F_X(-t))(1-F_X(t)) dt\tag{2}$$

We can get a comparable integrand for $\mathbb E|X-Y|$ by substituting $t$ for $-t$ in the final term in (1), and using $F_Y=F_X$: $$ \mathbb E|X-Y|=\int_{-\infty}^\infty F_X(t)(1-F_X(t))+F_X(-t)(1-F_X(-t)) dt\tag{3} $$

Writing $a=F_X(t)$ and $b=F_X(-t),$ clearly $((1-a)-b)((1-b)-a)=(1-a-b)^2\geq 0,$ so $a(1-a)+b(1-b)\leq ab+(1-a)(1-b).$ Integrating over $t$ and applying (3) and (2) gives $\mathbb E|X-Y|\leq \mathbb E|X+Y|.$ The equality case is when $F_X(-t)+F_X(t)=1$ a.e., which is when $X$ is symmetric about zero.
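(To check the algebra step above: expanding the product gives

$$((1-a)-b)((1-b)-a)=ab+(1-a)(1-b)-a(1-a)-b(1-b),$$

so its nonnegativity rearranges to exactly $a(1-a)+b(1-b)\leq ab+(1-a)(1-b)$.)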

Dap
  • 25,286
  • Very instructive solution indeed. Can you show me how to derive this formula for the expected value $\mathbb{E}|X-Y|= \int_{-\infty}^{\infty} \mathbb{P}[X\leq t < Y] + \mathbb{P}[Y\leq y < X] dt$ ? – Curtis74 Apr 15 '20 at 20:58
  • 2
    Curtis74: You need to replace $y$ by $t$. The formula is clear when $X,Y$ are constant, so it holds if you condition on the values of $X,Y$ on both sides. Now take expectations using Tonelli on the RHS. – Yuval Peres Mar 13 '21 at 01:11
2

Way late to the party, but here's another approach. First verify the identity $$ |x+y|-|x-y|=2[\min(x^+,y^+)+\min(x^-,y^-)-\min(x^+,y^-)-\min(x^-,y^+)].\tag1$$

Next, note that for nonnegative and independent $U$ and $V$: $$E\min(U,V)=\int_0^\infty P(\min(U,V)>t)\,dt=\int_0^\infty P(U>t)P(V>t)\,dt.\tag2 $$

Apply (2) four times to find, when $X$ and $Y$ are iid, $$ \begin{aligned} E\min(X^+,Y^+)&=\int_0^\infty P(X^+>t)P(Y^+>t)\,dt=\int_0^\infty P(X>t)P(X>t)\,dt\\ E\min(X^-,Y^-)&=\int_0^\infty P(X^->t)P(Y^->t)\,dt=\int_0^\infty P(-X>t)P(-X>t)\,dt\\ E\min(X^+,Y^-)&=\int_0^\infty P(X^+>t)P(Y^->t)\,dt=\int_0^\infty P(X>t)P(-X>t)\,dt\\ E\min(X^-,Y^+)&=\int_0^\infty P(X^->t)P(Y^+>t)\,dt=\int_0^\infty P(-X>t)P(X>t)\,dt \end{aligned}$$

Put everything together: $$E|X+Y|-E|X-Y|=2\int_0^\infty[P(X>t)-P(-X>t)]^2\,dt.$$ This final quantity is nonnegative, and equals zero iff $X$ has symmetric distribution about $0$.
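As a quick numerical spot-check of identity $(1)$, one can test it at random points; the sketch below assumes NumPy, and the standard normal sample is an arbitrary choice made purely for illustration.

```python
import numpy as np

# Numerical spot-check of identity (1):
#   |x+y| - |x-y| = 2[min(x+,y+) + min(x-,y-) - min(x+,y-) - min(x-,y+)]
# where x+ = max(x, 0) and x- = max(-x, 0).
rng = np.random.default_rng(1)
x, y = rng.normal(size=10**5), rng.normal(size=10**5)

pos = lambda z: np.maximum(z, 0.0)   # positive part z+
neg = lambda z: np.maximum(-z, 0.0)  # negative part z-

lhs = np.abs(x + y) - np.abs(x - y)
rhs = 2 * (np.minimum(pos(x), pos(y)) + np.minimum(neg(x), neg(y))
           - np.minimum(pos(x), neg(y)) - np.minimum(neg(x), pos(y)))
print(np.allclose(lhs, rhs))  # prints True
```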

grand_chat
  • 38,951