
If $E$ denotes the expected value of a random variable, is $E(X^{-1})=\frac{1}{E(X)}$? From the properties of the expected value it looks like this shouldn't be true. However, I have used it in a couple of exercises and it seemed to work out okay.

EDIT: For example, I would like to use this property to calculate $E\left(\frac{X^2}{X^2+Y^2} \right)$, where $X, Y$ are independent standard normally distributed random variables.

kubo

3 Answers


No, it's not. In particular, if the variable is strictly positive (and not almost surely constant), then, because $g(x)=1/x$ is convex on $(0,\infty)$, Jensen's inequality tells us that

$$E(1/X) > \frac{1}{E(X)}$$

In some cases (e.g. asymptotics) the equation can be taken as approximately true (namely, if the variance tends to zero) - see here.
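The strict inequality is easy to check numerically. A minimal Monte Carlo sketch (the choice of $X$ uniform on $[1,2]$ is my own, picked so that $X$ is strictly positive and non-constant): here $E(X)=3/2$, so $1/E(X)=2/3\approx 0.667$, while $E(1/X)=\ln 2\approx 0.693$.

```python
import random

# Assumed setup for illustration: X ~ Uniform(1, 2), strictly positive, non-constant.
random.seed(0)
samples = [random.uniform(1, 2) for _ in range(100_000)]

e_x = sum(samples) / len(samples)                    # estimate of E(X), ~1.5
e_inv = sum(1 / x for x in samples) / len(samples)   # estimate of E(1/X), ~ln 2

# Jensen's inequality for the convex g(x) = 1/x: E(1/X) > 1/E(X)
print(e_inv, 1 / e_x)
```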

leonbloy

If a random variable has an equal chance of being 1 or 2, the expected value is 1.5, so the reciprocal of the expected value is 2/3, but the expected value of the reciprocal is 3/4 (equal chance of being 1/2 or 1).
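For anyone who wants to verify this two-point example exactly rather than on paper, a quick sketch using exact rational arithmetic:

```python
from fractions import Fraction

# X is 1 or 2, each with probability 1/2.
p = Fraction(1, 2)

e_x = p * 1 + p * 2                                       # E(X) = 3/2
recip_of_mean = 1 / e_x                                   # 1/E(X) = 2/3
mean_of_recip = p * Fraction(1, 1) + p * Fraction(1, 2)   # E(1/X) = 3/4

print(recip_of_mean, mean_of_recip)  # 2/3 vs 3/4 -- not equal
```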

Paul

One way to tell that this should not hold is that it's quite easy to "break" one side while leaving the other intact. For instance:

  1. If $X = \pm 1$ with equal probability, then $\mathbb E[X^{-1}]$ is $0$, but $1/\mathbb E[X]$ is undefined.

  2. If $X \in \{0, 1\}$ with equal probability, then $\mathbb E[X^{-1}]$ is undefined, while $1 / \mathbb E[X]$ is $2$.
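The two cases above can be sketched in code; exact rationals make the defined quantities unambiguous, and the undefined ones show up as a division by zero:

```python
from fractions import Fraction

p = Fraction(1, 2)

# Case 1: X = +1 or -1 with equal probability.
e_inv_1 = p * Fraction(1, 1) + p * Fraction(-1, 1)  # E[1/X] = 0
e_x_1 = p * 1 + p * (-1)                            # E[X] = 0, so 1/E[X] is undefined
# 1 / e_x_1  would raise ZeroDivisionError

# Case 2: X = 0 or 1 with equal probability.
e_x_2 = p * 0 + p * 1                               # E[X] = 1/2, so 1/E[X] = 2
# E[1/X] is undefined here, since 1/0 does not exist.

print(e_inv_1, 1 / e_x_2)
```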