
If the $X_i$ are i.i.d. random variables, is the following statement true?

$$E\left[\frac{1}{\sum_{i=1}^{n}X_i}\right] = \frac{1}{\sum_{i=1}^{n} E\left[X_i \right]}$$

Here $E\left[X\right]$ is the expected value of a random variable $X$

Edit - I was thinking that if each $X_i$ corresponds to the result of an independent random experiment, will the given equation be true or false? I intuitively feel that if we perform these $n$ experiments an infinite number of times, then the denominator will be very close to $\sum_{i=1}^{n}E[X_i]$ most of the time.

  • http://math.stackexchange.com/questions/248472/expectation-on-1-x – Benjamin Lindqvist Mar 03 '16 at 11:57

4 Answers


Take two i.i.d. fair-coin-like variables with $\mathbb{P}[X_i=2]=\mathbb{P}[X_i=1]=1/2$ for $i=1,2$. Then $$\mathbb{E}\left[\frac{1}{X_1+X_2}\right]=\frac{1}{4}\times \frac{1}{4}+\frac{1}{4}\times\frac{1}{2}+\frac{2}{4}\times \frac{1}{3}=\frac{17}{48}\ ,$$ while $$ \mathbb{E}[X_1+X_2]=\frac{1}{4}\times 4+\frac{1}{4}\times 2+\frac{2}{4}\times 3=3\ , $$ so $\mathbb{E}\left[1/(X_1+X_2)\right]=17/48\ne 1/3=1/\mathbb{E}[X_1+X_2]$.
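The arithmetic in this counterexample can be verified with exact rational arithmetic; a minimal sketch (the enumeration below is mine, not part of the original answer):

```python
from fractions import Fraction

# Exact check of the coin example above:
# X1 and X2 each take the values 1 and 2 with probability 1/2.
half = Fraction(1, 2)
dist = {1: half, 2: half}

# E[1/(X1 + X2)]: average 1/(x1 + x2) over all four equally likely outcomes.
e_inv_sum = sum(p1 * p2 * Fraction(1, x1 + x2)
                for x1, p1 in dist.items()
                for x2, p2 in dist.items())

# E[X1 + X2]: average x1 + x2 over the same outcomes.
e_sum = sum(p1 * p2 * (x1 + x2)
            for x1, p1 in dist.items()
            for x2, p2 in dist.items())

print(e_inv_sum)  # 17/48
print(1 / e_sum)  # 1/3
```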


Your statement is false even for $n=1$. Take, for instance, on $\{1,\dotsc, k\}$ the variable $X(j) = j$ for $j = 1,\dotsc, k$, with uniform probability measure $\mathbb{P}(\{j\}) = 1/k$. Then $$ E\left [ \frac{1}{X} \right ] = \sum_{j=1}^k \frac{1}{k} \frac{1}{j} = \frac{1}{k} \sum_{j=1}^k \frac{1}{j} $$ while $$ \frac{1}{E[X]} = \frac{1}{\frac{1}{k}\sum_{j=1}^k j} = \frac{2k}{k(k+1)} = \frac{2}{k+1} $$ which differ for every $k>1$.
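The two quantities can be compared exactly for a concrete $k$; a sketch I added, with $k=5$ as an arbitrary illustrative choice:

```python
from fractions import Fraction

# Exact comparison for one concrete k (k = 5 is an illustrative choice).
k = 5
e_inv = sum(Fraction(1, k) * Fraction(1, j) for j in range(1, k + 1))  # E[1/X]
inv_e = Fraction(2, k + 1)                                             # 1/E[X]

print(e_inv)  # 137/300
print(inv_e)  # 1/3
```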

However, if the question is $$ \lim_{n \to \infty} E \left [ \frac{1}{\sum_{i=1}^n X_i} \right ] = \lim_{n \to \infty} \frac{1}{E \left [ \sum_{i=1}^n X_i \right ]} $$ I do not know.

Hugo
  • Re the question at the end, if $(X_i)$ is i.i.d. with $E(X_1)\ne0$ then the limit on the RHS is zero hence the identity holds as soon as some domination ensures that the limit on the LHS is zero as well. – Did Mar 03 '16 at 13:58

Your statement is false, as pointed out by other answers. But your intuition is right, in the sense that asymptotically (under some conditions) the equation is true. In general, for any "well behaved" function $Y=g(Z)$ we can do a Taylor expansion around the mean ($E[Z]=\mu_Z$) and take expectations; we get:

$$Y \approx g(\mu_Z) + g'(\mu_Z) (Z-\mu_Z) + \frac{1}{2}g''(\mu_Z) (Z-\mu_Z)^2 +\cdots \tag{1}$$

$$\mu_Y \approx g(\mu_Z) + \frac{1}{2!}g''(\mu_Z) \; m_{2,Z} + \frac{1}{3!} g'''(\mu_Z) \; m_{3,Z} + \cdots \tag{2}$$

where $m_{k,Z}$ is the $k$-th central moment of $Z$.

In your case, define $g(Z)=1/Z$ and $Z=(X_1+X_2 +\cdots+X_n)/n$. If (and only if!) the $X_i$ have finite moments, only the first term in $(2)$ survives for large $n$, and

$$E\left[\frac{n}{X_1+X_2 +\cdots+X_n}\right] \approx \frac{1}{E\left[\frac{X_1+X_2 +\cdots+X_n}{n}\right]}=\frac{1}{E[X_i]}$$

Or

$$E\left[\frac{1}{X_1+X_2 +\cdots+X_n}\right] \approx \frac{1}{n E[X_i]}$$
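This asymptotic approximation can be checked by Monte Carlo; in the sketch below (mine, not part of the original answer) the distribution $\mathrm{Uniform}(1,2)$ and the sample sizes are illustrative assumptions:

```python
import random

# Monte Carlo check of E[1/(X1 + ... + Xn)] ~ 1/(n E[X_i]) for large n.
# X_i ~ Uniform(1, 2), so E[X_i] = 3/2 (illustrative choice of distribution).
random.seed(0)
n, trials = 500, 2000

# Average 1/(X1 + ... + Xn) over many independent replications.
est = sum(1.0 / sum(random.uniform(1, 2) for _ in range(n))
          for _ in range(trials)) / trials

print(est)              # Monte Carlo estimate of E[1/(X1 + ... + Xn)]
print(1.0 / (n * 1.5))  # 1/(n E[X_i]); the two should agree closely
```

With $n=500$ the sum concentrates tightly around $n\,E[X_i]=750$, so the two printed values agree to several decimal places, illustrating the answer's point that the equation holds asymptotically even though it fails for any fixed small $n$.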

leonbloy
  • How rigorous is this? Because it seems if we truncate the Taylor series at any order greater than $1$ the remainder may not be well controlled? Unless you are proposing to view the expansion as an infinite series where each term is smaller than the previous one with respect to $n$? – sonicboom Mar 19 '21 at 10:09
  • If we truncate the Taylor series at, say for example, 2nd order, the remainder will be of the form $\frac{1}{6}E[g'''(\zeta)(Z-\mu_Z)^3]$ for $\zeta = \mu_Z + c(Z - \mu_Z)$ for $c \in (0,1)$. Here $g'''(x) = -6/x^4$ which is unbounded. It seems this could be problematic because usually we need $g'''(x)$ bounded to control the remainder $|E[g'''(\zeta)(Z-\mu_Z)^3]| \le E[|g'''(\zeta)(Z-\mu_Z)^3|] \le C E[|(Z-\mu_Z)^3|]$. – sonicboom Mar 19 '21 at 10:13

If it were true, then for the special case $n=1$ it would also be true that: $$\mathbb E\left(\frac1{X}\right)=\frac1{\mathbb EX}$$

There are plenty of counterexamples.

drhab