
The following result must be well known, but it is instructive to give it here in view of the recent posts "$Y= X+N$: what is ${\rm E}(X|Y=y)$?" and https://mathoverflow.net/questions/47168/ex-1-x-1-x-2-where-x-i-are-integrable-independent-infinitely-divisib/47204#47204. It is also an interesting exercise in its own right.

Suppose that $X_1$ and $X_2$ are independent ${\rm Gamma}(c_i,\lambda)$ rv's, meaning that $X_i$, $i=1,2$, has density function $f_{X_i } (x) = \lambda ^{c_i } {\rm e}^{ - \lambda x} x^{c_i - 1} /\Gamma (c_i )$, $x > 0$ ($c_i$ are positive constants, $\Gamma$ is the gamma function). Show that $$ {\rm E}(X_1 | X_1 + X_2 = z) = \frac{{c_1 }}{{c_1 + c_2 }} z. $$
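As a quick sanity check (not part of the problem statement), one can approximate the conditional expectation by simulation, conditioning on $X_1+X_2$ falling in a narrow window around $z$; all parameter values below are arbitrary illustrative choices.

```python
# Rough Monte Carlo illustration of E(X1 | X1 + X2 = z) = c1*z/(c1+c2).
# The parameter values (c1, c2, lam, z, eps, n) are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
c1, c2, lam, z, eps, n = 2.0, 3.0, 1.5, 4.0, 0.05, 10**7

x1 = rng.gamma(shape=c1, scale=1/lam, size=n)
x2 = rng.gamma(shape=c2, scale=1/lam, size=n)
s = x1 + x2

mask = np.abs(s - z) < eps                    # approximate the event {X1 + X2 = z}
print(x1[mask].mean(), c1 * z / (c1 + c2))    # the two numbers should be close
```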

Shai Covo
  • Of course, you are not allowed to apply the general principle indicated in the aforementioned posts (concerning infinitely divisible distributions). – Shai Covo Nov 24 '10 at 10:17

2 Answers


First of all, we note that $X_1 + X_2 \sim {\rm Gamma}(c_1 + c_2 , \lambda)$. Given that $X_1 + X_2 = z$, $X_1$ cannot exceed $z$. We can thus find ${\rm E}(X_1|X_1+X_2=z)$ as follows. $$ {\rm E}(X_1|X_1 + X_2 = z) = \int_{0}^z {xf_{X_1|X_1 + X_2} (x|z)\,{\rm d}x} = \int_{0}^z {x\frac{{f_{X_1} (x)f_{X_1 + X_2|X_1} (z|x)}}{{f_{X_1 + X_2} (z)}}\, {\rm d}x}. $$ Noting that $f_{X_1 + X_2|X_1} (z|x) = f_{X_2} (z-x)$, it follows after some algebra and the change of variable $x = zt$ that the right-hand integral equals $$ z\frac{{\Gamma (c_1 + c_2 )}}{{\Gamma (c_1 )\Gamma (c_2 )}}\int_0^1 {t^{c_1 } (1 - t)^{c_2 - 1} \,{\rm d}t} = z\frac{{\Gamma (c_1 + c_2 )}}{{\Gamma (c_1 )\Gamma (c_2 )}}{\rm B}(c_1 + 1,c_2 ), $$ where ${\rm B}(a,b) = \int_0^1 {t^{a - 1} (1 - t)^{b - 1} \,{\rm d}t}$ is the beta function. Finally, from the identity ${\rm B}(a,b) = \frac{{\Gamma (a)\Gamma (b)}}{{\Gamma (a + b)}}$ followed by $\Gamma (p+1) = p \Gamma (p)$, we obtain $$ {\rm E}(X_1|X_1 + X_2 = z) = \frac{{c_1 }}{{c_1 + c_2 }} z. $$
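A numerical sanity check of this computation (not part of the original answer): the conditional-expectation integral above can be evaluated directly with SciPy and compared against $c_1 z/(c_1+c_2)$; the particular values of $c_1$, $c_2$, $\lambda$, $z$ below are arbitrary.

```python
# Evaluate E(X1 | X1 + X2 = z) = int_0^z x f_{X1}(x) f_{X2}(z-x) / f_{X1+X2}(z) dx
# numerically and compare with c1*z/(c1+c2). Parameter values are arbitrary.
from scipy import stats, integrate

c1, c2, lam, z = 2.5, 4.0, 1.7, 3.0

f1 = stats.gamma(a=c1, scale=1/lam).pdf        # density of X1
f2 = stats.gamma(a=c2, scale=1/lam).pdf        # density of X2
fS = stats.gamma(a=c1 + c2, scale=1/lam).pdf   # density of X1 + X2

val, _ = integrate.quad(lambda x: x * f1(x) * f2(z - x) / fS(z), 0, z)
print(val, c1 * z / (c1 + c2))                 # the two numbers should agree
```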

Shai Covo

Another way to get this result is to first recall that $S := X_1+X_2$ and $R := X_2/X_1$ are independent. [This is well known and can be proved by the usual method of derived distributions for two random variables: the mapping $(X_1,X_2) \to (R,S)$ is one-to-one and easily inverted. It is a standard example or exercise in many introductory probability and mathematical statistics books.]

Then, noting that $\frac{X_1}{X_1+X_2} = \frac{1}{1+R}$,

$$ {\mathrm E}\{ X_1 \mid S\} = {\mathrm E}\Big\{ \frac{S}{1+R} \,\Big|\, S\Big\} = S\,{\mathrm E}\Big\{ \frac{1}{1+R} \,\Big|\, S\Big\} = S\,{\mathrm E}\frac{1}{1+R}. \tag{1}$$

The last equality in (1) follows from independence. Also, [on repeating the argument, or simply by taking unconditional expectations in (1)],

$$ \frac{c_1}{\lambda} = {\mathrm E}X_1 = {\mathrm E}\frac{S}{1+R} = {\mathrm E}\frac{1}{1+R}\,{\mathrm E}S = {\mathrm E}\frac{1}{1+R}\cdot\frac{c_1+c_2}{\lambda},$$

so

$${\mathrm E}\frac{1}{1+R} = \frac{c_1}{c_1+c_2}, \tag{2}$$

which can be plugged into (1) to get the result.
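For what it's worth, relation (2) is easy to check by simulation; here is a minimal sketch (the values of $c_1$, $c_2$, $\lambda$ and the sample size are arbitrary, and the rate $\lambda$ plays no role since $R$ is scale-invariant).

```python
# Monte Carlo check of (2): E[1/(1+R)] with R = X2/X1 should be close to c1/(c1+c2).
# Parameter values are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
c1, c2, lam, n = 2.5, 4.0, 1.7, 10**6

x1 = rng.gamma(shape=c1, scale=1/lam, size=n)
x2 = rng.gamma(shape=c2, scale=1/lam, size=n)
r = x2 / x1

print(np.mean(1.0 / (1.0 + r)), c1 / (c1 + c2))  # should agree to a few decimals
```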

[By the way, (2) says that

$${\mathrm E}\frac{X_1}{X_1+X_2} = \frac{{\mathrm E}X_1}{{\mathrm E}(X_1+X_2)},$$

a relation which, in general, is a fond wish of students in an introductory probability or statistics course, and perhaps of the instructors too.]

ronaf