
I want to prove an analysis result using a probabilistic approach: if $X \sim \Gamma(\alpha_1,\beta)$ and $Y \sim \Gamma(\alpha_2, \beta)$ are independent, then $Z = X+Y \sim \Gamma(\alpha_1 +\alpha_2, \beta)$. While proving that $Z \sim \Gamma(\alpha_1 + \alpha_2, \beta)$, I am trying to show that $$ \int_0^1 u^{\alpha_1 -1}(1-u)^{\alpha_2 -1}\,du = \frac{\Gamma(\alpha_1) \Gamma(\alpha_2)}{\Gamma(\alpha_1 +\alpha_2)}. $$ However, I can't seem to figure it out. I used the convolution formula to find the PDF of $Z$ (sketched below), but then I got stuck.
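For concreteness, here is a minimal sketch of the convolution step, assuming the rate parameterization $f_X(x) = \frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)}x^{\alpha_1-1}e^{-\beta x}$ for $x>0$ (and likewise for $Y$). For $z > 0$,
$$f_Z(z) = \int_0^z f_X(x)f_Y(z-x)\,dx = \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\,e^{-\beta z}\int_0^z x^{\alpha_1-1}(z-x)^{\alpha_2-1}\,dx,$$
and the substitution $x = zu$ turns the remaining integral into
$$z^{\alpha_1+\alpha_2-1}\int_0^1 u^{\alpha_1-1}(1-u)^{\alpha_2-1}\,du,$$
which is exactly where the integral above appears.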

Anonymous
  • Where have you looked? It is in many many textbooks, and on-line. https://en.wikipedia.org/wiki/Beta_function – GEdgar May 23 '18 at 10:33
  • I've never encountered it. I looked at this post https://math.stackexchange.com/questions/2779311/examples-of-analysis-results-using-probability-theory and I am trying to prove the second statement. – Anonymous May 23 '18 at 10:46
  • For integer $\alpha_1$ and $\alpha_2$, you could take advantage of the fact that the sum of $n$ independent identically distributed exponential random variables has a gamma distribution. – awkward May 23 '18 at 12:11
  • This is the simplest derivation I can find of what you're looking for. – Clarinetist May 23 '18 at 12:44
  • Thank you, but I am not interested in the simplest derivation. I would like to see a derivation from a different approach using probability. @Clarinetist – Anonymous May 23 '18 at 13:12
  • @awkward I know this fact, however how do you proceed from there on? – Anonymous May 23 '18 at 13:12
  • If $X$ is the sum of $\alpha_1$ iid variables and $Y$ is the sum of $\alpha_2$ variables, then $X+Y$ is the sum of... – awkward May 23 '18 at 13:50
  • Oh, sorry, I understand the question now. Why not try moment-generating functions? – Clarinetist May 23 '18 at 15:16
  • @Clarinetist How do you then proceed? – Anonymous May 23 '18 at 15:45
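For the record, the integer case suggested by awkward in the comments works as follows: if $\alpha_1$ and $\alpha_2$ are positive integers, then
$$X \stackrel{d}{=} \sum_{i=1}^{\alpha_1} E_i, \qquad Y \stackrel{d}{=} \sum_{i=\alpha_1+1}^{\alpha_1+\alpha_2} E_i, \qquad E_i \stackrel{\text{iid}}{\sim} \text{Exp}(\beta),$$
with the two batches independent, so $X+Y$ is a sum of $\alpha_1+\alpha_2$ i.i.d. $\text{Exp}(\beta)$ variables and hence has a $\Gamma(\alpha_1+\alpha_2,\beta)$ distribution.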

1 Answer


Let $X \sim \text{Gamma}(\alpha_1, \beta)$ and $Y \sim \text{Gamma}(\alpha_2, \beta)$ be independent random variables.

The moment-generating functions (MGFs) of $X$ and $Y$ are $M_{X}(t) = \left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1}$ and $M_Y(t) = \left(\dfrac{\beta}{\beta-t} \right)^{\alpha_2}$ respectively, both valid for $t < \beta$ (with $\beta$ the rate parameter).
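For completeness, a sketch of where this MGF comes from, using the same rate parameterization: for $t < \beta$,
$$M_X(t) = \int_0^\infty e^{tx}\,\frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)}x^{\alpha_1-1}e^{-\beta x}\,dx = \frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)}\int_0^\infty x^{\alpha_1-1}e^{-(\beta-t)x}\,dx = \frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)}\cdot\frac{\Gamma(\alpha_1)}{(\beta-t)^{\alpha_1}} = \left(\frac{\beta}{\beta-t}\right)^{\alpha_1},$$
where the middle integral is evaluated via the substitution $v = (\beta-t)x$ and the definition of $\Gamma(\alpha_1)$.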

Recall that $M_{X+Y}(t) = M_X(t)M_Y(t)$, because $$M_{X+Y}(t) = \mathbb{E}[e^{t(X+Y)}] = \mathbb{E}[e^{tX+tY}]=\mathbb{E}[e^{tX}e^{tY}]=\mathbb{E}[e^{tX}]\mathbb{E}[e^{tY}]=M_{X}(t)M_{Y}(t),$$ where $\mathbb{E}[e^{tX}e^{tY}]=\mathbb{E}[e^{tX}]\mathbb{E}[e^{tY}]$ holds by the independence of $X$ and $Y$.

Hence,

$$M_{X+Y}(t)=\left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1}\left(\dfrac{\beta}{\beta-t}\right)^{\alpha_2} = \left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1+\alpha_2}, \qquad t < \beta.$$ This is the MGF of a $\text{Gamma}(\alpha_1+\alpha_2, \beta)$ random variable, and since an MGF that exists on an open interval around $0$ determines the distribution uniquely, $X+Y \sim \text{Gamma}(\alpha_1+\alpha_2, \beta)$.
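To close the loop with the integral in the question: the convolution computation gives, for $z>0$,
$$f_Z(z) = \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\Gamma(\alpha_2)}\,z^{\alpha_1+\alpha_2-1}e^{-\beta z}\int_0^1 u^{\alpha_1-1}(1-u)^{\alpha_2-1}\,du,$$
while the MGF argument shows that $f_Z$ is the $\text{Gamma}(\alpha_1+\alpha_2,\beta)$ density
$$f_Z(z) = \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1+\alpha_2)}\,z^{\alpha_1+\alpha_2-1}e^{-\beta z}.$$
Equating the two expressions and cancelling the common factors yields
$$\int_0^1 u^{\alpha_1-1}(1-u)^{\alpha_2-1}\,du = \frac{\Gamma(\alpha_1)\Gamma(\alpha_2)}{\Gamma(\alpha_1+\alpha_2)}.$$

As a quick numerical sanity check (not part of the proof), the identity can be verified for arbitrary test values; here is a minimal sketch in Python with SciPy, using hypothetical parameters $\alpha_1 = 2.5$, $\alpha_2 = 3.7$:

```python
# Check: int_0^1 u^(a1-1) (1-u)^(a2-1) du  ==  Gamma(a1) Gamma(a2) / Gamma(a1+a2)
from scipy.integrate import quad
from scipy.special import gamma

a1, a2 = 2.5, 3.7  # arbitrary test values
lhs, _ = quad(lambda u: u**(a1 - 1) * (1 - u)**(a2 - 1), 0, 1)  # numerical integral
rhs = gamma(a1) * gamma(a2) / gamma(a1 + a2)                    # Gamma-function side
print(lhs, rhs)  # the two values agree to numerical precision
```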

Clarinetist