
If $X_1,\dots,X_n \sim Bernoulli(\theta)$ independently, then $\sum_{i=1}^n X_i \sim Binomial(n, \theta)$.

I don't know how this is derived; could anyone show me a proof?

There is a similar result: if $X_i \sim Exp(1/\theta)$, then $\sum_{i=1}^n X_i \sim Gamma(n,1/\theta)$, and I believe it follows by the same method.

Many thanks!!

  • In the first case $X_i$ can be either 0 or 1, so the sum of $n$ such trials can be any integer between 0 and $n$. To calculate the probability that $\sum X_i=k$ you need to count all the ways of placing $k$ ones among the $n$ trials, which is $\frac {n!}{k!(n-k)!}$. Then $P\left(\sum X_i=k\right)=\frac {n!}{k!(n-k)!}p^k(1-p)^{n-k}$, where $p$ is the probability of a 1 in a single trial (and $p^k(1-p)^{n-k}$ is the probability of any one particular arrangement)... –  Apr 12 '14 at 09:45
  • Did you also look on wikipedia, etc. or google? This is pretty standard. – mathse Apr 12 '14 at 09:52
  • Thanks for answering the question! It seems like this is something you need to think through and then write down directly? I actually noticed that if I derive the MGF of $X_i$ and then compute the MGF of $\sum X_i$, I can match it against a known distribution and identify the sum's distribution; is that correct? (See the sketch below the comments.) – user137354 Apr 12 '14 at 09:59
  • It's pretty straightforward in the first case... Follow the link for the second: http://math.stackexchange.com/questions/655302/gamma-distribution-out-of-sum-of-exponential-random-variables –  Apr 12 '14 at 10:04
  • Thanks so much, I understand it well now. – user137354 Apr 12 '14 at 18:48
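
A minimal sketch of the MGF route raised in the comments, assuming the $X_i$ are independent (the question leaves this implicit). For $X_i \sim Bernoulli(\theta)$,

$$M_{X_i}(t) = E\left[e^{tX_i}\right] = (1-\theta) + \theta e^t, \qquad M_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n M_{X_i}(t) = \left((1-\theta) + \theta e^t\right)^n,$$

which is the MGF of a $Binomial(n,\theta)$ variable, so $\sum_{i=1}^n X_i \sim Binomial(n,\theta)$ by uniqueness of MGFs. Reading $Exp(1/\theta)$ as an exponential with rate $1/\theta$ (i.e. mean $\theta$; the question's notation leaves the parameterization ambiguous), the same argument gives

$$M_{X_i}(t) = \frac{1}{1-\theta t} \quad (t < 1/\theta), \qquad M_{\sum_{i=1}^n X_i}(t) = (1-\theta t)^{-n},$$

which is the MGF of a $Gamma(n,1/\theta)$ variable in the shape/rate parameterization, so $\sum_{i=1}^n X_i \sim Gamma(n,1/\theta)$.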
