
Is there an easy way to calculate the distribution of the sum of two binomial random variables when their success probabilities are different from each other?

I mean that $X \sim Bin(n,p_0)$, $Y \sim Bin(m, p_1)$, $Z = X+Y$ with $p_0 \neq p_1$, and I would like to calculate the distribution function of $Z$.

I know the distribution of a sum of random variables can be obtained by convolution, but I wonder whether there is an easier way. For example, $Z \sim Bin(m+n, p)$ if $p = p_0 = p_1$. Is there a similar formula in the case $p_0 \neq p_1$?
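For reference, the convolution itself is straightforward to compute exactly. The sketch below (names and parameter values are illustrative, not from the question) evaluates $P(Z=z) = \sum_k P(X=k)P(Y=z-k)$ directly:

```python
from math import comb

def binom_pmf(k, n, p):
    """PMF of Bin(n, p) at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def sum_pmf(z, n, p0, m, p1):
    """PMF of Z = X + Y at z, for independent X ~ Bin(n, p0) and
    Y ~ Bin(m, p1), computed by direct convolution."""
    return sum(binom_pmf(k, n, p0) * binom_pmf(z - k, m, p1)
               for k in range(max(0, z - m), min(n, z) + 1))

# Sanity check: when p0 == p1 == p, the convolution reduces to Bin(n + m, p).
n, m, p = 4, 6, 0.3
assert all(abs(sum_pmf(z, n, p, m, p) - binom_pmf(z, n + m, p)) < 1e-12
           for z in range(n + m + 1))
```

So even without a closed-form family for $p_0 \neq p_1$, the exact PMF is an $O(n)$ sum per point.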

2 Answers


No. Sorry.

We look at the probability generating functions. $$\begin{align} \Pi_X(s) & = \mathsf E(s^X) \\[0ex] & = (sp_0+(1-p_0))^n \\[2ex] \Pi_Y(s) & = (sp_1+(1-p_1))^m \\[2ex] \Pi_{X+Y}(s) & = \mathsf E(s^{X+Y}) \\[0ex] & = \Pi_X(s)\Pi_Y(s) \\[0ex] & = (sp_0+(1-p_0))^n(sp_1+(1-p_1))^m \end{align}$$ Now, if $p=p_0=p_1$ then we would immediately have $\;\Pi_{X+Y}(s) = (sp+(1-p))^{m+n}\;$, which indicates that $X+Y\sim\mathcal{Bin}(m+n, p)$.

Unfortunately we clearly don't have as nice a result for $p_0\neq p_1$.
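One concrete way to see this (the numbers below are just an illustration, not from the answer): if $Z$ were $Bin(N, q)$, matching the first two moments would force $Nq = \mathsf E[Z]$ and $Nq(1-q) = \mathsf{Var}(Z)$, hence $q = 1 - \mathsf{Var}(Z)/\mathsf E[Z]$ and $N = \mathsf E[Z]/q$, which is generally not an integer:

```python
# Illustrative parameters with p0 != p1.
n, p0, m, p1 = 4, 0.2, 6, 0.7

mean = n*p0 + m*p1                    # E[Z] = 0.8 + 4.2 = 5.0
var = n*p0*(1-p0) + m*p1*(1-p1)       # Var(Z) = 0.64 + 1.26 = 1.9

# If Z ~ Bin(N, q), then Var = mean*(1-q), so:
q = 1 - var/mean                      # 0.62
N = mean/q                            # about 8.06 -- not an integer

assert abs(N - round(N)) > 0.05       # no binomial matches both moments
```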

Graham Kemp

We can also consider the moment generating function. $$\begin{align} M_Z(t) & = \mathbb E(e^{t(X+Y)}) \\[0ex] & = \mathbb E(e^{tX}e^{tY}) \quad \text{(since $X$ and $Y$ are independent)} \\[0ex] & = M_X(t)M_Y(t) \\[2ex] & = (1-p_0+p_0e^t)^n (1-p_1+p_1e^t)^m \end{align}$$

This is the same result we get via convolution. The MGF lets us read off the parameters of the distribution of a sum of random variables; it is usually applied to sums of i.i.d. RVs, which are much easier to handle. Note, however, that this method does not apply to heavy-tailed distributions such as the lognormal, which do not have an MGF.
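The MGF product formula can be checked numerically against the convolution it summarizes. The sketch below (parameter values chosen arbitrarily for illustration) compares $M_X(t)M_Y(t)$ with $\mathbb E[e^{tZ}]$ computed by brute-force enumeration:

```python
from math import comb, exp

def binom_pmf(k, n, p):
    """PMF of Bin(n, p) at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p0, m, p1, t = 3, 0.2, 5, 0.7, 0.4

# MGF of Z = X + Y from the product formula:
mgf_product = (1 - p0 + p0*exp(t))**n * (1 - p1 + p1*exp(t))**m

# MGF computed directly as E[e^{t(X+Y)}] over the joint PMF:
mgf_direct = sum(exp(t*(i + j)) * binom_pmf(i, n, p0) * binom_pmf(j, m, p1)
                 for i in range(n + 1) for j in range(m + 1))

assert abs(mgf_product - mgf_direct) < 1e-12
```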

james42