
Suppose $X_i$ is the time it takes for someone to complete one attempt at a task, where the $X_i$ are i.i.d. exponential random variables with parameter $\lambda$. Each attempt is successful with probability $p$. We assume the attempts are independent of one another and independent of the durations.

I have modeled the total time as the following:

$S_N = X_1 + X_2 + \cdots + X_N$, where $P(N=n) = p(1-p)^{n-1}$. Additionally, the moment generating function of each $X_i$ is given by $M_{X_i}(t) = \frac{1}{1-\lambda t}$.

Is it incorrect intuition to just proceed as follows:

$$M_{S_N}(t) = \left(\prod_i M_{X_i}(t)\right) \cdot p(1-p)^{n-1}$$
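For concreteness, the setup can be simulated directly, treating $\lambda$ as the rate of each attempt's duration (the values $\lambda = 2$ and $p = 0.3$ below are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
lam, p = 2.0, 0.3  # illustrative rate and success probability

def total_time():
    """Attempt the task until the first success; return the total elapsed time."""
    t = 0.0
    while True:
        t += rng.exponential(1.0 / lam)  # duration of one attempt, Exp(rate lam)
        if rng.random() < p:             # the attempt succeeds with probability p
            return t

times = np.array([total_time() for _ in range(100_000)])
print(times.mean())  # sample mean of the total time until success
```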

2 Answers


We observe that
$$M_S(t) = \operatorname{E}[e^{tS}] = \operatorname{E}[\operatorname{E}[e^{tS} \mid N]],$$
where the outer expectation is taken with respect to $N$ and the inner with respect to $S$ conditioned on $N$. Since
$$e^{tS} \mid N = \prod_{i=1}^N e^{t X_i},$$
we have
$$\operatorname{E}[e^{tS} \mid N] \overset{\text{ind}}{=} \prod_{i=1}^N \operatorname{E}[e^{tX_i}] = \left(M_X(t)\right)^N = \left(\frac{\lambda}{\lambda-t}\right)^N,$$
where $X$ is an exponential random variable parametrized by rate $\lambda$. Consequently,
$$M_S(t) = \operatorname{E}\left[e^{N \log (\lambda/(\lambda - t))}\right] = M_N\left(\log \frac{\lambda}{\lambda - t}\right),$$
where $N \sim \operatorname{Geometric}(p)$ with parametrization
$$\Pr[N = n] = (1-p)^{n-1} p, \quad n = 1, 2, \ldots.$$
We require $N$ to have support beginning at $1$, since it is not possible to complete the task in $N = 0$ attempts. This parametrization has MGF
$$M_N(t) = \frac{pe^t}{1 - (1-p)e^t},$$
from which we obtain
$$M_S(t) = \frac{\frac{p\lambda}{\lambda - t}}{1 - \frac{(1-p)\lambda}{\lambda - t}} = \frac{p\lambda}{p\lambda - t},$$
which is the MGF of an exponential distribution with rate $p\lambda$, as desired.
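As a numerical sanity check (not part of the derivation), one can simulate $S$ directly. Conditional on $N = n$, $S$ is a sum of $n$ i.i.d. $\operatorname{Exp}(\lambda)$ draws, i.e. a $\operatorname{Gamma}(n, \lambda)$ random variable, so the whole mixture can be drawn in a vectorized way; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, trials = 2.0, 0.3, 200_000  # illustrative values

# N ~ Geometric(p) with support {1, 2, ...}; conditional on N = n,
# S is a sum of n Exp(rate lam) draws, i.e. Gamma(shape=n, scale=1/lam)
n = rng.geometric(p, size=trials)
s = rng.gamma(shape=n, scale=1.0 / lam)

# Exponential(rate p*lam) predicts mean 1/(p*lam) and survival P(S > x) = exp(-p*lam*x)
print(s.mean())          # ≈ 1/(p*lam) ≈ 1.667
print((s > 1.0).mean())  # ≈ exp(-p*lam) ≈ 0.549
```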

heropup

You know that the moment generating function characterises the distribution, and that the expected value of a product of functions of independent random variables is the product of the expected values. Then, conditional on $N = n$: $$M_S(t)=E[e^{tS}]=E[e^{t\sum_{i=1}^n X_i}]=\prod_{i=1}^n E[e^{tX_i}]=\left(\frac{1}{1-\lambda t}\right)^n$$ where the last expression is the MGF of a Gamma distribution with parameters $n$ and $\lambda$.

In general, a sum of $k$ i.i.d. exponential random variables with parameter $\beta$ is a Gamma with parameters $(k,\beta)$.
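This Gamma fact is easy to check by simulation: summing $k$ i.i.d. exponential draws per trial should reproduce the Gamma's mean $k/\beta$ and variance $k/\beta^2$ (the values of $k$ and $\beta$ below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
k, beta, trials = 5, 2.0, 200_000  # illustrative values

# Each row holds k i.i.d. Exp(rate beta) durations; row sums should be Gamma(k, beta)
sums = rng.exponential(1.0 / beta, size=(trials, k)).sum(axis=1)

# Gamma(k, rate beta) has mean k/beta and variance k/beta**2
print(sums.mean())  # ≈ k/beta = 2.5
print(sums.var())   # ≈ k/beta**2 = 1.25
```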

Cuoredicervo
  • Yes, thank you! However, I was wondering how the $p$ parameter (probability of success at each attempt) or $p(1-p)^{n-1}$ is incorporated into this. After all, I need to show that the total time before the problem is solved has an exponential distribution. –  Jan 30 '17 at 20:25