
I don't know how to solve Exercise 8, Section 5.2 from Geoffrey G. Grimmett, David R. Stirzaker, Probability and Random Processes, Oxford University Press 2001. For those who don't have this book:

Let $X$ have a Poisson distribution with parameter $\Lambda$, where $\Lambda$ is exponential with parameter $\mu$. Show that $X$ has a geometric distribution.


$X \sim \mathrm{Poisson}(\Lambda),\ \ \Lambda \sim \mathrm{Exp}(\mu)$.

So we know that the generating function of $X$ is $G_X(s) = \sum_{i=0}^{\infty} s^i \frac{\Lambda^i}{i!} e^{-\Lambda} = e^{\Lambda(s-1)}$.

The probability density function of $\Lambda$ is $f_{\Lambda}(x) = \mu e^{-\mu x}$ for $x > 0$.

And I don't know what I should do next. How do I deal with $\Lambda$ in $G_X$ (or maybe this is not a good idea)?

Thanks in advance for your help.

2 Answers


For every nonnegative integer $n$, $$\mathbb P(X=n\mid\Lambda)=\mathrm e^{-\Lambda}\frac{\Lambda^n}{n!},$$ hence $$ \mathbb P(X=n)=\mathbb E(\mathbb P(X=n\mid\Lambda))=\int_0^{+\infty}\left(\mathrm e^{-\lambda}\frac{\lambda^n}{n!}\right)\,f_\Lambda(\lambda)\,\mathrm d\lambda=\int_0^{+\infty}\left(\mathrm e^{-\lambda}\frac{\lambda^n}{n!}\right)\,\mu\mathrm e^{-\mu\lambda}\,\mathrm d\lambda, $$ where the first equality comes from the law of total expectation (see also "Can we prove the law of total probability for continuous distributions?"). The change of variable $x=(1+\mu)\lambda$ in the rightmost integral yields $$ \mathbb P(X=n)=\frac{\mu}{(1+\mu)^{n+1}}\int_0^{+\infty}\mathrm e^{-x}\frac{x^n}{n!}\,\mathrm dx=\frac{\mu}{(1+\mu)^{n+1}}, $$ since the remaining integral equals $\Gamma(n+1)/n!=1$. To sum up, $$ \mathbb P(X=n)=(1-p)p^n,\qquad p=\frac1{1+\mu}. $$ That is, the distribution of $X$ is geometric with parameter $p$.
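As a quick sanity check (an addendum, not part of the original answer), here is a small Monte Carlo sketch in Python: sample $\Lambda \sim \mathrm{Exp}(\mu)$, then $X \sim \mathrm{Poisson}(\Lambda)$, and compare the empirical pmf of $X$ with $(1-p)p^n$, $p = 1/(1+\mu)$. The value of `mu` is chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0
p = 1.0 / (1.0 + mu)

# Exp(mu) has rate mu, i.e. mean 1/mu, so scale = 1/mu here.
lam = rng.exponential(scale=1.0 / mu, size=200_000)
x = rng.poisson(lam)  # X | Lambda ~ Poisson(Lambda)

for n in range(5):
    empirical = np.mean(x == n)
    theoretical = (1 - p) * p**n
    print(f"P(X={n}): empirical {empirical:.4f}, geometric {theoretical:.4f}")
```

With 200,000 samples the empirical frequencies agree with the geometric pmf to roughly two decimal places.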

Did
  • After the change of variable I get a different result, please correct me: with $x = \mu \lambda$, $dx = \mu\, d\lambda$, $$\int_0^{\infty} e^{-\frac{x}{\mu}} \frac{x^n}{\mu^n n!}\, \mu e^{-x} \frac{1}{\mu}\, dx = \frac{1}{\mu^n} \int_0^{\infty} e^{-x} \frac{x^n}{n!} e^{-\frac{x}{\mu}}\, dx$$ – user52354534 Jan 18 '13 at 17:26
  • And why does $\int_0^{\infty} e^{-x} \frac{x^n}{n!} dx$ equal $1$ ? – user52354534 Jan 18 '13 at 17:45
  • Oh, I see - this is the Gamma function, so I guess there is no simple explanation? :) Could you also explain to me the first two equalities: $$P(X=n\mid\Lambda) = e^{-\Lambda} \frac{\Lambda^n}{n!}, \qquad P(X=n) = \int_0^{\infty} e^{-\lambda} \frac{\lambda^n}{n!} f_{\Lambda}(\lambda)\, d\lambda.$$ What formula or theorem is it? – user52354534 Jan 18 '13 at 18:08
  • The correct change of variable is $x=(1+\mu)\lambda$, sorry about that. – Did Jan 18 '13 at 20:45
  • Regarding the formula for $P(X=n\mid\Lambda)$, I wonder what is to explain. Here is a question: how would you translate the hypothesis that the distribution of $X$ is Poisson with parameter $\Lambda$? – Did Jan 18 '13 at 20:47
  • I think I needed that: http://en.wikipedia.org/wiki/Marginal_distribution But it is still not entirely clear to me... To answer your question, maybe something like this: $P(X = n, \Lambda = \lambda)$? Why do we translate it as a conditional probability, not as a joint probability? – user52354534 Jan 19 '13 at 00:30
  • For one thing, because $P(X=n,\Lambda=\lambda)=0$ for every $(n,\lambda)$. // Are you telling me that you were asked to solve this without having been exposed to the notion of conditional distribution? – Did Jan 19 '13 at 07:52
  • Why does the first equality $P(X=n)=E(P(X=n\mid\Lambda))$ hold? – Little Rookie Aug 29 '17 at 23:17
  • @LittleRookie General fact about conditional expectations/probabilities: $$E(P(A\mid Z))=P(A)$$ just like $$E(E(Y\mid Z))=E(Y)$$ – Did Aug 29 '17 at 23:19
  • Is there a link where I can learn more about the equality? – Little Rookie Aug 29 '17 at 23:25
  • @LittleRookie https://en.wikipedia.org/wiki/Law_of_total_expectation – Did Aug 29 '17 at 23:35
  • Thanks, can I have some intuition on the equality? – Little Rookie Sep 18 '17 at 08:41
  • @LittleRookie Not sure what you are really asking for. If you have a separate question, please ask it as another post. – Did Sep 18 '17 at 13:50
  • Hi! Can you explain how you computed this integral? I don't understand how to integrate $$\int_{0}^{\infty} \frac{e^{-x} x^n}{n!}\, dx$$ – Matheus Sousa Jul 05 '22 at 19:46
  • @MatheusSousa The gamma function is defined by $\Gamma(z)=\int_0^\infty x^{z-1}e^{-x}\ \mathsf dx$, and satisfies $\Gamma(n+1)=n!$ for nonnegative integers $n$. So the given integral is simply $\Gamma(n+1)/n! = 1$. – Math1000 Oct 13 '23 at 03:03

Be careful, $\Lambda$ is a random variable! So your computation only shows that $$ E[s^X \mid \Lambda] = \sum_{n=0}^\infty \frac{(s\Lambda)^n}{n!}e^{-\Lambda} = e^{\Lambda (s-1)}. $$

Now you should be able to compute $$G_X(s) = E[s^X] = E\left[E[s^X\mid \Lambda]\right].$$
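As an addendum (not part of the original answer), the suggested computation can be checked symbolically with sympy: averaging $e^{\Lambda(s-1)}$ over $\Lambda \sim \mathrm{Exp}(\mu)$ gives $G_X(s) = \mu/(\mu + 1 - s)$, which is exactly the pgf of a geometric distribution on $\{0, 1, 2, \dots\}$ with $p = 1/(1+\mu)$. (The `conds='none'` argument tells sympy to assume convergence, i.e. $s < 1 + \mu$.)

```python
import sympy as sp

lam, s, mu = sp.symbols('lambda s mu', positive=True)

# G_X(s) = E[E[s^X | Lambda]] = E[e^{Lambda(s-1)}], Lambda ~ Exp(mu)
G = sp.integrate(sp.exp(lam * (s - 1)) * mu * sp.exp(-mu * lam),
                 (lam, 0, sp.oo), conds='none')
G = sp.simplify(G)  # should simplify to mu/(mu + 1 - s)

# pgf of a geometric distribution on {0, 1, 2, ...} with p = 1/(1 + mu)
p = 1 / (1 + mu)
geom_pgf = (1 - p) / (1 - p * s)

print(sp.simplify(G - geom_pgf))  # prints 0: the two pgfs agree
```

Since the pgf determines the distribution, this confirms that $X$ is geometric with parameter $p = 1/(1+\mu)$, matching the other answer.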

Siméon
  • I don't quite understand the second equality, could you explain it? Why is $s^X = E(s^X \mid \Lambda)$? Provided that, $G_X(s) = E(e^{\Lambda (s-1)}) = \int e^{x(s-1)} \mu e^{-\mu x} dx = \frac {\mu e^{x(s-1-\mu)}}{s-1-\mu}$. Is it okay? If yes, how do I deduce that $X$ has a geometric distribution? – user52354534 Jan 17 '13 at 22:26
  • I never wrote such a thing as $s^X = E(s^X\mid\Lambda)$. I am only using the very basic property of conditional expectation $E[Z]=E\left[E[Z\mid \mathcal{F}]\right]$. Besides, the value of the integral cannot depend on the variable of integration $x$... – Siméon Jan 18 '13 at 11:21