
This is a problem out of a book i'm working through: Mathematical Statistics with Applications

Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from a Poisson-distributed population with mean $\lambda$. In this case $U = \sum Y_i$ is a sufficient statistic for $\lambda$, and $U$ has a Poisson distribution with mean $n\lambda$. Use the conjugate $\operatorname{gamma}(\alpha, \beta)$ prior for $\lambda$ to do the following.

a) Show that the joint likelihood of $U$ and $\lambda$ is:
$$L(u,\lambda) = \frac{n^u}{u!\beta^{\alpha}\Gamma(\alpha)}\lambda^{u+\alpha-1}e^{\frac{-\lambda}{\big(\frac{\beta}{n\beta+1}\big)}}$$
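(As an aside, the premise that $U = \sum Y_i$ has a $\operatorname{Poisson}(n\lambda)$ distribution is easy to check by simulation; here is a minimal sketch with made-up values for $n$ and $\lambda$:)

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, values below are hypothetical
n, lam, reps = 5, 2.0, 200_000

# Draw `reps` realizations of U = sum of n iid Poisson(lam) variables.
u = rng.poisson(lam, size=(reps, n)).sum(axis=1)

# A Poisson(n*lam) variable has mean and variance both equal to n*lam.
print(u.mean(), u.var())  # both should be close to n*lam = 10
```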

Now I have worked it out in a way that I think is correct if I use $L$ like this:

$$L(y_1, y_2, ..., y_n|\lambda) = \frac{\lambda^{(\sum_{i=1}^{n}y_i+\alpha-1)}e^{\frac{-\lambda}{\big(\frac{\beta}{n\beta+1}\big)}}}{\prod_{i=1}^{n}y_i!\beta^{\alpha}\Gamma(\alpha)}$$

However, I cannot get the answer they expect when using $u$, because as I understand the question, it's actually asking for: $$L(y_1+y_2+y_3+\cdots+y_n \mid \lambda)$$

I'm not sure if it is something to do with $U$ being a sufficient statistic, but all I can say is: how in the hell do they get the $n^u$ in the numerator and the $u!$ in the denominator?

Bucephalus

1 Answer


You're unnecessarily complicating the question. You've already been told that if $U = \sum_{i=1}^n Y_i$, then $$U \sim \operatorname{Poisson}(n\lambda).$$ So in terms of the sufficient statistic $U$, the likelihood is simply $$\begin{align*} \mathcal L(u, \lambda \mid \alpha, \beta) &\propto f_U(u \mid \lambda) \, p(\lambda \mid \alpha,\beta) \\ &= e^{-n\lambda} \frac{(n \lambda)^u}{u!} \cdot \frac{\lambda^{\alpha - 1} e^{-\lambda/\beta}}{\beta^\alpha \Gamma(\alpha)} \\ &= \frac{n^u}{u! \, \beta^\alpha \Gamma(\alpha)} \lambda^{u+\alpha-1} e^{-\lambda(1/\beta + n)} . \\ \end{align*}$$ Since $1/\beta + n = (n\beta + 1)/\beta$, this demonstrates that the posterior distribution of $\lambda$ is gamma distributed with shape hyperparameter $u + \alpha$ and scale hyperparameter $\beta/(n\beta + 1)$.
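The resulting conjugate posterior, $\lambda \mid u \sim \operatorname{gamma}\!\big(u+\alpha,\ \beta/(n\beta+1)\big)$ in shape/scale form, can be checked against a brute-force grid normalization of likelihood $\times$ prior. A minimal sketch with made-up values for $n$, $u$, $\alpha$, $\beta$:

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical numbers for illustration only.
n, u = 5, 12            # sample size and observed sufficient statistic
alpha, beta = 2.0, 1.5  # gamma(alpha, beta) prior in (shape, scale) form

lam = np.linspace(1e-6, 20, 200_001)

# Unnormalized posterior: Poisson(n*lam) likelihood times gamma prior.
unnorm = stats.poisson.pmf(u, n * lam) * stats.gamma.pdf(lam, a=alpha, scale=beta)
post_grid = unnorm / trapezoid(unnorm, lam)

# Analytic conjugate posterior: gamma(u + alpha, beta / (n*beta + 1)).
post_exact = stats.gamma.pdf(lam, a=u + alpha, scale=beta / (n * beta + 1))

print(np.max(np.abs(post_grid - post_exact)))  # essentially zero
```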

heropup
  • Thanks a lot @heropup. I understand statistics just a little bit more now. Much appreciated. – Bucephalus Sep 25 '18 at 06:31
  • I have been reading a bit about sufficient statistics the last few hours, including this post that you have contributed to. https://math.stackexchange.com/questions/1186645/understanding-sufficient-statistic?rq=1 . So trying to understand the above solution more "sufficiently" a sufficient statistics allows us to estimate parameters, in this case the posterior, instead of using the samples. This is why we could simply use $u$ as $L(u|\lambda)$ instead of using the likelihood $L(y_1, y_2,...y_n|\lambda)$. Is this correct? – Bucephalus Sep 27 '18 at 10:51
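The intuition in the last comment can also be seen directly: the full-sample likelihood $L(y_1, \ldots, y_n \mid \lambda)$ and the likelihood based on $U$ alone differ only by a factor that does not involve $\lambda$ (namely $u! / (n^u \prod_i y_i!)$), so both lead to the same posterior. A quick numerical illustration with made-up values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)  # hypothetical data for illustration
n, lam_true = 5, 2.0
y = rng.poisson(lam_true, size=n)
u = y.sum()

lam = np.array([0.5, 1.0, 2.0, 4.0])  # a few trial values of lambda

# Full-sample likelihood prod_i Poisson(y_i | lam) versus the
# sufficient-statistic likelihood Poisson(u | n*lam).
full = stats.poisson.pmf(y[:, None], lam).prod(axis=0)
suff = stats.poisson.pmf(u, n * lam)

# The ratio is constant in lam: all lambda-dependence is shared,
# which is why conditioning on U alone loses nothing about lambda.
print(full / suff)
```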