This is a problem from a book I'm working through, *Mathematical Statistics with Applications*:
Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from a Poisson-distributed population with mean $\lambda$. In this case $U = \sum Y_i$ is a sufficient statistic for $\lambda$, and $U$ has a Poisson distribution with mean $n\lambda$. Use the conjugate $\text{gamma}(\alpha, \beta)$ prior for $\lambda$ to do the following.
a) Show that the joint likelihood of $U$ and $\lambda$ is:
$$L(u,\lambda) = \frac{n^u}{u!\beta^{\alpha}\Gamma(\alpha)}\lambda^{u+\alpha-1}e^{\frac{-\lambda}{\big(\frac{\beta}{n\beta+1}\big)}}$$
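For reference, the $\text{gamma}(\alpha, \beta)$ prior density I'm working with (the same scale parameterisation the book uses) is
$$g(\lambda) = \frac{\lambda^{\alpha-1}e^{-\lambda/\beta}}{\beta^{\alpha}\Gamma(\alpha)}, \qquad \lambda > 0.$$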
Now, I have worked it out in a way that I think is correct if I write the likelihood $L$ in terms of the individual observations, like this:
$$L(y_1, y_2, ..., y_n|\lambda) = \frac{\lambda^{(\sum_{i=1}^{n}y_i+\alpha-1)}e^{\frac{-\lambda}{\big(\frac{\beta}{n\beta+1}\big)}}}{\prod_{i=1}^{n}y_i!\beta^{\alpha}\Gamma(\alpha)}$$
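The way I get this is by multiplying the joint pmf of the sample (the product of the individual Poisson probabilities) by the prior density:
$$\prod_{i=1}^{n}\frac{\lambda^{y_i}e^{-\lambda}}{y_i!} \times \frac{\lambda^{\alpha-1}e^{-\lambda/\beta}}{\beta^{\alpha}\Gamma(\alpha)} = \frac{\lambda^{\left(\sum_{i=1}^{n}y_i+\alpha-1\right)}e^{-\lambda\left(n+\frac{1}{\beta}\right)}}{\prod_{i=1}^{n}y_i!\,\beta^{\alpha}\Gamma(\alpha)},$$
and since $n+\frac{1}{\beta} = \frac{n\beta+1}{\beta}$, the exponential term matches what I wrote above.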
However, I cannot get the answer they expect when using $u$, because as I understand the question, it is actually asking for: $$L(y_1+y_2+y_3+\cdots+y_n \mid \lambda)$$
I'm not sure if this has something to do with $U$ being a sufficient statistic, but all I can ask is: how in the hell do they get the $n^u$ in the numerator and the $u!$ in the denominator?
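In case it helps, here is a quick numerical sanity check I ran (just a sketch using scipy with made-up example values, nothing from the book) showing that my expression above does agree with the direct product of the individual Poisson probabilities and the prior:

```python
import math
import numpy as np
from scipy.stats import poisson, gamma as gamma_dist
from scipy.special import gamma as gamma_fn

# Made-up example values, only for checking the algebra numerically.
y = np.array([2, 0, 3, 1, 4])         # a fake Poisson sample
n = len(y)
lam, alpha, beta = 1.7, 2.0, 1.5      # trial lambda and prior hyperparameters

# Direct product: joint pmf of the sample times the gamma(alpha, beta) prior.
# (scipy's gamma uses a shape/scale parameterisation, so scale=beta matches.)
direct = np.prod(poisson.pmf(y, lam)) * gamma_dist.pdf(lam, alpha, scale=beta)

# My closed form: lambda^(sum y + alpha - 1) * exp(-lambda*(n*beta + 1)/beta)
#                 / (prod y_i! * beta^alpha * Gamma(alpha))
closed = (lam ** (y.sum() + alpha - 1)
          * math.exp(-lam * (n * beta + 1) / beta)
          / (np.prod([math.factorial(int(k)) for k in y])
             * beta ** alpha * gamma_fn(alpha)))

print(direct, closed)  # the two values agree to floating-point precision
```

So the algebra in my version seems fine; it's the step to the $u$-based form that I can't reproduce.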