15

Show using the Poisson distribution that

$$\lim_{n \to +\infty} e^{-n} \sum_{k=0}^{n}\frac{n^k}{k!} = \frac {1}{2}$$

Willie Wong
  • 73,139
wnvl
  • 3,010
  • Second hint, to supplement the Poisson hint: central limit theorem. (Is this homework?) – Did Mar 16 '12 at 22:28
  • It is not homework, just personal interest. I picked up the problem here: http://www.mymathforum.com/viewtopic.php?f=24&t=28627. – wnvl Mar 16 '12 at 22:32
  • @wnvl : You should be less formal when you ask questions here and show a little of what you've tried or where you are stuck (or admit that you don't know where to start, if that's the case). We're humans too, you know =P – Patrick Da Silva Mar 16 '12 at 22:55
  • The same question was asked here: http://www.sosmath.com/CBB/viewtopic.php?t=28258 – Martin Sleziak May 25 '12 at 11:03
  • The Poisson distribution has the properties that, if the mean is an integer, (a) the median is equal to the mean and (b) the modal values are the mean and one less than the mean. Property (a) implies that the sum in this question is at least $\frac12$ and that without its final term the sum would be less than $\frac12$, with the difference decreasing towards $0$ as $n$ increases. – Henry Dec 04 '17 at 00:15
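Henry's bounds are easy to verify numerically. Below is a minimal sketch, assuming SciPy is available (`poisson.cdf` and `poisson.pmf` from `scipy.stats`):

```python
# Check the two properties cited above for X ~ Poisson(n), n a positive integer:
#   (a) median = mean: P(X <= n) >= 1/2 while P(X <= n-1) < 1/2,
#   (b) the modes are n-1 and n: pmf(n-1) == pmf(n).
import numpy as np
from scipy.stats import poisson

for n in [3, 30, 300]:
    with_last = poisson.cdf(n, n)         # e^{-n} * sum_{k=0}^{n} n^k / k!
    without_last = poisson.cdf(n - 1, n)  # the same sum minus its final term
    two_modes = np.isclose(poisson.pmf(n, n), poisson.pmf(n - 1, n))
    print(n, with_last >= 0.5, without_last < 0.5, two_modes)
```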

1 Answer

27

By the definition of the Poisson distribution, if the expected number of occurrences of some event in a given interval is $\lambda$, then the probability that exactly $k$ such events happen is $$ \frac {\lambda^k e^{-\lambda}}{k!}. $$ Let $\lambda = n$. Then the probability that the Poisson variable $X_n$ with parameter $n$ takes a value between $0$ and $n$ is $$ \mathbb P(X_n \le n) = e^{-n} \sum_{k=0}^n \frac{n^k}{k!}. $$

If $Y_1, \dots, Y_n$ are independent with $Y_i \sim \mathrm{Poi}(1)$, then $\sum\limits_{i=1}^n Y_i \sim \mathrm{Poi}(n)$, i.e. the sum has the same distribution as $X_n$. Each $Y_i$ has mean $1$ and variance $1$, so subtracting the mean $n$ and dividing by the standard deviation $\sqrt n$ gives $$ \mathbb P(X_n \le n) = \mathbb P( Y_1 + \dots + Y_n \le n) = \mathbb P\left( \frac{Y_1 + \dots + Y_n - n}{\sqrt n} \le 0 \right). $$

By the central limit theorem, $\frac {Y_1 + \dots + Y_n - n}{\sqrt n}$ converges in distribution to the standard Gaussian $\mathscr N(0, 1)$. Since the Gaussian distribution function is continuous (in particular at $0$), convergence in distribution allows passing to the limit at the point $0$, and a standard Gaussian is $\le 0$ with probability exactly $\frac 12$. Therefore, $$ \lim_{n \to \infty} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} = \lim_{n \to \infty} \mathbb P(X_n \le n) = \lim_{n \to \infty} \mathbb P \left( \frac{Y_1 + \dots + Y_n - n}{\sqrt n} \le 0 \right) = \mathbb P(\mathscr N(0, 1) \le 0) = \frac 12. $$
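As a quick numerical sanity check, a sketch assuming SciPy is available: `scipy.stats.poisson.cdf(n, n)` computes exactly the partial sum $e^{-n}\sum_{k=0}^n n^k/k!$.

```python
# P(X_n <= n) should approach 1/2 from above as n grows.
from scipy.stats import poisson

for n in [10, 100, 1000, 10000]:
    print(f"n = {n:5d}:  e^(-n) * sum = {poisson.cdf(n, n):.6f}")
```

The printed values decrease towards $\frac12$ with a gap of order $1/\sqrt n$, consistent with both the CLT scaling above and Henry's comment under the question.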

Hope that helps,

Did
  • 279,727
  • Edited some confusion between $X_1$ and $Y_i$, just revert to the previous version if you disagree. // The end of the argument does not apply because $\sigma$ depends on $n$ hence $P(N(1,\sigma)\leqslant1)$ cannot be a limit when $n\to\infty$. The correct approach is to apply the CLT to the event $[X_n\leqslant n]=[(S_n-n)/\sqrt{n}\leqslant0]$ where $S_n=Y_1+\cdots+Y_n$ hence $(S_n-n)/\sqrt{n}$ converges in distribution to $N(0,a)$ for some positive $a$ whose value is irrelevant. – Did Mar 17 '12 at 11:54
  • Curious to know how many upvoters understand the answer... :-) – Did Mar 17 '12 at 11:56
  • @DidierPiau: Not me. Nice avatar! – Tim Mar 17 '12 at 12:04
  • @Didier Piau : $\mathbb P( \mathscr N(1,\sigma_n) \le 1 )$ does not depend on $\sigma_n$, so I understand I did things wrong because I didn't apply the CLT in the most natural way, but what I said still stands, doesn't it? – Patrick Da Silva Mar 17 '12 at 16:09
  • @Didier Piau : I've given a little bit more thought about your comment and edited my answer. I was actually worried about the $X /Y$ thing, but you made it clear. And yes, usually when we "switch to the normal approximation" we always subtract the mean and divide through by the variance... I shouldn't have lost that reflex. After reading a little bit on the CLT again I got back on track and agreed with you. I edited my answer to reflect that. – Patrick Da Silva Mar 17 '12 at 16:13
  • Good job. +1. $ $ – Did Mar 17 '12 at 16:33
  • @PatrickDaSilva Did you switch lim and P? If so, why are you allowed to do that? If not, what did you do in the penultimate step? – BCLC Aug 07 '15 at 22:50
  • @BCLC : That's precisely the CLT: a sum of i.i.d. variables (minus its mean, divided by its standard deviation) converges in distribution to the normal distribution, that is, $\lim_{n \to \infty} \mathbb P \left( \frac {Y_1 + \cdots + Y_n - n\mu}{\sigma \sqrt n} \le x \right) = \mathbb P \left(Z \le x \right)$ when the $Y_i$ follow some distribution of mean $\mu$ and variance $\sigma^2$ and $Z \sim \mathcal N(0,1)$. So I did not really switch $\lim$ and $\mathbb P$ properly speaking, I just applied the CLT; the CLT only guarantees convergence in distribution. – Patrick Da Silva Aug 08 '15 at 03:02
  • @PatrickDaSilva You mean this and then plug in x = 0? – BCLC Aug 08 '15 at 03:16
  • @BCLC : Yes, because the normal distribution is given by a smooth (and in particular continuous) density. – Patrick Da Silva Aug 08 '15 at 03:17
  • Is there any other way to prove that a sum of $n$ Poisson$(1)$ variables has the same distribution as a single Poisson$(n)$ variable, without using characteristic functions? Using characteristic functions is easy, but I'm curious whether there's another way to prove the distributions are the same. – james black Apr 22 '18 at 07:28
  • @james black : if you recall the intuition behind the Poisson distribution, it is somewhat obvious. The Poisson distribution gives the probability that $k$ events happen within a given time interval. When you add two independent Poisson variables, the logic remains the same (by independence and time-homogeneity) and the expectations add up (see the convolution computation below)... ;) – Patrick Da Silva Apr 22 '18 at 10:27
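For reference, the direct route raised in the last two comments can be made precise without characteristic functions, by convolution: if $Y_1 \sim \mathrm{Poi}(\lambda_1)$ and $Y_2 \sim \mathrm{Poi}(\lambda_2)$ are independent, then for every integer $k \ge 0$, $$ \mathbb P(Y_1 + Y_2 = k) = \sum_{j=0}^{k} \frac{\lambda_1^j e^{-\lambda_1}}{j!} \cdot \frac{\lambda_2^{k-j} e^{-\lambda_2}}{(k-j)!} = \frac{e^{-(\lambda_1 + \lambda_2)}}{k!} \sum_{j=0}^{k} \binom{k}{j} \lambda_1^j \lambda_2^{k-j} = \frac{(\lambda_1 + \lambda_2)^k e^{-(\lambda_1 + \lambda_2)}}{k!} $$ by the binomial theorem, so $Y_1 + Y_2 \sim \mathrm{Poi}(\lambda_1 + \lambda_2)$. Induction then gives $Y_1 + \cdots + Y_n \sim \mathrm{Poi}(n)$ for i.i.d. $Y_i \sim \mathrm{Poi}(1)$.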