
I believe this is a particularly neat example that we did in class. Unfortunately, there is one step I do not quite understand, which my professor had to skip due to lack of time. I think this approach might be of interest, so I will post it despite its incompleteness.

Feel free to skip directly to the question tagged with (?)


Here is the idea:

Let $X \sim \text{Poisson}(1)$, i.e. we have $P(X=k)= \frac{e^{-1}}{k!}$ for $k=0,1,2, \dots$, and furthermore $E(X)=1=\text{Var}(X)$.

Now consider a sequence of i.i.d. Poisson(1) RVs $X_1,X_2, \dots$; we then know that $S_n := X_1 + \dots + X_n \sim \text{Poisson}(n)$.

Let us now define another RV $Y_n$ by $$Y_n:= \frac{X_1 + \dots + X_n -n}{\sqrt{n}} = \frac{S_n - n}{\sqrt{n}}. $$ Then by the CLT we have $Y_n \Rightarrow \mathcal{N}(0,1)$, where the arrow denotes weak convergence, i.e. convergence in distribution. This is equivalent to saying that $$P(Y_n >x) \to P(\mathcal{N} >x) \quad \forall x \in \mathbb{R} \text{ as } n \to \infty. $$

For the LHS, Chebyshev's inequality gives, for $x>0$, $$ P(Y_n >x) = P(S_n-n > x \sqrt{n}) \leq \frac{\text{Var}(S_n)}{(x \sqrt{n})^2} = \frac{1}{x^2}. $$ Furthermore we have $\int_1^\infty 1/x^2 \, dx < \infty$, which justifies by Lebesgue dominated convergence that $$ \int_0^\infty P(Y_n >x) \, dx \to \int_0^\infty P ( \mathcal{N}>x) \, dx= \frac{1}{\sqrt{2 \pi}}\int_0^\infty y \exp \left( \frac{-y^2}{2}\right)dy = \frac{1}{\sqrt{2 \pi}}. $$

My question concerns the LHS of the above statement. Namely, my professor said that $$ \int_0^\infty P(Y_n >x) \, dx = E(Y_n^+)\overset{?}= e^{-n} \sum_{j=n+1}^\infty \frac{n^j}{j!} \left( \frac{j-n}{\sqrt{n}} \right) \overset{\checkmark}= \frac{e^{-n}}{\sqrt{n}} \frac{n^{n+1}}{n!} \tag{?} $$ Maybe I am missing something obvious here. I do understand that we want to compute the expected value of the positive part of the random variable $Y_n$, denoted by $Y_n^+$, so I believe what my professor is actually calculating is $$E(Y_n^+)=E(Y_n 1_{S_n \geq n}). $$ But I haven't dealt with such expected values in class. Can someone elaborate on the 'missing' steps?
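(For what it's worth, both the $\overset{\checkmark}=$ identity and the limit $\frac{1}{\sqrt{2\pi}}$ can be checked numerically. Here is a small sketch; the truncation length `terms` is my own arbitrary choice, taken large enough to cover the Poisson tail.)

```python
import math

def e_yn_plus_sum(n, terms=2000):
    """Truncated series  e^{-n} * sum_{j>n} (n^j / j!) * (j - n)/sqrt(n)."""
    total = 0.0
    for j in range(n + 1, n + 1 + terms):
        # work with logs to avoid overflow of n^j / j!
        log_p = -n + j * math.log(n) - math.lgamma(j + 1)
        total += math.exp(log_p) * (j - n) / math.sqrt(n)
    return total

def e_yn_plus_closed(n):
    """Closed form  (e^{-n} / sqrt(n)) * n^{n+1} / n!,  computed via logs."""
    return math.exp(-n + (n + 0.5) * math.log(n) - math.lgamma(n + 1))

for n in (10, 100, 1000):
    print(n, e_yn_plus_sum(n), e_yn_plus_closed(n))
print("limit:", 1 / math.sqrt(2 * math.pi))
```

As $n$ grows, both columns approach $1/\sqrt{2\pi} \approx 0.3989$, which is exactly the Stirling-type conclusion the argument is after.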

Spaced
  • Not an answer, but some combinatorial techniques due to Wilf allow us to prove much more (Stirling's inequality with arbitrary precision) in 20 lines or so: http://math.stackexchange.com/a/1409131/44121 – Jack D'Aurizio Jun 17 '16 at 21:48
  • Thanks for the reference. I also just noticed that there might be another weak point in this approach, because I don't see how I can rigorously justify that from $\int_1^\infty 1/x^2 dx$ the Lebesgue dominated convergence can be applied with limits of integration $0$ to $\infty$. Some extra work seems to be required here. – Spaced Jun 18 '16 at 12:08
  • This is odd, the accepted answer seems to not even touch the step (?). I thought this was the question you were asking? – Did Oct 03 '16 at 10:33
  • @Did, 3 months ago, quite some time for me to rewind. But the question above was indeed about the (?) step that I could not figure out. I reasoned out myself (as you can see in the question) what $E(Y_n^+)$ means, as I hadn't seen that notion before, and grand_chat agreed with me on that in the answer. His calculation seems sound to me. I do not have your expertise in the field of probability (by far), but I don't see how the answer should "not even touch" the desired (?) step. – Spaced Oct 03 '16 at 15:50
  • So the $\overset{\checkmark}=$ is clear? – Did Oct 03 '16 at 17:24
  • @Did, unfortunately I can't delete the above comment or edit it; somehow I managed to mess up the LaTeX. The $=$ with the checkmark above is clear to me: rewriting the sum and noticing that it is telescoping. – Spaced Oct 03 '16 at 17:47
  • OK, I understand that you got all you needed. (Flagged your comment for deletion.) – Did Oct 03 '16 at 19:14

1 Answer


The desired quantity to calculate is indeed $ E[Y_nI(S_n\ge n)].$ Since $Y_n:=\frac{S_n-n}{\sqrt n}$, this is the same as $$E\left[ \frac{S_n-n}{\sqrt n}I(S_n\ge n)\right].$$

To evaluate, use the familiar formula $$E[g(S_n)]=\sum_{j=0}^\infty g(j)P(S_n=j)\tag1$$ with the function $$ g(j):= \frac{j-n}{\sqrt n}I(j\ge n).\tag2$$ When you plug (2) into (1), every term with $j\le n$ vanishes (for $j<n$ the indicator is zero, and at $j=n$ the factor $j-n$ is zero), while for $j>n$ the indicator equals 1. This yields the RHS of ($\stackrel?=$) once you substitute the Poisson($n$) pmf for $P(S_n=j)$.
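As a sanity check (not part of the proof), here is a small Monte Carlo sketch of $E[Y_n 1(S_n\ge n)]$, simulating $S_n$ as a sum of $n$ Poisson(1) draws via Knuth's multiplication method; the sample size and seed are arbitrary choices of mine:

```python
import math
import random

def poisson1(rng):
    """One Poisson(1) draw via Knuth's multiplication method."""
    k, p = 0, rng.random()
    while p > math.exp(-1.0):
        p *= rng.random()
        k += 1
    return k

def mc_e_yn_plus(n, samples=20_000, seed=0):
    """Monte Carlo estimate of E[Y_n * 1{S_n >= n}] = E(Y_n^+)."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        s_n = sum(poisson1(rng) for _ in range(n))  # S_n ~ Poisson(n)
        if s_n >= n:
            acc += (s_n - n) / math.sqrt(n)
    return acc / samples

n = 30
# closed form  (e^{-n}/sqrt(n)) * n^{n+1}/n!,  computed via logs
closed_form = math.exp(-n + (n + 0.5) * math.log(n) - math.lgamma(n + 1))
print(mc_e_yn_plus(n), closed_form)  # both should be near 1/sqrt(2*pi)
```

The two printed values agree up to Monte Carlo noise, matching the RHS of ($\stackrel?=$).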

grand_chat
  • Very nice! Thanks a lot! Mind if I ask: is there a reason you go from $I(S_n \geq n)$ to $I(S_n > n)$ in the first step? Obviously this makes the formula as stated in the question turn out right, but is there a justification for this step? – Spaced Jun 17 '16 at 21:44
  • Sorry, I should have consistently used $S_n\ge n$. (Fixed now.) This is indeed a neat application of the CLT. Another neat application of the CLT: if you toss a fair coin $2n$ times, the probability of exactly $n$ heads is approximately $1/\sqrt{\pi n}$. – grand_chat Jun 17 '16 at 22:03
  • Thanks for the reference, I love such neat applications! – Spaced Jun 17 '16 at 22:06
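(The coin-toss fact mentioned in the comments can be checked exactly, since $P(\text{exactly } n \text{ heads in } 2n \text{ fair tosses}) = \binom{2n}{n}/4^n$; a quick sketch:)

```python
import math

def p_exact_n_heads(n):
    """P(exactly n heads in 2n fair tosses) = C(2n, n) / 4^n."""
    return math.comb(2 * n, n) / 4 ** n

# compare the exact probability with the CLT approximation 1/sqrt(pi*n)
for n in (10, 100, 1000):
    print(n, p_exact_n_heads(n), 1 / math.sqrt(math.pi * n))
```

The ratio of the two columns tends to 1 as $n$ grows, as the comment claims.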