
Using an appropriate probability distribution, or otherwise, show that

$$\lim_{n\to\infty} \int_0^n e^{-x}{x^{n-1}\over(n-1)!}dx =0.5$$

Davide Giraudo
Argha

2 Answers


Let $\{X_n\}$ be a sequence of independent, identically distributed random variables with the exponential law of mean $1$; that is, a density of $X_1$ is $$f(x):=e^{-x}\chi_{\{x\geq 0\}}.$$ We want a density $f_n$ of $S_n:=\sum_{k=1}^nX_k$. We show by induction that $f_n(x)=e^{-x}\frac{x^{n-1}}{(n-1)!}\chi_{\{x\geq 0\}}$. It's true for $n=1$, and if it's true for some $n$, we use convolution:
\begin{align}
f_{n+1}(x)&=\int_{\Bbb R}f_n(t)f_1(x-t)dt\\
&=\int_{\Bbb R}e^{-t}\frac{t^{n-1}}{(n-1)!}\color{green}{\chi_{(0,+\infty)}(t)}e^{-(x-t)}\color{red}{\chi_{(0,+\infty)}(x-t)}dt\\
&=e^{-x}\int_{\color{green}0}^{\color{red}x}\frac{t^{n-1}}{(n-1)!}dt\\
&=e^{-x}\frac{x^n}{n!}.
\end{align}
Denote $I_n:=\int_0^ne^{-x}\frac{x^{n-1}}{(n-1)!}dx$. Since $X_n\geq 0$ and the integrand is a density of $S_n$, we have
\begin{align}
I_n&=P\left(\sum_{j=1}^nX_j\leq \color{red}n\right)\\
&=P\left(\sum_{j=1}^n(X_j\color{red}{-1})\leq 0\right)\\
&=P\left(\frac{\sum_{j=1}^n(X_j\color{red}{-E[X_j]})}{\sqrt n}\leq 0\right),
\end{align}
the expectation of $X_1$ being $1$. Since the set $(-\infty,0]$ has a boundary of measure $0$ and $\frac{\sum_{j=1}^n(X_j-E[X_j])}{\sqrt n}$ converges in law to a normal law of mean $0$ and variance $1$, say $N$ (this is the central limit theorem), the portmanteau theorem gives $$\lim_{n\to +\infty}I_n=P(N\leq 0)=\frac12,$$ $N$ being symmetric.
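As a quick numerical sanity check of the identity $I_n=P(S_n\leq n)$ (not part of the argument above; the sample sizes below are arbitrary), one can simulate sums of i.i.d. exponential(1) variables:

```python
import random

def prob_sn_leq_n(n, trials=10000, seed=0):
    """Monte Carlo estimate of P(S_n <= n), where S_n is a sum of n
    i.i.d. exponential(1) random variables; by the identity in the
    answer, this equals I_n (up to sampling error)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.expovariate(1.0) for _ in range(n))
        if s <= n:
            hits += 1
    return hits / trials

# The estimate drifts toward 1/2 as n grows, in line with the CLT argument.
print(prob_sn_leq_n(1))    # close to I_1 = 1 - e^{-1} ≈ 0.632
print(prob_sn_leq_n(400))  # close to 1/2
```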

Davide Giraudo

How about:

$\mathcal{L}( \int_0^n e^{-x}{x^{n-1}\over(n-1)!}dx) = \frac{1}{s}\mathcal{L}(e^{-x}{x^{n-1}\over(n-1)!}) = \frac{1}{s(s + 1)^n}$

$\mathcal{L}^{-1}(\frac{1}{s(s + 1)^n}) = 1 - \frac{\Gamma(n,n)}{\Gamma(n)}$

$\lim_{n\to\infty} 1 - \frac{\Gamma(n,n)}{\Gamma(n)} = 1 - \frac{1}{2} = \frac{1}{2}$
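For integer $n$ one has the standard Poisson-tail identity $\frac{\Gamma(n,n)}{\Gamma(n)}=e^{-n}\sum_{k=0}^{n-1}\frac{n^k}{k!}$, which makes the claimed limit easy to check numerically with the standard library alone (a sketch, not part of the answer; the values of $n$ are arbitrary):

```python
import math

def regularized_upper_gamma(n):
    """Gamma(n, n) / Gamma(n) for a positive integer n, via the identity
    Gamma(n, n)/Gamma(n) = exp(-n) * sum_{k=0}^{n-1} n**k / k!.
    Each term is evaluated in log space to avoid overflow/underflow."""
    return sum(math.exp(k * math.log(n) - n - math.lgamma(k + 1))
               for k in range(n))

# 1 - Gamma(n, n)/Gamma(n) is the original integral I_n; it tends to 1/2.
for n in (1, 10, 100, 2000):
    print(n, 1 - regularized_upper_gamma(n))
```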

Edit:

The above hack could rightly be criticized as sort of "begging the question"; it's certainly not clear that evaluating the limit involving the upper incomplete gamma function is any easier than the original problem involving the lower! So, let's try this:

Define:

$P(n) = \frac{1}{\Gamma(n)}\int_0^n e^{-x}{x^{n-1}}dx$

$Q(n) = \frac{1}{\Gamma(n)}\int_n^\infty e^{-x}{x^{n-1}}dx$

It should be fairly obvious that for all real $n > 0$:

$P(n) + Q(n) = 1$

Then:

$\mathcal{L}(P(n) + Q(n)) = \mathcal{L}(1) \implies \frac{1}{s(s + 1)^n} + \mathcal{L}Q(n) = \frac{1}{s}$

$\implies \mathcal{L}Q(n) = \frac{1}{s} - \frac{1}{s(s + 1)^n}$

$\lim_{n\to\infty} P(n) = \lim_{s\to 0}s\mathcal{L}P(n) = 1$

$\lim_{n\to 0} Q(n) = \lim_{s\to\infty}s\mathcal{L}Q(n) = 1$

So,

$\lim_{n\to\infty} (P(n) + Q(n)) = 1$

$\lim_{n\to\infty} (P(n) - Q(n)) = 0$,

from which the respective limits follow.

MattyZ