What is the value of

$$\lim_{n\to\infty} \frac{\gamma(n,n)}{\Gamma(n)}\,,$$

where $$\gamma(s,x)=\int_0^x t^{s-1} e^{-t}\,dt$$ is the lower incomplete gamma function?

This limit is part of my attempt to measure the divergence rate of $\Gamma(x)$ and the rate at which $\gamma(x,n)$ converges to $\Gamma(x)$.
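A quick numerical sanity check of the quantity in question: SciPy's `scipy.special.gammainc(a, x)` is the regularized lower incomplete gamma function $\gamma(a,x)/\Gamma(a)$, so it evaluates the ratio directly.

```python
# Numerical look at gamma(n, n) / Gamma(n) for growing n.
# scipy.special.gammainc(a, x) is the *regularized* lower incomplete
# gamma function P(a, x) = gamma(a, x) / Gamma(a), i.e. exactly the
# ratio in the question.
from scipy.special import gammainc

for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n = {n:>6}:  gamma(n,n)/Gamma(n) = {gammainc(n, n):.6f}")
# The values decrease towards 0.5 from above (0.5421, 0.5133, ...),
# which is the limit derived in the answers below.
```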

  • With $t_{\max}(n)$ being the $t \in [0,\infty)$ where $|t^{n-1} e^{-t}|$ is maximal, it is enough to prove that $t_{\max}(n) < n-1$ because when $n \to \infty$: $\int_0^\infty t^{n-1} e^{-t}\,dt \sim \int_{t_{\max}-1}^{t_{\max}+1} t^{n-1} e^{-t}\,dt$ – reuns Mar 05 '16 at 10:27

3 Answers


One may write $$ \gamma(n,n)=\int_0^n t^{n-1} e^{-t}\,dt=\int_0^\infty t^{n-1} e^{-t}\,dt-\int_n^\infty t^{n-1} e^{-t}\,dt $$ giving $$ \frac{\gamma(n,n)}{\Gamma(n)}=1-\frac{\Gamma(n,n)}{\Gamma(n)} \tag1 $$ where $\Gamma(\cdot,\cdot)$ denotes the upper incomplete gamma function.
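As a sketch of how $(1)$ can be checked numerically: SciPy also provides the regularized upper incomplete gamma function `scipy.special.gammaincc`, and the two regularized pieces must sum to $1$.

```python
# Check identity (1): gamma(n,n)/Gamma(n) + Gamma(n,n)/Gamma(n) = 1.
# gammainc  -> regularized lower incomplete gamma P(a, x)
# gammaincc -> regularized upper incomplete gamma Q(a, x)
from scipy.special import gammainc, gammaincc

for n in [5, 50, 500, 5_000]:
    p, q = gammainc(n, n), gammaincc(n, n)
    print(f"n = {n:>5}:  P = {p:.6f},  Q = {q:.6f},  P + Q = {p + q:.12f}")
# P + Q equals 1 up to floating-point rounding, as identity (1) requires.
```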

Then, by using Stirling's approximation together with the following known asymptotic expansion: $$ \Gamma(n,n)=n^{n-1}e^{-n}\left(\sqrt{ \frac{\pi}{2}}\sqrt{n}-\frac13+\frac{\sqrt{2\pi}}{24\sqrt{n} }+\mathcal{O}\left(\frac1n\right)\right) \tag2 $$ one gets, as $n \to \infty$,

$$ \frac{\gamma(n,n)}{\Gamma(n)}=\frac12+\mathcal{O}\left(\frac1{\sqrt{n}}\right) $$

and we obtain the desired limit.
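A short numerical check of this rate: scaling the error by $\sqrt{n}$ should give a bounded quantity if the $\mathcal{O}\left(\frac1{\sqrt{n}}\right)$ term is right.

```python
# Numerical check of the rate: sqrt(n) * (gamma(n,n)/Gamma(n) - 1/2)
# should stay bounded if the error is O(1/sqrt(n)).
from scipy.special import gammainc

for n in [10, 100, 1_000, 10_000, 100_000]:
    scaled = (gammainc(n, n) - 0.5) * n**0.5
    print(f"n = {n:>6}:  sqrt(n) * (ratio - 1/2) = {scaled:.5f}")
# The scaled error settles near 0.133 ~ 1/(3*sqrt(2*pi)), so the error
# is indeed O(1/sqrt(n)), matching the refined expansion further below.
```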

Olivier Oloa
  • 120,989

Let $T_1, T_2, \ldots$ be i.i.d. exponential random variables with rate $1$. Then the sum $S_n = T_1 + \cdots + T_n$ has the gamma distribution with rate $1$ and shape parameter $n$:

$$ \Bbb{P}(S_n \leq x) = \int_{0}^{x} \frac{t^{n-1}e^{-t}}{(n-1)!} \, dt = \frac{\gamma(n, x)}{\Gamma(n)}, \quad x \geq 0. $$

Now, by the classical CLT, if $Z \sim \mathcal{N}(0, 1)$ denotes a standard normal random variable, it follows that

$$ \frac{\gamma(n, n)}{\Gamma(n)} = \Bbb{P}( S_n \leq n ) = \Bbb{P}\left( \frac{S_n - \Bbb{E}S_n}{\sqrt{\mathrm{Var}(S_n)}} \leq 0 \right) \xrightarrow[n\to\infty]{} \Bbb{P}(Z \leq 0) = \frac{1}{2}. $$
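A small Monte Carlo sketch of this argument: simulating sums of i.i.d. rate-$1$ exponentials, the empirical frequency of $\{S_n \le n\}$ agrees with $\gamma(n,n)/\Gamma(n)$ and drifts towards $1/2$.

```python
# Monte Carlo illustration: estimate P(S_n <= n) for S_n a sum of n
# i.i.d. Exp(1) variables, and compare with the exact gamma(n,n)/Gamma(n).
import numpy as np
from scipy.special import gammainc

rng = np.random.default_rng(0)
trials, chunk_size = 100_000, 10_000

for n in [10, 100, 1_000]:
    count = 0
    for _ in range(trials // chunk_size):
        # Sum n i.i.d. Exp(1) draws per trial; chunked to keep memory modest.
        s = rng.exponential(size=(chunk_size, n)).sum(axis=1)
        count += np.count_nonzero(s <= n)
    empirical = count / trials
    print(f"n = {n:>4}:  Monte Carlo {empirical:.4f}   exact {gammainc(n, n):.4f}")
# The empirical frequencies match gamma(n,n)/Gamma(n) to sampling accuracy
# (std error ~ 0.0016) and drift towards 1/2, as the CLT argument predicts.
```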

Alternatively, let $(N_t)$ be a Poisson process of rate 1. Then the distribution of $S_n$ is the same as the distribution of the $n$-th arrival time of $(N_t)$. Thus

$$ \Bbb{P}(S_n \leq n) = \Bbb{P}(N_n \geq n) = \sum_{k=n}^{\infty} \frac{n^k}{k!}e^{-n}. $$

We can prove that it converges to $1/2$ as well. For a purely analytic solution, see this answer, for instance.
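A quick check of the Poisson identity above: `scipy.stats.poisson.sf(n - 1, n)` is $\Bbb{P}(N_n \geq n)$ for $N_n \sim \mathrm{Poisson}(n)$, and it matches the regularized lower incomplete gamma value.

```python
# Check the identity P(S_n <= n) = P(N_n >= n) for N_n ~ Poisson(n).
# poisson.sf(n - 1, n) = P(N_n > n - 1) = P(N_n >= n).
from scipy.special import gammainc
from scipy.stats import poisson

for n in [10, 100, 1_000, 10_000]:
    tail = poisson.sf(n - 1, n)     # sum_{k >= n} n^k e^{-n} / k!
    ratio = gammainc(n, n)          # gamma(n, n) / Gamma(n)
    print(f"n = {n:>5}:  Poisson tail {tail:.6f}   gamma ratio {ratio:.6f}")
# The two columns coincide and both tend to 1/2.
```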

Sangchul Lee
  • 167,468

In this answer, it is shown that
$$ \begin{align} e^{-n}\sum_{k=0}^n\frac{n^k}{k!} &=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\\ &=\frac12+\frac{2/3}{\sqrt{2\pi n}}+O\left(\frac1n\right)\tag{1} \end{align} $$
Since $e^{-n}\dfrac{n^n}{n!}=\dfrac1{\sqrt{2\pi n}}+O\left(\dfrac1{n^{3/2}}\right)$ by Stirling's formula, and since $\dfrac{\Gamma(n,n)}{\Gamma(n)}=e^{-n}\sum\limits_{k=0}^{n-1}\dfrac{n^k}{k!}$ for positive integer $n$, $(1)$ says that
$$ \begin{align} \frac{\Gamma(n,n)}{\Gamma(n)} &=e^{-n}\sum_{k=0}^n\frac{n^k}{k!}-e^{-n}\frac{n^n}{n!}\\ &=\frac12-\frac{1/3}{\sqrt{2\pi n}}+O\left(\frac1n\right)\tag{2} \end{align} $$
Therefore,
$$ \frac{\gamma(n,n)}{\Gamma(n)} =1-\frac{\Gamma(n,n)}{\Gamma(n)} =\frac12+\frac{1/3}{\sqrt{2\pi n}}+O\left(\frac1n\right)\tag{3} $$
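A numerical check of $(3)$: the scaled error $\sqrt{2\pi n}\,\left(\frac{\gamma(n,n)}{\Gamma(n)}-\frac12\right)$ should approach $\frac13$.

```python
# Numerical check of (3): sqrt(2*pi*n) * (gamma(n,n)/Gamma(n) - 1/2) -> 1/3.
import math
from scipy.special import gammainc

for n in [10, 100, 1_000, 10_000, 100_000]:
    scaled = math.sqrt(2 * math.pi * n) * (gammainc(n, n) - 0.5)
    print(f"n = {n:>6}:  scaled error = {scaled:.5f}")
# The scaled error approaches 1/3 = 0.33333..., consistent with (3).
```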

robjohn
  • 345,667