
I am trying to compute, for fixed $x,t\in \Bbb R_+$,

$$f(x,t)=\lim_{z\to \infty} e^{-zt}\sum_{k\le zx} \frac{(zx)^k}{k!}. $$

My attempt:
Since $$e^x =\sum_{k\ge 0} \frac{x^k}{k!},$$

I concluded that $$f(x,t)=\lim_{z\to \infty} e^{-zt}\sum_{k\le zx} \frac{(zx)^k}{k!} = 0. $$ My professor told me that this is wrong.

I don't see how to arrive at the correct answer. Any help?

Guy Fsone
  • 23,903

1 Answer


So your function is: \begin{align} f(x,t) = \lim_{n\to \infty} e^{n(x-t)} g_n(x), \end{align} where $g_n(x)=\sum_{k\leq nx} \frac{e^{-nx}(nx)^k}{k!}$.

The behavior of $g_n(x)$ can be obtained from the Central Limit Theorem. Let $Y_i \sim \text{Poi}(x)$ be i.i.d., and note that $\sum_{i=1}^n Y_i \sim \text{Poi}(nx)$. By the CLT: \begin{align} g_n(x)= \mathbb{P}\left(\sum_{i=1}^n Y_i\leq nx\right) = \mathbb{P}\left(\frac{\sum_{i=1}^n Y_i -nx }{\sqrt{nx}}\leq 0 \right ) \to \Phi(0)= \frac{1}{2} \quad \text{as } n \to \infty, \end{align} where $\Phi$ is the CDF of the standard normal distribution. Since $g_n(x)\to\frac{1}{2}$ while the prefactor $e^{n(x-t)}$ tends to $0$, $1$, or $\infty$ according as $x<t$, $x=t$, or $x>t$, we obtain: \begin{align} \tag{1} f(x,t) = \begin{cases} \frac{1}{2} & \text{ if } x=t \\ 0 & \text{ if } x<t \\ \infty & \text{ if } x>t \end{cases} \end{align}
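One can sanity-check the CLT step numerically. The sketch below (function name is my own) evaluates $g_n(x)=\mathbb{P}(\text{Poi}(nx)\leq nx)$ using only the standard library, summing the Poisson pmf in log-space so that nothing underflows:

```python
import math

def poisson_cdf_at_mean(lam: float) -> float:
    """P(Poisson(lam) <= floor(lam)), with each pmf term computed in log-space."""
    m = math.floor(lam)
    total = 0.0
    for k in range(m + 1):
        log_pmf = -lam + k * math.log(lam) - math.lgamma(k + 1)
        total += math.exp(log_pmf)
    return total

# g_n(x) = P(sum of n i.i.d. Poi(x) <= nx) = P(Poi(nx) <= nx) -> 1/2
for n in (100, 1000, 10000):
    x = 2.0
    print(n, poisson_cdf_at_mean(n * x))
# the values decrease toward 1/2 as n grows; the prefactor e^{n(x-t)}
# then decides f: 0 for x < t, 1/2 at x = t, infinity for x > t
```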

Edit I:

Unfortunately I do not know how to prove (1) without probability theory, but I can show with only elementary tools that the claim \begin{align}\tag{2} f(x,t) = \mathbf{1}_{[0,x]}(t) \end{align} is certainly not true. Consider the point $(x,t)=(1,\ln\frac 3 2 )$. Since $0<\ln \frac 3 2 <1$, claim (2) would give $f(1,\ln\frac 3 2)=1$. But for $n>2$ we have: \begin{align} e^{-n\ln(\frac 3 2)} \sum_{k=0}^n \frac{n^k}{k!} &= \left( \frac{2}{3}\right)^n \left( 1 + \sum_{k=1}^n \frac{n^k}{k!}\right) \\ & =\left( \frac{2}{3}\right)^n \left( 1 + \sum_{k=1}^n \frac{n\cdot n \cdot ... \cdot n}{k!}\right) \\ &\geq \left( \frac{2}{3}\right)^n \left( 1 + \sum_{k=1}^n \frac{(n-k+1)\cdot (n-k+2) \cdot ... \cdot (n-1)\cdot n}{k!}\right) \\ & =\left( \frac{2}{3}\right)^n \left( 1 + \sum_{k=1}^n \frac{n!}{k!(n-k)!}\right) \\ & = \left( \frac{2}{3}\right)^n \sum_{k=0}^n\binom{n}{k}\\ & = \left( \frac{2}{3}\right)^n 2^n = \left(\frac 4 3 \right)^n \end{align} So $f(1,\ln(3/2)) = \lim_{n\to\infty} e^{-n\ln(\frac 3 2)} \sum_{k=0}^n \frac{n^k}{k!} \geq \lim_{n\to\infty} \left(\frac 4 3 \right)^n = \infty$, which shows that (2) is false.
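The growth in this counterexample can be verified numerically; the sketch below (helper names are my own) works in log-space so the partial exponential sums do not overflow:

```python
import math

def log_partial_exp_sum(n: int) -> float:
    """log( sum_{k=0}^n n^k / k! ), computed via log-sum-exp."""
    logs = [k * math.log(n) - math.lgamma(k + 1) for k in range(n + 1)]
    m = max(logs)
    return m + math.log(sum(math.exp(v - m) for v in logs))

def log_f_term(n: int) -> float:
    """log of e^{-n ln(3/2)} * sum_{k<=n} n^k/k!, the term at (x,t) = (1, ln(3/2))."""
    return -n * math.log(1.5) + log_partial_exp_sum(n)

# the bound in the answer says log_f_term(n) >= n * log(4/3), hence -> infinity
for n in (10, 50, 200):
    print(n, log_f_term(n), n * math.log(4 / 3))
```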

Edit II:

Now an approach without using probability. It works not for all $x \in \mathbb{R}^+$ but for all $x \in \mathbb{Z}^+$; maybe someone else can make this proof general for all $x\in\mathbb{R}^+$. We determine the limit of $g_n(x)$ by asymptotics. One can prove the following by induction (integrating by parts): \begin{align} g_n(x) = \sum_{k=0}^{nx} e^{-nx}\frac{(nx)^k}{k!} = \frac{\Gamma(nx+1,nx)}{(nx)!} \end{align} where $\Gamma(s,a)$ is the upper incomplete Gamma function, here: \begin{align} \Gamma(nx+1,nx) = \int^\infty_{nx} t^{nx}e^{-t}dt \end{align} We consider the asymptotics as $n\to \infty$; since then $nx \to \infty$ with $nx\in\mathbb{Z}^+$, it suffices to study $\Gamma(n +1 ,n)$ as $n\to \infty$. \begin{align} \Gamma(n+1,n) &= \int^\infty_{n} t^{n}e^{-t}dt \stackrel{(ns+n=t)}{=} n\int^\infty_0 (ns+n)^n e^{-ns-n} ds \\ &= e^{-n} n^{n+1} \int^\infty_0 (s+1)^ne^{-ns} ds\\ &= e^{-n} n^{n+1} \int^\infty_0 e^{-ns+n\ln(s+1)} ds \\ &= e^{-n} n^{n+1} \int^\infty_0 e^{-n(s-\ln(s+1))} ds \end{align}

One easily checks that $h(s) = s-\ln(s+1)$ is increasing on $[0,\infty)$ with $h(0) = h'(0) = 0$. That means the main contribution as $n \to \infty$ comes from near $s=0$ (Laplace's method). Choose $\phi(n)=n^{-2/5}$, so that $\phi(n)\to 0$ while $n\phi(n)^2 = n^{1/5} \to \infty$ and $n\phi(n)^3 = n^{-1/5} \to 0$. Then: \begin{align} e^{-n} n^{n+1} \int^\infty_0 e^{-n(s-\ln(s+1))} ds \sim e^{-n} n^{n+1} \int^{\phi(n)}_0 e^{-n(s-\ln(s+1))} ds, \end{align} since the main contribution of the integral comes from $s$ near $0$. Now expand $h(s)$ in a Taylor polynomial around $s=0$: $h(s) = \frac{s^2}{2!} + O(s^3)$.

So: \begin{align} \Gamma(n+1,n) &\sim e^{-n}n^{n+1} \int^{\phi(n)}_0 e^{-ns^2/2 - nO(s^3)} ds \end{align} Since $n\phi(n)^3\to 0$, the $O(s^3)$ term can be neglected on $[0,\phi(n)]$; and since $n\phi(n)^2\to\infty$, the Gaussian integral can be extended back to $\infty$: \begin{align} \Gamma(n+1,n) &\sim e^{-n}n^{n+1} \int^{\phi(n)}_0 e^{-ns^2/2 } ds \\ &\sim e^{-n}n^{n+1} \int^{\infty}_0 e^{-ns^2/2 } ds \\ &= e^{-n}n^{n+1} \frac{1}{2}\ \sqrt[]{\frac{2\pi}{n}} \\ & = \frac{1}{2} \sqrt[]{2\pi n } \, n^n e^{-n} \end{align} This gives the asymptotics of $\Gamma(n+1,n)$ and hence (with $n$ replaced by $nx$) of $\Gamma(nx+1,nx)$.

For $(nx)!$ we use the well-known Stirling asymptotic of the factorial, as $n\to \infty$: \begin{align} (nx)! \sim \sqrt[]{2\pi nx } (nx)^{nx} e^{-nx} \end{align} We did all this to evaluate $\lim_n g_n(x)$, so let's do it: \begin{align} \lim_{n\to \infty} g_n(x) = \lim_{n\to \infty} \frac{\Gamma(nx+1,nx)}{(nx)!} = \lim_{n\to \infty} \frac{\frac{1}{2} \sqrt[]{2\pi nx } (nx)^{nx} e^{-nx}}{\sqrt[]{2\pi nx } (nx)^{nx} e^{-nx}} = \frac{1}{2} \end{align} This is exactly what the CLT also gave us, and it proves (1) for $x\in \mathbb{Z}^+$.
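One can also test the asymptotic $\Gamma(n+1,n)\sim \frac{1}{2}\sqrt{2\pi n}\,n^n e^{-n}$ numerically. The sketch below (function name and step counts are my own choices) evaluates the incomplete-gamma integral with a trapezoid rule, working in log-space relative to the integrand's peak:

```python
import math

def log_upper_incomplete_gamma(n: int, steps: int = 200000) -> float:
    """log Gamma(n+1, n) = log of integral_n^inf t^n e^{-t} dt, via the
    trapezoid rule; the integrand is negligible beyond n + 20*sqrt(n)."""
    a, b = float(n), n + 20.0 * math.sqrt(n)
    h = (b - a) / steps
    peak = n * math.log(n) - n  # log of the integrand's value at t = n
    total = 0.0
    for i in range(steps + 1):
        t = a + i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * math.exp(n * math.log(t) - t - peak)
    return peak + math.log(total * h)

n = 500
lhs = log_upper_incomplete_gamma(n)
rhs = math.log(0.5) + 0.5 * math.log(2 * math.pi * n) + n * math.log(n) - n
print(lhs, rhs)  # the two logs agree up to a small O(n^{-1/2}) correction
```

Dividing out $\log (n!)$ (via `math.lgamma(n + 1)`) recovers $g_n$ itself, which is close to $\frac{1}{2}$ as claimed.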

Shashi
  • 8,738
  • Sorry, but I don't know how to prove things by software; I have fallen into that trap many times. Software is made by humans, and it is not 100% perfect. Are you sure Mathematica or ... was programmed to handle such a situation? – Guy Fsone Nov 11 '17 at 16:20
  • See for instance here, in a comment where somebody said he used software to contradict the question: https://math.stackexchange.com/questions/2507628/what-is-the-exact-value-of-int-infty-0-frac-sin2-xx5-2-dx – Guy Fsone Nov 11 '17 at 16:25
  • I am not an expert in probability, so I don't have sufficient background to say whether your proof is correct or not. – Guy Fsone Nov 11 '17 at 16:28
  • By the way, this problem is from the book on Bernstein functions. – Guy Fsone Nov 11 '17 at 16:34
  • I saw your counterexample, but your $t$ is negative, which does not count here, since $x$ and $t$ are positive real numbers. – Guy Fsone Nov 12 '17 at 09:30
  • Oops. I have made some mistakes. Now it is fixed: $t>0$ and still the claim is false. – Shashi Nov 12 '17 at 09:35