11

Prove that $$\sum_{n=1}^{\infty} \frac{x^n \log(n!)}{n!} \sim x \log(x) e^x \,\,\,\text{as}\,\,\, x \to \infty$$ and $$\sum_{n=1}^{\infty} \frac{(-x)^n \log(n!)}{n!} \to 0 \,\,\,\text{as}\,\,\, x \to \infty$$ This question is related to my previous question. My heuristic approach is that the sum's major contribution comes from the $n\approx x$ term, so

$$\sum_{n=1}^{\infty} \frac{x^n \log(n!)}{n!} \sim \frac{x^x\log(x!)}{x!},$$ but applying Stirling's formula twice leads to $$\sum_{n=1}^{\infty} \frac{x^n \log(n!)}{n!} \sim \frac{\sqrt{x}\log(x)}{\sqrt{2\pi}}e^x.$$ This reasoning is flawed, and I believe I should not ignore all the other terms (which I believe account for the missing $\sqrt{2\pi x}$ factor). What kind of approach might give the desired asymptotic behavior? Any kind of hint is welcome.
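As a sanity check (added here, not needed for a proof), the claimed growth can be observed numerically. The script below folds the factor $e^{-x}$ into each term via `lgamma` to avoid overflow and compares the poissonized sum with $x\log x$; the ratio tends to $1$, though slowly, since the relative error is of order $1/\log x$.

```python
# Numerical sanity check (not part of the original question):
# compare S(x) = sum_{n>=1} x^n log(n!)/n! with x*log(x)*e^x.
# The factor e^{-x} is folded into each term via lgamma to avoid overflow.
import math

def poissonized_sum(x, n_max=None):
    """Return e^{-x} * sum_{n>=1} x^n log(n!)/n!, truncated deep in the tail."""
    if n_max is None:
        n_max = int(x + 40 * math.sqrt(x) + 50)
    total = 0.0
    for n in range(1, n_max + 1):
        log_nfact = math.lgamma(n + 1)                 # log(n!)
        log_weight = n * math.log(x) - log_nfact - x   # log(x^n e^{-x} / n!)
        total += math.exp(log_weight) * log_nfact
    return total

for x in (10.0, 100.0, 1000.0, 10000.0):
    ratio = poissonized_sum(x) / (x * math.log(x))
    print(f"x = {x:7.0f}   e^(-x) S(x) / (x log x) = {ratio:.4f}")
# The ratio creeps toward 1; the relative error is of order 1/log(x).
```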

  • Have you tried the Euler–Maclaurin formula? – tired Dec 02 '15 at 17:46
  • I proved that $\sum_{n<x} \frac{x^n \log(n!)}{n!} \ll x\log(x)e^x$, so it remains to prove that $\sum_{n\geq x} \frac{x^n \log(n!)}{n!} \sim x\log(x)e^x$. In that case, if I approximate the sum by an integral and can expand the integrand about $n=x$ with $e^{-(x-n)^2/(4x)}$, then I'm done. But I cannot justify the steps. – generic properties Dec 03 '15 at 03:53

2 Answers

9

I will try to show the first asymptotics.


We begin with the following quantitative form of Stirling's formula:

Fact. For all $n \geq 0$, $$ \log (n!) = (n + \tfrac{1}{2})\log(n+1) - n + \mathcal{O}(1). \tag{1} $$

Now let $N_t$ be a Poisson random variable of rate $t$. Then

\begin{align*} \smash[b]{\sum_{n=0}^{\infty} \frac{t^n \log (n!)}{n!}e^{-t}} &= \Bbb{E}[\log (N_t !)] \\ &= \Bbb{E}[N_t \log (N_t + 1) + \tfrac{1}{2}\log (N_t + 1) - N_t + \mathcal{O}(1)] \\ &= \Bbb{E}[N_t \log (N_t + 1)] + \tfrac{1}{2}\Bbb{E}[\log(N_t + 1)] - t + \mathcal{O}(1). \tag{2} \end{align*}

Now we claim the following:

Claim. For any $a \geq 0$ we have $$ t\log(t+a) \leq \Bbb{E}[N_t \log(N_t + a)] = t \Bbb{E}[\log(N_t + a + 1)] \leq t \log(t+ a + 1). \tag{3} $$ Here, we use the convention that $0 \log 0 = 0$.

Assuming this claim, we easily find that

$$ \Bbb{E}[N_t \log(N_t + 1)] = t \log t + \mathcal{O}(1) \quad \text{and} \quad \Bbb{E}[\log(N_t + 1)] = \log t + \mathcal{O}(t^{-1}). $$

Plugging this into $\text{(2)}$ gives

$$ \sum_{n=0}^{\infty} \frac{t^n \log (n!)}{n!}e^{-t} = (t + \tfrac{1}{2})\log t - t + \mathcal{O}(1) = \log (t!) + \mathcal{O}(1). $$

Since $\log(t!) \sim t \log t$, dividing both sides by $t \log t$ yields the first asymptotics.
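As a side remark (a numerical illustration added here, not part of the argument), the $\mathcal{O}(1)$ term above can be observed directly: the difference $\Bbb{E}[\log(N_t!)] - \log(t!)$ appears to stay bounded as $t$ grows, in line with the comments below.

```python
# Numerical illustration (my addition): E[log(N_t!)] - log(t!) stays bounded,
# consistent with the O(1) error term derived above.
import math

def expected_log_factorial(t, n_max=None):
    """E[log(N_t!)] for N_t ~ Poisson(t), truncated deep in the tail."""
    if n_max is None:
        n_max = int(t + 40 * math.sqrt(t) + 50)
    total = 0.0
    for n in range(1, n_max + 1):
        log_nfact = math.lgamma(n + 1)
        total += math.exp(n * math.log(t) - log_nfact - t) * log_nfact
    return total

for t in (10.0, 100.0, 1000.0, 10000.0):
    diff = expected_log_factorial(t) - math.lgamma(t + 1)   # lgamma(t+1) = log(t!)
    print(f"t = {t:7.0f}   E[log(N_t!)] - log(t!) = {diff:.4f}")
```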


Proof of Claim. The last inequality of $\text{(3)}$ is easy to prove. Since the function $x \mapsto \log(x+a+1)$ is concave, by Jensen's inequality we have

$$ \Bbb{E}[\log(N_t + a + 1)] \leq \log(\Bbb{E} N_t + a + 1) = \log(t+a+1). $$

In order to show the first inequality of $\text{(3)}$, notice that $x \mapsto x\log(x+a)$ is convex (its second derivative is $(2a+x)/(a+x)^2 > 0$). Thus by Jensen's inequality again

$$ \Bbb{E}[N_t \log (N_t + a)] \geq (\Bbb{E}N_t) \log (\Bbb{E}N_t + a) = t \log(t+a). $$

Finally, the middle equality of $\text{(3)}$ is given by

\begin{align*} \Bbb{E}[N_t \log (N_t + a)] &= \sum_{n=1}^{\infty} n \log(n+a) \cdot \frac{t^n}{n!}e^{-t} \\ &= \sum_{n=0}^{\infty} \log(n+a+1) \cdot \frac{t^{n+1}}{n!}e^{-t} = t \Bbb{E}[\log (N_t + a + 1)]. \end{align*}
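The index shift in the last display is the standard Poisson identity $\Bbb{E}[N_t f(N_t)] = t\,\Bbb{E}[f(N_t+1)]$. As a quick sanity check (added here, with arbitrarily chosen $t$ and $a$), it can be verified numerically for $f(n) = \log(n+a)$:

```python
# Minimal numerical check (my addition) of E[N_t f(N_t)] = t E[f(N_t + 1)]
# for N_t ~ Poisson(t), with f(n) = log(n + a); t and a are arbitrary choices.
import math

def check_shift_identity(t=7.5, a=2.0, n_max=300):
    def pmf(n):                       # P(N_t = n), via lgamma to avoid overflow
        return math.exp(n * math.log(t) - math.lgamma(n + 1) - t)
    lhs = sum(n * math.log(n + a) * pmf(n) for n in range(1, n_max))
    rhs = t * sum(math.log(n + 1 + a) * pmf(n) for n in range(n_max))
    return lhs, rhs

print(check_shift_identity())   # the two values agree up to truncation error
```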

Sangchul Lee
  • Looks good, but I fail to see why you need to introduce that the sum is a $\ldots$ of a Poisson process? You never seem to use that fact anywhere. – Winther Dec 09 '15 at 00:16
  • @Winther, You do not need a Poisson process. In fact, my first idea was to relate this to a Poisson process and then apply the law of large numbers. Then I discarded this approach without changing my notation. – Sangchul Lee Dec 09 '15 at 08:13
  • Thanks a lot for your answer. Do you have any suggestions for making the bound in the claim tighter? What could be done if I want to make the error term precise to $O(x^{-1})$? – generic properties Dec 10 '15 at 16:51
  • @dielectric, Numerical simulation suggests that we begin to see some non-trivial behavior for the difference $$ \sum_{n=1}^{\infty} \frac{x^n \log (n!)}{n!} e^{-x} - \log (x!). $$ To be precise, it seems to converge to some non-zero constant as $x \to \infty$. At this point, I haven't thought about this seriously though. – Sangchul Lee Dec 11 '15 at 11:28
  • @Sangchul Lee, In fact, Ramanujan already obtained the asymptotic form of $\sum_{n=1}^{\infty} \frac{x^n \log (n!)}{n!} e^{-x}$ precise to $O(x^{-2})$ (see Entry 10 of Chapter 3 of his second notebook, by methods I don't actually understand well). I wanted to generalize it to variants such as $\sum_{n=1}^{\infty} \frac{x^{2n} \log ((2n)!)}{(2n)!} e^{-x}$, and I found your approach helps to the precision of $O(1)$, but it is still hard to show rigorously that the latter series is almost half of the former (although I'm sure about that from numerical methods). – generic properties Dec 11 '15 at 11:34
  • @dielectric, I think it is in his first notebook. Anyway, I just skimmed over the proof, and it seems that the main idea is to utilize the concentration of the Poisson distribution to obtain the asymptotic expansion $$ \Bbb{E}[\varphi(N_t)] = \sum_{k=0}^{\infty} \frac{\varphi^{(k)}(t)}{k!} \Bbb{E}[(N_t - t)^k]. $$ I guess that the proof can be modified to establish an analogous result for your case. – Sangchul Lee Dec 11 '15 at 22:12
3

Your idea is right that the terms near $n=x$ matter the most. In particular, the terms with $n=x+O(\sqrt{x})$ dominate the sum.

The following is not a complete answer, but I believe the holes can be filled in.

Suppose that $N=x + \alpha \sqrt{x},$ where $\alpha$ is fixed. Then, by Stirling's formula \begin{align*} \frac{\ln(N!)}{N!} x^N &= \left(1+O\left(\frac{1}{\ln x}\right)\right) \frac{(x+\alpha \sqrt{x}) \ln x}{\sqrt{2 \pi x} (x+\alpha \sqrt{x})^{x+\alpha \sqrt{x}}}e^{x+\alpha \sqrt{x}} x^{x+\alpha \sqrt{x}} \\ &=\left(1+O\left(\frac{1}{\ln x}\right)\right) x (\ln x) e^x \left[e^{\alpha \sqrt{x}} \left( \frac{x}{x+\alpha \sqrt{x}}\right)^{x+\alpha \sqrt{x}} \frac{1}{\sqrt{2\pi x}} \right]. \end{align*}

Now, note that \begin{align*} \left( \frac{x}{x+\alpha \sqrt{x}}\right)^{x+\alpha \sqrt{x}}&=\exp \left( -(x+\alpha \sqrt{x}) \cdot \ln\left(1+\frac{\alpha}{\sqrt{x}} \right)\right) \\&=\exp\left( -(x+\alpha \sqrt{x}) \left( \frac{\alpha}{\sqrt{x}} - \frac{\alpha^2}{2x} + O(x^{-3/2}) \right) \right) \\&= \exp \left( -\alpha \sqrt{x} - \alpha^2/2 + O(x^{-1/2}) \right). \end{align*}

Therefore $$ \frac{\ln N!}{N!}x^N = \left(1+O\left(\frac{1}{\ln x}\right)\right) [x \ln(x) e^x] \frac{e^{-\alpha^2/2}}{\sqrt{2\pi}} \frac{1}{\sqrt{x}} $$

Now the sum of these values $\frac{e^{-\alpha^2/2}}{\sqrt{2\pi}} \frac{1}{\sqrt{x}}$ over $N \in [x-C \sqrt{x}, x+C \sqrt{x}]$ (with $C$ fixed w.r.t. $x$) is a Riemann sum for the integral $\int_{-C}^C \frac{e^{-t^2/2}}{\sqrt{2\pi}}\,dt$. This integral can be made arbitrarily close to $1$ by taking $C$ large enough.

It remains to be shown that for any $\epsilon >0$ (fixed w.r.t. $x$), $$ \sum_{n \notin [x-C \sqrt{x},\, x+C \sqrt{x}]} \frac{\ln(n!)}{n!} x^n < \epsilon \, x \ (\ln x) \ e^x $$ for $C$ large enough.
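To complement this, here is a rough numerical check (added here, not part of the answer) that the window $n \in [x - C\sqrt{x},\, x + C\sqrt{x}]$ carries a fraction of the whole sum that approaches $1$ as $C$ grows, which is exactly what the missing tail estimate would guarantee.

```python
# Rough numerical check (my addition): fraction of the sum carried by the window
# n in [x - C*sqrt(x), x + C*sqrt(x)]; it approaches 1 as C grows, matching the
# Riemann-sum/Gaussian picture above.
import math

def windowed_fraction(x, C):
    n_hi = int(x + 40 * math.sqrt(x) + 50)
    lo, hi = x - C * math.sqrt(x), x + C * math.sqrt(x)
    full = window = 0.0
    for n in range(1, n_hi + 1):
        log_nfact = math.lgamma(n + 1)
        term = math.exp(n * math.log(x) - log_nfact - x) * log_nfact  # e^{-x} folded in
        full += term
        if lo <= n <= hi:
            window += term
    return window / full

for C in (1, 2, 3, 5):
    print(f"C = {C}   window fraction = {windowed_fraction(10000.0, C):.6f}")
```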

D Poole