
More precisely,

$$\sum_{k=1}^n \binom{n}{k}(-1)^k \log k = \log \log n + \gamma +\frac{\gamma}{\log n} -\frac{\pi^2 + 6 \gamma^2}{12 \log^2 n} +O\left(\frac1{\log ^3 n}\right).$$

This is Theorem 4 from Flajolet and Sedgewick (1995), where it is obtained by Hankel contour integration. It is related to the analysis of quadtrees and digital search trees.
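
As a quick numerical sanity check of the expansion, here is a small sketch in Python with mpmath (my own choice of tools, not anything from the paper); the working precision has to grow with $n$ because the alternating sum cancels terms as large as $\binom{n}{n/2}$ down to something of size $\log\log n$:

```python
# Sanity check of the stated expansion; mpmath is assumed, and the precision
# is scaled with n because the alternating sum cancels terms of size ~2^n
# down to something of size log log n.
from math import comb
from mpmath import mp, mpf, log, pi, euler

def exact_sum(n):
    mp.dps = int(0.35 * n) + 25          # enough digits to survive the cancellation
    return sum(mpf(comb(n, k)) * (-1)**k * log(k) for k in range(1, n + 1))

def expansion(n):
    L = log(n)
    return log(L) + euler + euler / L - (pi**2 + 6 * euler**2) / (12 * L**2)

for n in (50, 200, 1000):
    s = exact_sum(n)
    mp.dps = 25
    print(n, s, expansion(n))
```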

I know from 'Concrete Mathematics' that similar-looking sums can be handled using the discrete analogue of the derivative, $\Delta f(x)=f(x+1)-f(x)$, whose $n$th iterate is

$$\Delta^n f(x)=\sum_{k=0}^{n} \binom{n}{k} (-1)^{n-k} f(x+k), $$

and I know how to use it for simple functions such as $f(x)=\dfrac1{x}$, but I have no idea how to handle something like the sum above.
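
For instance, here is a small sketch (Python with exact rational arithmetic, my own addition) checking the $n$th difference of $f(x)=1/x$ against the closed form $\frac{(-1)^n\, n!}{x(x+1)\cdots(x+n)}$ that comes from negative falling powers:

```python
# Checks  Delta^n (1/x) = sum_k C(n,k) (-1)^(n-k) / (x+k) = (-1)^n n! / (x(x+1)...(x+n)).
from fractions import Fraction
from math import comb, factorial

def nth_difference(n, x):
    # right-hand side of the displayed formula, applied to f(x) = 1/x
    return sum((-1) ** (n - k) * comb(n, k) * Fraction(1, x + k) for k in range(n + 1))

def closed_form(n, x):
    denom = 1
    for j in range(n + 1):
        denom *= x + j
    return Fraction((-1) ** n * factorial(n), denom)

for n in range(1, 9):
    assert nth_difference(n, 3) == closed_form(n, 3)   # exact equality at x = 3
print("ok")
```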

sigma.z.1980
  • The $\gamma$ and $\pi^2/12$ suggest to me it involves primes and/or harmonic numbers & $\zeta(2)$ somehow. – anon Sep 16 '11 at 02:29
  • Are you asking for a proof at the level of the Concrete Mathematics book? Maybe there is one, but then one needs to ask why Flajolet and Sedgewick chose to use Hankel integration. So maybe there isn't one. – Gerry Myerson Sep 16 '11 at 05:36
  • For the record, $\sum_{k=1}^n \binom{n}{k} (-1)^k H_k = - \frac{1}{n}$. So approximating $\log k$ by the $k$th harmonic number doesn't give you the same asymptotically dominant term. – Mike Spivey Sep 16 '11 at 05:38
  • @Gerry: do you think it's better to ask this question at MathOverflow? – sigma.z.1980 Sep 16 '11 at 06:33
  • Patience. Having asked it here, I think it's best to see what you get here over the next few days. Then if you're not happy, go there (but leave a note here saying you've done that, and include a link back here in what you post there). – Gerry Myerson Sep 16 '11 at 07:42
  • I tried searching in Flajolet & Sedgewick, but I did not find the theorem. Can you give a more specific reference? (Somehow "Theorem 4" doesn't sound right :). Are you missing the chapter number?) I am using this as my reference: http://algo.inria.fr/flajolet/Publications/book061023.pdf. – Srivatsan Sep 16 '11 at 12:31
  • @Srivatsan: Given the 1995 citation year, the cited theorem would come from this paper. – J. M. ain't a mathematician Sep 16 '11 at 15:21
  • Here's an idea that hopefully someone with better manipulational skill than me can finish: if you replace the logarithm with the Frullani integral, you obtain the integral $$\int_0^\infty \frac{1-\exp(-t)-(1-\exp(-t))^n}{t}\,\mathrm dt.$$ That might be a bit more tractable manipulationally... – J. M. ain't a mathematician Sep 16 '11 at 15:41
  • If $f(x)=1/x$ is simple to use, here's the expression :) $$ \sum_{k=1}^n \binom{n}{k}(-1)^k \log k= \int_0^1 \sum _{k=1}^{n-1}\binom{n-1}{k} \frac{(-1)^{k-1}}{k+x} \, dx= \int_0^1 \left(\frac{1}{x}-\frac{\Gamma (n) \Gamma (x)}{\Gamma (n+x)}\right) \, dx. $$ – Andrew Sep 16 '11 at 17:22
  • @Andrew: 1) what is the expansion of $\log k$ in the 1st step? 2) I only see two diverging integrals. Maybe you can suggest some approximation? – sigma.z.1980 Sep 16 '11 at 22:22
  • @sigma.z.1980 $$ \sum_{k=1}^n \binom{n}{k}(-1)^k \log k= \sum_{k=2}^n \binom{n}{k}(-1)^k\int_0^1\sum_{m=1}^{k-1}\frac1{x+m}\, dx= \int_0^1 \sum _{k=1}^{n-1}\binom{n-1}{k} \frac{(-1)^{k-1}}{k+x} \, dx. $$ – Andrew Sep 17 '11 at 14:11
  • Your expression can also be written as $\mathbb{E}\left[\frac{n!}{(1+X)^n}\right]$, where $X$ is the sum of $n$ independent $U([0,1])$ random variables. – George Lowther Sep 17 '11 at 16:46

2 Answers

20

Thanks go to Andrew, J. M., and David Speyer! The following solution leans heavily on what these three have already posted.

(In the interest of having a complete solution I've added the argument that gets from the OP's sum to Andrew's reformulation of it.)


Part 1: Getting to Andrew's gamma function formula.

Since $$\int_0^1\sum_{m=1}^{k-1}\frac1{x+m}\, dx = \sum_{m=1}^{k-1}(\log(m+1) - \log m) = \log k,$$ we can rewrite the original sum as $$\sum_{k=1}^n \binom{n}{k}(-1)^k \log k= \sum_{k=2}^n \binom{n}{k}(-1)^k\int_0^1\sum_{m=1}^{k-1}\frac1{x+m}\, dx = \int_0^1 \sum_{m=1}^{n-1} \frac1{x+m} \sum_{k=m+1}^n \binom{n}{k}(-1)^k\, dx.$$

Since partial alternating row sums of binomial coefficients are easy to evaluate (see, for instance, Concrete Mathematics, Identity 5.16), this becomes (after switching the summation index back to $k$) $$\int_0^1 \sum_{m=1}^{n-1} \frac1{x+m} (-1)^{m-1} \binom{n-1}{m}\, dx = \int_0^1 \sum _{k=1}^{n-1}\binom{n-1}{k} \frac{(-1)^{k-1}}{k+x} \, dx$$ $$ = \int_0^1 \left(\frac{1}{x} - \sum _{k=0}^{n-1}\binom{n-1}{k} \frac{(-1)^k}{k+x}\right) \, dx.$$

The remaining binomial sum is actually the partial fractions decomposition of $\frac{(n-1)!}{x(x+1)\cdots (x+n-1)}$. (This is Identity 5.41 in Concrete Mathematics. From another perspective - also discussed in Concrete Mathematics - the binomial sum is $(-1)^{n-1}$ times the $(n-1)$st difference of $\frac{1}{x} = (x-1)^{\underline{-1}}$. Applying the finite difference rule $\Delta x^{\underline{m}} = m x^{\underline{m-1}}$ successively $n-1$ times produces the factor $(-1)(-2)\cdots(-(n-1)) = (-1)^{n-1}(n-1)!$ together with $(x-1)^{\underline{-n}} = \frac{1}{x(x+1)\cdots(x+n-1)}$, and the two signs cancel, giving $\frac{(n-1)!}{x(x+1)\cdots (x+n-1)}$.)

Thus our original sum is equivalent to $$\int_0^1 \left(\frac{1}{x} - \frac{(n-1)!}{x(x+1)\cdots (x+n-1)}\right) dx = \int_0^1 \left(\frac{1}{x} - \frac{\Gamma(n) \Gamma(x)}{\Gamma(n+x)}\right) dx,$$ which is the formula Andrew mentions in the comments.
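
(A quick numerical cross-check of Part 1, assuming Python with mpmath; this is my own addition, not part of the argument. The quadrature loses a little accuracy to the $1/x$ cancellation near $x=0$, but nowhere near enough to matter for a sanity check.)

```python
# Compare the alternating sum with  int_0^1 (1/x - Gamma(n)Gamma(x)/Gamma(n+x)) dx.
from math import comb
from mpmath import mp, mpf, log, gamma, quad

def exact_sum(n):
    mp.dps = int(0.35 * n) + 25          # absorb the cancellation in the alternating sum
    return sum(mpf(comb(n, k)) * (-1)**k * log(k) for k in range(1, n + 1))

def gamma_integral(n):
    mp.dps = 30
    # both pieces blow up like 1/x at the left endpoint, but their difference is finite
    return quad(lambda x: 1/x - gamma(n) * gamma(x) / gamma(n + x), [0, 1])

for n in (5, 20, 100):
    print(n, exact_sum(n), gamma_integral(n))
```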


Part 2: Rewriting the expression.

Now, rewrite like so: $$\int_0^1 \left( \frac{1}{x} - \frac{\Gamma(n) \Gamma(x)}{\Gamma(n+x)} \right) dx = \int_0^1 \left( \frac{1}{x}\left(1- \frac{\Gamma(n) \Gamma(x+1)}{\Gamma(n+x)} \right)\right) dx.$$

For $0 < x < 1$, $$\Gamma(x+1) = 1 - \gamma x + \frac{\zeta(2) + \gamma^2}{2}x^2 + O(x^3).$$ (This is the Maclaurin series for $\Gamma(x+1)$; see Expression 8.321 in Gradshteyn and Ryzhik. The only reason I know this is because I had to track it down for a paper I wrote a couple of years ago.) Also, $$\frac{\Gamma(n)}{\Gamma(n+x)} = n^{-x}\left(1 + O\left(\frac{1}{n}\right)\right).$$ (See the DLMF.)
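
(The following is a small numerical illustration of these two facts, assuming Python with mpmath; it is not part of the references just cited.)

```python
# Gamma(x+1) vs. its Maclaurin series through x^2, and Gamma(n)/Gamma(n+x) vs. n^{-x}.
from mpmath import mp, mpf, gamma, euler, zeta, power

mp.dps = 25
x = mpf('0.1')
print(gamma(x + 1), 1 - euler*x + (zeta(2) + euler**2)/2 * x**2)  # differ only at O(x^3)

n = 1000
print(gamma(n) / gamma(n + x), power(n, -x))   # differ by a 1 + O(1/n) factor
```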

Putting all this together means we want the asymptotic value of \begin{equation} \int_0^1 \frac{1-n^{-x}\left(1+ O\left(\frac{1}{n}\right)\right)\left(1 - \gamma x + \frac{\zeta(2) + \gamma^2}{2}x^2 + O(x^3)\right)}{x} dx. \tag{1} \end{equation}


Part 3: Obtaining the dominant terms.

Following David Speyer and J.M., we'll first extract what turns out to be the dominant part of (1): $$\int_0^1 \frac{1-n^{-x}}{x}dx = \int_0^1 \frac{1-e^{-x \log n}}{x}dx = \int_0^{\log n} \frac{1-e^{-u}}{u} du = \text{Ein}(\log n),$$ where $\text{Ein}(x)$ is the complementary exponential integral. Now, $\text{Ein}(x) = E_1(x) + \log x + \gamma$, where $E_1(x)$ is the usual exponential integral (again, see the DLMF), and $E_1(x) < e^{-x} \log (1 + 1/x)$ (DLMF once again), so putting all of this together we have $$\int_0^1 \frac{(1-n^{-x})}{x}dx = \log \log n + \gamma + O\left(\frac{1}{n}\right).$$
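
(Here is a short numerical confirmation of this bookkeeping, assuming mpmath: $\text{Ein}(\log n)$ is computed directly as the integral, and the leftover $E_1(\log n)$ is visibly of size $O(1/n)$.)

```python
# int_0^1 (1 - n^{-x})/x dx = Ein(log n) = log log n + gamma + E_1(log n).
from mpmath import mp, quad, exp, log, euler, e1

mp.dps = 30
for n in (10, 1000, 10**6):
    L = log(n)
    ein = quad(lambda x: (1 - exp(-x * L)) / x, [0, 1])
    print(n, ein, log(L) + euler, e1(L))   # last column is the O(1/n)-size error term
```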


Part 4: Obtaining the remaining terms.

Now we consider the rest of (1). This is $$\left(1+ O\left(\frac{1}{n}\right)\right)\int_0^1 n^{-x}\left(\gamma - \frac{\zeta(2) + \gamma^2}{2}x + O(x^2)\right) dx$$ $$=\left(1+ O\left(\frac{1}{n}\right)\right)\left(\gamma \left(\frac{n-1}{n \log n}\right) - \frac{\zeta(2) + \gamma^2}{2}\left(\frac{n - \log n -1}{n (\log n)^2}\right) + O\left(\frac{1}{(\log n)^3}\right)\right)$$ $$=\frac{\gamma }{\log n} - \frac{\zeta(2) + \gamma^2}{2(\log n)^2} + O\left(\frac{1}{(\log n)^3}\right),$$ which is the rest of the expression requested by the OP: since $\zeta(2) = \pi^2/6$, the coefficient $\frac{\zeta(2)+\gamma^2}{2}$ is exactly the $\frac{\pi^2 + 6\gamma^2}{12}$ appearing in the stated expansion.
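
(For completeness, a small sketch, again assuming mpmath, checking the two closed-form integrals used implicitly above.)

```python
# int_0^1 n^{-x} dx = (n-1)/(n log n)   and   int_0^1 x n^{-x} dx = (n - log n - 1)/(n log^2 n).
from mpmath import mp, quad, exp, log

mp.dps = 30
for n in (10, 1000):
    L = log(n)
    i0 = quad(lambda x: exp(-x * L), [0, 1])
    i1 = quad(lambda x: x * exp(-x * L), [0, 1])
    print(n, i0 - (n - 1)/(n*L), i1 - (n - L - 1)/(n*L**2))   # both differences should be ~ 0
```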

Mike Spivey
  • Looks good! One small correction: Note that $\int_0^1 (1-n^{-x})/x\, dx$ is not equal to $\log \log n + \gamma$; rather, it's equal to that plus an error coming from changing the upper bound in JM's integral from $\log n$ to $\infty$. So that's one more error term to keep track of. – David E Speyer Sep 17 '11 at 22:48
  • @David: I added the error term and cleaned up that part of the argument. Thanks again for catching my omission. – Mike Spivey Sep 18 '11 at 03:24
  • I sure wish I could upvote this more than once. The series for the gamma function is new to me (and I'm surprised the DLMF has no mention of it); I really should be digging deeper into G&R more. Thanks for this answer! – J. M. ain't a mathematician Sep 19 '11 at 16:14
  • @J.M.: Thanks! :) – Mike Spivey Sep 19 '11 at 17:59
  • great answer, thanks to everyone :) – sigma.z.1980 Sep 19 '11 at 22:47
  • @J.M.: You know, the series for the gamma function was a lot of what got me to my answer. As soon as I saw the first two subconstant terms in the question I thought that series was involved somehow. With that in mind, I floundered around in the vicinity of the correct approach for a while, but it wasn't until I saw what David Speyer and you had done that it fell into place. – Mike Spivey Sep 22 '11 at 04:38
  • If I may admit something: when I upvoted this, I hadn't finished reading your answer; I did so once I saw you use the gamma series, since I saw at once it was the "crux move" to resolving this, if you will... :) – J. M. ain't a mathematician Sep 22 '11 at 07:18
  • @Mike: is there a different explanation of the asymptotic expression of $\Gamma(x+1)$ for $0<x<1$ that uses expansion of Gamma function from Gradshteyn and Ryzhik? – sigma.z.1980 Oct 01 '11 at 14:03
  • @Mike: I mean, where does this very expansion in G&R come from? Some complex series? – sigma.z.1980 Oct 02 '11 at 02:04
  • @sigma.z.1980: That's a good question. Honestly, I don't know. G&R gives the reference as p. 40 of N. Nielsen, Handbuch der Theorie der Gammafunktion, which was published in 1906 and appears to be in German. There's also a superscript on the formula that might refer to A. Erdélyi, et al, Higher Transcendental Functions, Vol. I-III (listed in the supplemental references at the end of G&R). I would try one of those. – Mike Spivey Oct 02 '11 at 02:56
  • @Mike: Wolfram Alpha gives the same series expansion as $x \to 0$ : http://www.wolframalpha.com/input/?i=Gamma%28z%2B1%29 and http://functions.wolfram.com/GammaBetaErf/Gamma/06/01/04/01/ but does not quite explain how it was obtained. – sigma.z.1980 Oct 02 '11 at 14:22
  • This is the Taylor series of $\Gamma(1+z)$, so we just have to understand how to evaluate $\Gamma'(1)$ and $\Gamma''(1)$. – David E Speyer Oct 02 '11 at 14:40
  • $\Gamma(x)$ is continuous for $x>0$. Should this be a problem? – sigma.z.1980 Oct 02 '11 at 15:18
  • @David: Thanks, and a sheepish facepalm from me. :) For example, the explanation of $\Gamma'(1)$ is in MathWorld's entry on the gamma function. $\Gamma''(1)$ can be obtained from the formulas there as well. – Mike Spivey Oct 02 '11 at 16:33
  • @sigma.z.1980: I don't think so. Why might it be? – Mike Spivey Oct 02 '11 at 16:34
8

Here is a non-rigorous approach which gives the right first two terms. I'm writing it here in the hope that someone will follow up and turn it into an actual proof. This post is CW, in case someone wants to add to what I've come up with.

I start with Andrew's formula: $$\sum_{k=1}^n (-1)^k \binom{n}{k} \log k = \int_0^1 \left( \frac{1}{x} - \frac{\Gamma(n) \Gamma(x)}{\Gamma(n+x)} \right) dx = \int_0^1 \left( \frac{1}{x} - \frac{(n-1)!}{x(x+1)(x+2) \cdots (x+n-1)} \right) dx$$ $$=\int_0^1 \frac{dx}{x} \left( 1- \frac{1}{(1+x)(1+x/2) \cdots (1+x/(n-1))} \right)$$

Let's look at that denominator. $$\prod_{k=1}^{n-1} (1+x/k) = \exp \left( \sum_{k=1}^{n-1} \log \left(1+\frac{x}{k}\right) \right) \approx \exp \left( \sum_{k=1}^{n-1} \frac{x}{k} \right) \approx e^{x \log n}.$$
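
(A rough numerical look at this approximation, assuming Python with mpmath and my own choice of test values: the product, $\exp(x H_{n-1})$, and $n^x$ all have the same exponential order, though they differ by factors that stay bounded in $n$, which is consistent with trusting the heuristic only for the leading terms.)

```python
# prod_{k=1}^{n-1} (1 + x/k)  vs.  exp(x * H_{n-1})  vs.  exp(x * log n), at a sample x.
from mpmath import mp, mpf, exp, log, fprod

mp.dps = 20
x = mpf('0.5')
for n in (10, 1000, 10**5):
    prod = fprod(1 + x/k for k in range(1, n))
    H = sum(mpf(1)/k for k in range(1, n))
    print(n, prod, exp(x * H), exp(x * log(n)))
```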

I could make these estimates more precise, but I'm not going to bother because I don't have a rigorous proof. So, roughly, we want to compute $$\int_0^1 \frac{1-e^{-x \log n}}{x} dx = \int_0^{\log n} \frac{1-e^{-u}}{u} du$$ where we have substituted $u = x \log n$. Let $[u>1]$ be $1$ for $u>1$ and $0$ for $u<1$. Then we can write this integral as $$\int_0^{\log n} \frac{[u>1] du}{u} + \int_0^{\log n} \frac{1-[u>1]-e^{-u}}{u} du = \log \log n + \int_0^{\log n} \frac{1-[u>1]-e^{-u}}{u} du$$

One can check that the integral $\int_0^{\infty} (1-[u>1]-e^{-u})/u \ du$ converges, and that its value is in fact Euler's constant $\gamma$, as JM points out below. So, if we make our estimates precise, this method will give $$\log \log n + \gamma + o(1) \quad \mbox{as $n \to \infty$}.$$
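
(A quick check of that constant, assuming mpmath: split the integral at $u=1$, as in the definition of $[u>1]$, and compare with Euler's $\gamma$.)

```python
# int_0^1 (1 - e^{-u})/u du  -  int_1^inf e^{-u}/u du  should equal Euler's constant.
from mpmath import mp, quad, exp, euler, inf

mp.dps = 30
left = quad(lambda u: (1 - exp(-u))/u, [0, 1])   # here 1 - [u>1] = 1
right = quad(lambda u: -exp(-u)/u, [1, inf])     # here 1 - [u>1] = 0
print(left + right, euler)
```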

"Concrete" methods should be able to improve the approximation $\sum_{k=1}^{n-1} \log \left(1+\frac{x}{k}\right) \approx x \log n$ a good deal. Once we see what that improved version looks like, we can try to figure out what to do with the rest of the argument.