17

$$ \int_0^{\infty } \frac{\log (x)}{e^x+1} \, dx = -\frac{1}{2} \log ^2(2) $$

Does anyone have an idea how to prove this?
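
Not part of the original question, but a quick numerical sanity check is easy (a minimal sketch, assuming Python with the mpmath library; the quadrature copes with the logarithmic singularity at $0$):

```python
# Numerical sanity check of the claimed identity (not a proof).
from mpmath import mp, quad, log, exp, inf

mp.dps = 30  # working precision in decimal digits

lhs = quad(lambda x: log(x) / (exp(x) + 1), [0, inf])  # the integral
rhs = -log(2)**2 / 2                                   # the claimed value

print(lhs)  # -0.240226506959...
print(rhs)  # -0.240226506959...
```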

jimjim
  • 9,675
wnvl
  • 3,010
  • 1
    Can you tell us what you've tried? – Michael Anderson Jan 19 '12 at 02:44
  • To me it looks like a "typical" complex plane integral.

    Work out where the poles are, choose the right contour integral, show parts of the contour integral vanish, count the residues at the surrounded poles... and you've got your answer :) Fun part is picking the right contour.

    – Michael Anderson Jan 19 '12 at 02:46
  • The tricky part is finding the right integrand. I've been playing with this for an hour or more and can't make it quite work. I've been trying $$\frac{(\ln z)^2}{e^z + 1}$$ and making a Riemann cut on the positive real axis. This looked promising. But the trouble with this approach is the sum of the infinite number of residues up and down the imaginary axis doesn't converge. – Simon S Jan 19 '12 at 17:59
  • 1
    I considered $$\frac{\log (z)}{e^z+1}$$ Poles are located at $$z_{n} = j(2n+1)\pi$$ residues are equal to $$-\log ((2n+1) i \pi )$$ – wnvl Jan 19 '12 at 18:07
  • 2
    The other approach I've been playing around with is trying to tie this result back to an integral like $$ \int_0^\infty \frac{x^s}{e^x -1} dx $$ which is standard.

    The integral in question is equal to $dJ/ds(0)$ where $$J(s) = \int_0^\infty \frac{x^s}{e^x+1} dx $$ as $$ \frac{d \ }{ds} \int_0^\infty \frac{x^s}{e^x+1} dx = \int_0^\infty \frac{\partial \ }{\partial s} \frac{x^s}{e^x+1} dx = \int_0^\infty \frac{ (\ln x)x^s}{e^x+1} dx $$

    It wasn't too hard to write down an expression for $J(s)$, but upon differentiating it and evaluating it at $s = 0$, I ran into trouble again with convergence. (See the numerical sketch just after these comments.)

    – Simon S Jan 19 '12 at 18:22
  • @wnvl: two questions about that approach. First, how do you recover the integral we're interested in? Second, if it involves a contour that goes to infinity, the sum of the residues doesn't converge, does it? – Simon S Jan 19 '12 at 18:29
  • It was just an idea, but it does not work at all. I don't have a good contour and, as you said already, the sum of the residues diverges. – wnvl Jan 19 '12 at 19:33
  • FYI I picked up the problem on another forum. It is part of the solution of another problem. http://www.mymathforum.com/viewtopic.php?f=22&t=26277 – wnvl Jan 19 '12 at 19:36
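
Not part of the original thread: a minimal numerical sketch of the $J(s)$ idea from Simon S's comments above, assuming Python with mpmath (the helper name `J` and the step size `h` are mine). It only checks that $J'(0)$ agrees with the conjectured value; it proves nothing.

```python
# Estimate J'(0) for J(s) = \int_0^\infty x^s / (e^x + 1) dx by a symmetric
# difference quotient and compare with -log(2)^2 / 2.
from mpmath import mp, quad, exp, log, inf, mpf

mp.dps = 25

def J(s):
    # J(s), evaluated by numerical quadrature
    return quad(lambda x: x**s / (exp(x) + 1), [0, inf])

h = mpf('0.001')
print((J(h) - J(-h)) / (2*h))  # ~ J'(0); the difference-quotient error is O(h^2)
print(-log(2)**2 / 2)          # the value claimed in the question
```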

2 Answers

24

By the recursive relation $\Gamma(x+1)=x\Gamma(x)$, we get
$$ \small{\log(\Gamma(x))=\log(\Gamma(n+x))-\log(x)-\log(x+1)-\log(x+2)-\dots-\log(x+n-1)}\tag{1} $$

Differentiating $(1)$ with respect to $x$, evaluating at $x=1$, and letting $n\to\infty$ yields
$$ \begin{align} \frac{\Gamma'(1)}{\Gamma(1)}&=\log(n)+O\left(\frac1n\right)-\frac11-\frac12-\frac13-\dots-\frac1n\\ &\to-\gamma\tag{2} \end{align} $$

Next, apply $(2)$ to the following:
$$ \begin{align} \int_0^\infty\log(t)\;e^{-t}\;\mathrm{d}t &=\left.\frac{\mathrm{d}}{\mathrm{d}x}\int_0^\infty t^x\;e^{-t}\;\mathrm{d}t\right]_{x=0}\\ &=\Gamma'(1)\\ &=-\gamma\tag{3} \end{align} $$

Then, a simple change of variables yields
$$ \int_0^\infty\log(t)\;e^{-nt}\;\mathrm{d}t=-\frac{\gamma+\log(n)}{n}\tag{4} $$

Since $\dfrac{1}{e^t+1}=e^{-t}-e^{-2t}+e^{-3t}-e^{-4t}+\dots$, by applying $(4)$ to this result, we have that
$$ \begin{align} \int_0^\infty\frac{\log(t)}{e^t+1}\mathrm{d}t &=\int_0^\infty\sum_{n=1}^\infty(-1)^{n-1}\log(t)\;e^{-nt}\;\mathrm{d}t\\ &=\sum_{n=1}^\infty(-1)^n\frac{\gamma+\log(n)}{n}\\ &=-\frac12\log(2)^2\tag{5} \end{align} $$
For the last equality, $\sum_{n\ge1}\frac{(-1)^n}{n}=-\log(2)$ and $\sum_{n\ge1}\frac{(-1)^n\log(n)}{n}=\eta'(1)=\gamma\log(2)-\frac12\log^2(2)$, so the $\gamma$ terms cancel.
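
A minimal numerical spot-check of $(4)$ and $(5)$, assuming Python with mpmath (not part of the original answer; the pairing of terms below is borrowed from a later comment about switching the sum and the integral):

```python
from mpmath import mp, quad, nsum, log, exp, euler, inf

mp.dps = 25

# Check (4) for a sample n; the general case follows by the change of variables t -> t/n.
n = 3
print(quad(lambda t: log(t) * exp(-n*t), [0, inf]))  # numerical value of the integral
print(-(euler + log(n)) / n)                         # closed form -(gamma + log n)/n

# Check (5): sum the alternating series with the n = 2k-1 and n = 2k terms paired,
# so that the resulting series converges absolutely.
def pair(k):
    return -(euler + log(2*k - 1)) / (2*k - 1) + (euler + log(2*k)) / (2*k)

print(nsum(pair, [1, inf]))  # ~ -0.240226...
print(-log(2)**2 / 2)        # -log(2)^2 / 2
```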


More about $\mathbf{(2)}$:

The fact that $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))=\log(x)+O\left(\frac1x\right)$ relies on the log-convexity of $\Gamma(x)$; that is, $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$ is monotonically increasing. By the recursive relation for $\Gamma(x)$, we have that $$ \log(\Gamma(x))-\log(\Gamma(x-1))=\log(x-1)\tag{6} $$ and that $$ \log(\Gamma(x+1))-\log(\Gamma(x))=\log(x)\tag{7} $$ The Mean Value Theorem and $(6)$ imply that $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(\xi_1))=\log(x{-}1)$ for some $\xi_1{\in}(x{-}1,x)$.

The Mean Value Theorem and $(7)$ imply that $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(\xi_2))=\log(x)$ for some $\xi_2{\in}(x,x{+}1)$.

By the monotonicity of $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$, we get that $$ \log(x-1)\le\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))\le\log(x)\tag{8} $$ Since $\log(x)-\log(x-1)=O\left(\frac1x\right)$, $(8)$ implies that $$ \frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))=\log(x)+O\left(\frac1x\right)\tag{9} $$
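
A quick spot-check of the bounds $(8)$ and the error term in $(9)$, assuming Python with mpmath (its digamma is $\frac{\mathrm{d}}{\mathrm{d}x}\log(\Gamma(x))$); not part of the original answer:

```python
from mpmath import mp, digamma, log

mp.dps = 20
for x in [2, 5, 10, 100, 1000]:
    ok = log(x - 1) <= digamma(x) <= log(x)  # the bounds in (8)
    print(x, ok, digamma(x) - log(x))        # last column is the O(1/x) error in (9)
```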


Log-Convexity of $\mathbf{\Gamma(x)}$:

If $\frac{\mathrm{d}^2}{\mathrm{d}x^2}f(x)\ge0$, then $f$ is convex at $x$. Thus, if $\dfrac{f(x)f''(x)-f'(x)^2}{f(x)^2}=\frac{\mathrm{d}^2}{\mathrm{d}x^2}\log(f(x))\ge0$, then $f$ is log-convex. So we need to show that $\Gamma(x)\Gamma''(x)\ge\Gamma'(x)^2$. That is, $$ \int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t \int_0^\infty\log(t)^2\;t^{x-1}\;e^{-t}\;\mathrm{d}t \ge \left(\int_0^\infty\log(t)\;t^{x-1}\;e^{-t}\;\mathrm{d}t\right)^2\tag{10} $$ Dividing both sides of $(10)$ by $\left(\int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t\right)^2$, $(10)$ becomes $$ \int\log(t)^2\;\mathrm{d}\mu \ge \left(\int\log(t)\;\mathrm{d}\mu\right)^2\tag{11} $$ where $\mathrm{d}\mu=\dfrac{t^{x-1}\;e^{-t}\;\mathrm{d}t}{\int_0^\infty t^{x-1}\;e^{-t}\;\mathrm{d}t}$ is a unit measure on $[0,\infty)$. Thus, $(11)$ is simply Jensen's inequality.
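
A numerical spot-check of $(10)$ at one (arbitrary) sample point, assuming Python with mpmath; of course this is not a proof:

```python
from mpmath import mp, quad, diff, log, exp, gamma, inf, mpf

mp.dps = 20
x = mpf('1.7')  # arbitrary sample point

G  = quad(lambda t: t**(x-1) * exp(-t), [0, inf])              # Gamma(x)
G1 = quad(lambda t: log(t) * t**(x-1) * exp(-t), [0, inf])     # Gamma'(x)
G2 = quad(lambda t: log(t)**2 * t**(x-1) * exp(-t), [0, inf])  # Gamma''(x)

print(G * G2 >= G1**2)                      # True: inequality (10) at this x
print(diff(lambda y: log(gamma(y)), x, 2))  # (d/dx)^2 log Gamma(x), which is positive
```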

Strictly speaking:

Note that $$ \log(t)^2 + a^2 \ge 2a\log(t)\tag{12} $$ with equality if and only if $\log(t)=a$. Integrating $(12)$ w.r.t. the unit measure $\mathrm{d}\mu$ yields $$ \int\log(t)^2\;\mathrm{d}\mu + a^2 \ge 2a\int\log(t)\;\mathrm{d}\mu\tag{13} $$ with equality in $(13)$ if and only if $\log(t)=a$ a.e. $\mathrm{d}\mu$. Let $a=\int\log(t)\;\mathrm{d}\mu$; then $(13)$ becomes $$ \int\log(t)^2\;\mathrm{d}\mu \ge \left(\int\log(t)\;\mathrm{d}\mu\right)^2\tag{14} $$ with equality if and only if $\log(t)$ is constant a.e. $\mathrm{d}\mu$. Since the $\mathrm{d}\mu$ in $(11)$ is absolutely continuous and $\log(t)$ is strictly increasing on $(0,\infty)$, the inequality in $(11)$ is strict. Therefore, $\Gamma$ is strictly log-convex.

robjohn
  • 345,667
  • $$\frac{1}{1+e^x}=\frac{e^{-x}}{1+e^{-x}}=e^{-x}-e^{-2x}+e^{-3x}-e^{-4x}+\dots$$ which converges for $x\in(0,\infty)$. – robjohn Jan 21 '12 at 09:42
  • 2
    @Peter: It's not your fault - until you have 50 reputation points, you can only comment on your own questions and answers. – Zev Chonoles Jan 23 '12 at 01:08
  • @Zev: I think that Peter had intended his comment to be for Kirill's answer. However, it is interesting to have a comment to my answer posted before I answered :-) – robjohn Jan 25 '12 at 00:01
  • Hi robjohn, I know it's an old post, but looking at most of your answers I'm really astonished by your huge range of techniques and knowledge! Could you please give a little guidance (for me, as an undergraduate student) as well? I'm also into integration, series, etc., exactly the topics on which most of your answers are brilliantly written. – VIVID Jul 04 '21 at 07:20
  • How do we know that $$\int_{0}^{\infty} \sum_{n=1}^{\infty}(-1)^{n-1} \log(t)\, e^{-nt} \,\mathrm{d}t = \sum_{n=1}^{\infty} (-1)^{n-1} \int_{0}^{\infty} \log(t)\, e^{-nt} \,\mathrm{d}t\;?$$ – Random Variable Feb 14 '24 at 02:49
  • 1
    If we combine the terms for $n=2k-1$ and $n=2k$, we get an absolutely convergent integral/sum, and Fubini allows us to switch the order. Since the terms vanish for $n\to\infty$, we don't need to worry about separating the $n=2k-1$ and $n=2k$ terms. – robjohn Feb 14 '24 at 22:43
  • That takes care of the issue at $t=0$. Thanks. – Random Variable Feb 15 '24 at 00:28
15

Start with $J(s)$ given by
$$ J(s) = \int_0^\infty \frac{x^s}{1+e^x}dx. $$

Write $\frac{1}{1+e^x}=\frac{e^{-x}}{1+e^{-x}}$ and expand the denominator using the geometric series, like so:
$$ J(s) = \sum_{k\geq0}\int_0^\infty (-1)^k x^s e^{-(1+k)x}dx = \sum_{k\geq1} \frac{(-1)^{k+1}}{k^{s+1}} \int_0^\infty x^s e^{-x}dx, $$
where the second equality comes from substituting $x\mapsto x/(1+k)$ and reindexing $k+1\to k$.

Now, the sum is the Dirichlet eta function, related to the Riemann zeta function like so,
$$ \sum_{k\geq1}\frac{(-1)^{k+1}}{k^{s+1}} = (1-2^{-s})\zeta(s+1), $$
and the integral is $\Gamma(1+s)$. Thus
$$ J(s) = (1-2^{-s})\zeta(1+s) \Gamma(1+s). $$
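
A minimal numerical check of this closed form at a sample value of $s$ (not part of the original answer; assumes Python with mpmath, and the choice $s=1/2$ is arbitrary):

```python
from mpmath import mp, quad, zeta, gamma, exp, inf, mpf

mp.dps = 25
s = mpf('0.5')  # arbitrary sample exponent

lhs = quad(lambda x: x**s / (1 + exp(x)), [0, inf])  # J(s) by quadrature
rhs = (1 - 2**(-s)) * zeta(1 + s) * gamma(1 + s)     # the closed form
print(lhs, rhs)  # should agree to roughly the working precision
```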

To find the derivative at $s=0$ we need the series expansions of these factors at $s=0$ ($\zeta(1+s)$ is singular at $s=0$, but $1-2^{-s}$ has a zero there, so $J$ is regular). They are
$$ (1-2^{-s})\zeta(1+s) = \log2 + \left(\gamma \log 2 - \frac{(\log 2)^2}{2}\right)s + O(s^2), $$
$$ \Gamma(1+s) = 1 - \gamma s + O(s^2), $$
where $\gamma$ is Euler's constant. Multiplying the two series and taking the coefficient of $s$, we get
$$ \frac{d J}{ds}(0) = -\frac12 (\log 2)^2, $$
which is the integral you were looking for.
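
And a numerical check of the two expansions and of $J'(0)$, again assuming Python with mpmath; the names `f`, `g`, `J` and the step `h` are mine:

```python
from mpmath import mp, zeta, gamma, log, euler, mpf

mp.dps = 30
f = lambda s: (1 - 2**(-s)) * zeta(1 + s)
f_approx = lambda s: log(2) + (euler*log(2) - log(2)**2/2) * s
g = lambda s: gamma(1 + s)
g_approx = lambda s: 1 - euler*s

for s in [mpf('0.01'), mpf('0.001')]:
    # both errors should shrink roughly like s^2, consistent with the O(s^2) terms
    print(f(s) - f_approx(s), g(s) - g_approx(s))

J = lambda s: f(s) * g(s)
h = mpf('0.001')
print((J(h) - J(-h)) / (2*h))  # symmetric difference ~ J'(0), error O(h^2)
print(-log(2)**2 / 2)          # the claimed value, about -0.240226...
```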

Kirill
  • 14,494
  • Very nice. I didn't know how to handle $$\sum \frac{(-1)^n}{n^s}.$$ Could you expand a little more on the derivation of the expansion of $$(1-2^{-s})\zeta(1+s)?$$ – Simon S Jan 19 '12 at 22:07
  • 1
    Thanks. I had to look up the $\eta$, Gamma and Riemann zeta functions and their expansions on MathWorld. The series expansion $$\zeta(1+s)=1/s+\gamma+O(s)$$ is standard, and $$1-2^{-s} = (\log 2)s - \frac12(\log2)^2s^2 + O(s^3)$$ is just Taylor series. – Kirill Jan 19 '12 at 22:28
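
For completeness, a quick numerical check of the two expansions quoted in this comment, assuming Python with mpmath (not part of the original comment):

```python
from mpmath import mp, zeta, log, euler, mpf

mp.dps = 30
for s in [mpf('0.01'), mpf('0.0001')]:
    print(zeta(1 + s) - (1/s + euler))                      # O(s), per the zeta expansion
    print((1 - 2**(-s)) - (log(2)*s - log(2)**2/2 * s**2))  # O(s^3), per the Taylor series
```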