
In an MSE question, in a comment to an answer, Michael Hardy brought up the following well-known limit expression for the Euler constant $\gamma$: $$ \lim_{n \to \infty} \left(\sum_{k=1}^n \frac 1k\right) - \left(\int_{t=1}^n \frac 1t \,dt\right) = \gamma \tag 1$$

I've tried some variations, and heuristically I found, for small integers $m \gt 1$, $$ \lim_{n \to \infty} \left(\sum_{k=1}^n \frac 1{k^m}\right) - \left(\int_{t=1}^n \frac 1{t^m} \,dt\right) = \zeta(m) - \frac 1{m-1} \tag 2$$

Generalizing further to real $m$, it seems (by Pari/GP) that eq. (1) can be seen as the limit of (2) for $m \to 1$, and the Euler $\gamma$ can be seen as the value of the Stieltjes power series representation of $\zeta(1+x)$ with the $\frac 1{(1+x)-1}$-term removed, evaluated at $x=0$.
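For what it's worth, a quick Pari/GP sanity check of (1) and (2) can be done along the following lines (the cutoff $n=10^6$, the choice $m=2$ and the variable names are ad-hoc; the integrals are inserted in closed form):

    \\ numerical sanity check of (1) and (2); n is an arbitrary cutoff
    n = 10^6;
    d1 = sum(k=1, n, 1.0/k) - log(n);                 \\ harmonic sum minus int_1^n dt/t
    print(d1 - Euler);                                \\ ~ 1/(2n), consistent with (1)
    m = 2;
    d2 = sum(k=1, n, 1.0/k^m) - (1 - n^(1-m))/(m-1);  \\ minus int_1^n dt/t^m in closed form
    print(d2 - (zeta(m) - 1/(m-1)));                  \\ ~ 0, consistent with (2)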

Q1: Is there any intuitive explanation for this (or, for instance, a graphical demonstration)?

Another generalization heuristically suggested some more amusing hypotheses: $$ \small \begin{align} \lim_{n \to \infty} \left(\sum_{k=2}^n \frac 1{k(k-1)}\right) - \left(\int_{t=2}^n \frac {dt}{t(t-1)}\right) &= \frac 1{1!} \left(\frac 11 - 1\cdot \log(2)\right) \\ \lim_{n \to \infty} \left(\sum_{k=3}^n \frac 1{k(k-1)(k-2)}\right) - \left(\int_{t=3}^n \frac {dt}{t(t-1)(t-2)}\right) &= \frac 1{2!} \left(\frac 12 - 2\cdot \log(2) + 1\cdot \log(3)\right) \\ \lim_{n \to \infty} \left(\sum_{k=4}^n \frac 1{k\cdots(k-3)}\right) - \left(\int_{t=4}^n \frac {dt}{t\cdots(t-3)}\right) &= \frac 1{3!} \left(\frac 13 - 3\cdot \log(2) + 3\cdot \log(3) - 1\cdot \log(4)\right) \tag 3 \end{align}$$ where the coefficients on the right-hand sides are binomial coefficients, and I think the scheme is obvious enough to be continued ad libitum.
Again it might be possible to express this with further limits: we could possibly write, for instance, the right-hand side of the third row as $$ \lim_{h\to 0} \frac 1{3!} \cdot(- \small \binom{3}{-1+h} \cdot \log(0+h) +1 \cdot \log(1) - 3\cdot \log(2) + 3\cdot \log(3)- 1\cdot \log(4) ) \tag 4$$
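A Pari/GP check of, say, the second row of (3) can be done in the same spirit (cutoff and variable names again chosen ad hoc):

    \\ numerical check of the second row of (3)
    n = 10^4;
    s = sum(k=3, n, 1.0/(k*(k-1)*(k-2)));
    v = intnum(t=3, n, 1/(t*(t-1)*(t-2)));
    print(s - v - (1/2 - 2*log(2) + log(3))/2);   \\ ~ 0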

Q2: Is (3) true, and how can it be proved (if it is not too complicated...)? And is (4) somehow meaningful?

  • I don't understand your first question. $\sum_{k=1}^{\infty}\frac{1}{k^m}$ is the definition of $\zeta(m)$, hence $\zeta(m)=\sum_{k=1}^{\infty}\frac{1}{k^m}$ holds automatically not just for small integers, but for all integers (greater than $1$, that is). So Q1 seems to be nothing more than asking why an integral has the value it does... – David H Aug 09 '14 at 11:15
  • @DavidH: hmm, true ;-) now that I've taken in your remark. Well, for m=1 the separate limits are divergent, but your argument shows I should have thought it over a bit longer. Anyway, I should look again at a graphical representation of the integral/series difference, which I've seen elsewhere, to refresh my intuition for this – Gottfried Helms Aug 09 '14 at 11:20
  • You can get (3) by writing $e_k$ in terms of $p_1,\cdots,p_k$. This can be done recursively via the Newton-Girard identities, or more explicitly with Schur polynomials. – anon Aug 09 '14 at 11:25
  • @DavidH: I read Q1 as asking for an intuitive explanation of $\displaystyle \lim_{x\to1+} \zeta(x) - \frac1{x-1}=\gamma$ – Henry Aug 09 '14 at 11:28
  • @Henry my comment refers to a previous edit. – David H Aug 09 '14 at 11:42
  • typo. you have $m \to 0$ beneath equation (2) – David Holden Aug 09 '14 at 14:40
  • @David: corrected, thanks! – Gottfried Helms Aug 09 '14 at 15:10
  • Note that you are looking at a particular case of the Euler-Maclaurin formula: http://en.wikipedia.org/wiki/Euler%E2%80%93Maclaurin_formula – Alexandre C. Aug 24 '14 at 14:22

1 Answer


For Q1, the proof just relies on summation by parts.
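A minimal Pari/GP illustration of the limit behind Q1, read as $\lim_{x\to 1^+}\left(\zeta(x)-\frac 1{x-1}\right)=\gamma$ (the offset $10^{-6}$ is an arbitrary choice):

    \\ zeta(1+x) - 1/x approaches Euler's gamma as x -> 0+
    x = 1e-6;
    print(zeta(1+x) - 1/x - Euler);   \\ of order x (first Stieltjes term)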

For Q2, you can evaluate $$S_k = \sum_{n=1}^{+\infty}\frac{1}{n(n+1)\ldots(n+k)} = \frac{1}{k!}\sum_{n=1}^{+\infty}\frac{1}{n\binom{n+k}{k}}$$ by exploiting a partial fraction decomposition and the residue theorem, or just the wonderful telescoping trick $\frac{1}{n(n+k)}=\frac{1}{k}\left(\frac{1}{n}-\frac{1}{n+k}\right)$, giving:

$$\begin{eqnarray*}S_k &=& \frac{1}{k}\left(\sum_{n=1}^{+\infty}\frac{1}{n(n+1)\ldots(n+k-1)}-\sum_{n=1}^{+\infty}\frac{1}{(n+1)(n+2)\ldots(n+k)}\right)\\ &=&\frac{1}{k}\cdot\frac{1}{1\cdot 2\cdot\ldots\cdot k}=\frac{1}{k\cdot k!}.\end{eqnarray*}$$ The same telescoping technique applies to the integral: $$I_k = \int_{1}^{+\infty}\frac{dt}{t(t+1)\ldots(t+k)}=\frac{1}{k}\int_{0}^{1}\frac{dt}{(t+1)\ldots(t+k)}$$ and now the RHS can be evaluated through a partial fraction decomposition, since: $$\frac{1}{(t+1)\ldots(t+m)}=\frac{1}{(m-1)!}\sum_{j=0}^{m-1}\frac{(-1)^j\binom{m-1}{j}}{t+j+1}.$$ We have $\int_{0}^{1}\frac{dt}{t+h}=\log(h+1)-\log(h)=\log\left(1+\frac{1}{h}\right)$, hence: $$\begin{eqnarray*}I_k &=& \frac{1}{k(k-1)!}\sum_{j=0}^{k-1}(-1)^j\binom{k-1}{j}\left(\log(j+2)-\log(j+1)\right)\end{eqnarray*}$$ which gives your $(3)$ after rearranging terms.
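A short Pari/GP cross-check of $S_k=\frac{1}{k\cdot k!}$ and of the closed form for $I_k$, here for $k=3$ (cutoff and variable names chosen ad hoc):

    \\ verify S_k = 1/(k*k!) and the closed form for I_k at k = 3
    k = 3; n = 10^6;
    Sk = sum(m=1, n, 1.0/prod(j=0, k, m+j));
    Ik = intnum(t=1, n, 1/prod(j=0, k, t+j));
    closedIk = sum(j=0, k-1, (-1)^j*binomial(k-1,j)*(log(j+2)-log(j+1)))/(k*(k-1)!);
    print([Sk - 1/(k*k!), Ik - closedIk]);   \\ both entries ~ 0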

Jack D'Aurizio