28

I'm searching for a really simple and beautiful proof that the sequence $(u_n)_{n \in \mathbb{N}}$ defined by $u_n = \sum\nolimits_{k=1}^n \frac{1}{k} - \log(n)$ converges.
First, I want to know if my own attempt is OK.

My try:
$\lim\limits_{n\to\infty} \left(\sum\limits_{k=1}^n \frac{1}{k} - \log (n)\right) = \lim\limits_{n\to\infty} \left(\sum\limits_{k=1}^n \frac{1}{k} + \sum\limits_{k=1}^{n-1} [\log(k)-\log(k+1)]\right)$

$ = \lim\limits_{n\to\infty} \left(\frac{1}{n} + \sum\limits_{k=1}^{n-1} \left[\log(\frac{k}{k+1})+\frac{1}{k}\right]\right) = \sum\limits_{k=1}^{\infty} \left[\frac{1}{k}-\log(\frac{k+1}{k})\right]$
Now we prove that the last sum converges by the comparison test:
$\frac{1}{k}-\log(\frac{k+1}{k}) < \frac{1}{k^2} \Leftrightarrow k<k^2\log(\frac{k+1}{k})+1,$
which holds for all $k\geqslant 1$: since $\log(1+x)\geqslant x-\frac{x^2}{2}$ for $x\geqslant 0$, we get $k^2\log(\frac{k+1}{k})\geqslant k-\frac12>k-1$. Each term is also positive, since $\log(1+x)<x$ for $x>0$, so the comparison test applies.


As $ \sum\limits_{k=1}^{\infty} \frac{1}{k^2}$ converges $ \Rightarrow \sum\limits_{k=1}^{\infty} \left[\frac{1}{k}-\log(\frac{k+1}{k})\right]$ converges and we name this limit $\gamma$
q.e.d
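As a quick numerical sanity check (illustrative only, not part of the proof), one can sum the telescoped series and verify the comparison bound term by term; the limit should be the Euler–Mascheroni constant $\gamma\approx0.5772$:

```python
import math

# Partial sum of sum_{k>=1} [1/k - log((k+1)/k)], which the proof
# identifies with the limit of H_n - log(n).
total = 0.0
for k in range(1, 100001):
    term = 1.0 / k - math.log1p(1.0 / k)  # log1p(x) = log(1+x), accurate for small x
    assert 0 < term < 1.0 / k**2          # the comparison bound from above
    total += term

print(round(total, 4))  # ~0.5772, the Euler-Mascheroni constant
```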

robjohn
  • 345,667

6 Answers

38

One elegant way to show that the sequence converges is to show that it's both decreasing and bounded below.

It's decreasing because $u_n-u_{n-1} = \frac1n - \log n + \log(n-1) = \frac1n + \log(1-\frac1n) < 0$ for all $n\ge2$. (The inequality is valid because $\log(1-x)$ is a concave function, hence lies beneath the line $y=-x$ that is tangent to its graph at $0$; plugging in $x=\frac1n$ yields $\log(1-\frac1n) \le -\frac1n$.)

It's bounded below because $$ \sum_{j=1}^n \frac1j > \int_1^{n+1} \frac{dt}t = \log (n+1) > \log n, $$ and so $u_n>0$ for all $n$. (The inequality is valid because the sum is a left-hand endpoint Riemann sum for the integral, and the function $\frac1t$ is decreasing.)
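The two claims are easy to check numerically; the sketch below (the helper name `u` is mine, not from the answer) recomputes $u_n$ directly:

```python
import math

def u(n):
    # u_n = H_n - log(n), the sequence from the question
    return sum(1.0 / k for k in range(1, n + 1)) - math.log(n)

vals = [u(n) for n in range(1, 2001)]
assert all(b < a for a, b in zip(vals, vals[1:]))  # strictly decreasing
assert all(v > 0 for v in vals)                    # bounded below by 0
print(vals[-1])  # still above gamma ~ 0.5772, approaching it from above
```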

hardmath
  • 37,015
Greg Martin
  • 78,820
10

Upper Bound

Note that $$ \begin{align} \frac1n-\log\left(\frac{n+1}n\right) &=\int_0^{1/n}\frac{t\,\mathrm{d}t}{1+t}\\ &\le\int_0^{1/n}t\,\mathrm{d}t\\[3pt] &=\frac1{2n^2} \end{align} $$ Therefore, $$ \begin{align} \gamma &=\sum_{n=1}^\infty\left(\frac1n-\log\left(\frac{n+1}n\right)\right)\\ &\le\sum_{n=1}^\infty\frac1{2n^2}\\ &\le\sum_{n=1}^\infty\frac1{2n^2-\frac12}\\ &=\sum_{n=1}^\infty\frac12\left(\frac1{n-\frac12}-\frac1{n+\frac12}\right)\\[9pt] &=1 \end{align} $$


Lower Bound

Note that $$ \begin{align} \frac1n-\log\left(\frac{n+1}n\right) &=\int_0^{1/n}\frac{t\,\mathrm{d}t}{1+t}\\ &\ge\int_0^{1/n}\frac{t}{1+\frac1n}\,\mathrm{d}t\\[3pt] &=\frac1{2n(n+1)} \end{align} $$ Therefore, $$ \begin{align} \gamma &=\sum_{n=1}^\infty\left(\frac1n-\log\left(\frac{n+1}n\right)\right)\\ &\ge\sum_{n=1}^\infty\frac1{2n(n+1)}\\[3pt] &=\sum_{n=1}^\infty\frac12\left(\frac1n-\frac1{n+1}\right)\\[6pt] &=\frac12 \end{align} $$


A Better Upper Bound

Using Jensen's Inequality on the concave $\frac{t}{1+t}$, we get $$ \begin{align} \frac1n-\log\left(\frac{n+1}n\right) &=\frac1n\left(n\int_0^{1/n}\frac{t\,\mathrm{d}t}{1+t}\right)\\ &\le\frac1n\frac{n\int_0^{1/n}t\,\mathrm{d}t}{1+n\int_0^{1/n}t\,\mathrm{d}t}\\ &=\frac1{n(2n+1)} \end{align} $$ Therefore, since the sum of the Alternating Harmonic Series is $\log(2)$, $$ \begin{align} \gamma &=\sum_{n=1}^\infty\left(\frac1n-\log\left(\frac{n+1}n\right)\right)\\ &\le\sum_{n=1}^\infty\frac1{n(2n+1)}\\ &=\sum_{n=1}^\infty2\left(\frac1{2n}-\frac1{2n+1}\right)\\[6pt] &=2(1-\log(2)) \end{align} $$
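A hedged numerical illustration of the three bounds (not part of the derivation): the partial sums stay between the lower bound $\frac12$ and the improved upper bound $2(1-\log 2)\approx0.6137$:

```python
import math

# Partial sum of sum_{n>=1} [1/n - log((n+1)/n)], converging to gamma
gamma_approx = sum(1.0 / n - math.log1p(1.0 / n) for n in range(1, 100001))

lower = 0.5                    # from the 1/(2n(n+1)) telescoping bound
upper = 2 * (1 - math.log(2))  # Jensen bound, ~0.6137; the cruder bound is 1
assert lower <= gamma_approx <= upper <= 1.0
print(round(gamma_approx, 4))  # ~0.5772
```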

robjohn
  • 345,667
  • can you copy the last part of your answer here ? – Gabriel Romon Nov 10 '17 at 09:52
  • @robjohn may I know how do you get the inequality $$n\ \int_{0}^{\frac {1} {n}} \dfrac {t\ \mathrm {d} t} {1 + t} \leq \dfrac {n\ \int_{0}^{\frac {1} {n}} t\ \mathrm {d} t} {1 + n\ \int_{0}^{\frac {1} {n}} t\ \mathrm {d}t}\ ?$$ Is it the consequence of Jensen's inequality? Thanks. – Anacardium Nov 01 '20 at 13:44
  • It is Jensen with the concave function $\varphi(t)=\frac{t}{1+t}$ and then $n\int_0^{\frac1n}\varphi(t)\,\mathrm{d}t\le\varphi\left(n\int_0^{\frac1n}t\,\mathrm{d}t\right)$. – robjohn Nov 01 '20 at 13:51
  • Since $\varphi$ is concave on $[0,\infty)$ it follows that for any $x,y \geq 0$ we have $\varphi (tx + (1-t)y) \geq t\ \varphi (x) + (1-t)\ \varphi (y),$ for all $t \in [0,1].$ Am I right? – Anacardium Nov 01 '20 at 14:02
  • Yes, that is one of the characterizations of a concave function. It is also true that $\varphi''(x)\le0$ on $[0,\infty)$. – robjohn Nov 01 '20 at 14:28
  • Very elegant proof of the bounds of Euler-Mascheroni's constant. +1 – Anacardium Nov 01 '20 at 14:50
  • @robjohn A bounded sequence is not always convergent – Ambica Govind Dec 22 '21 at 13:28
  • @AmbicaGovind: I don't quite understand your objection. The sum is a sum of positive terms. As long as the partial sums are bounded above (they are bounded above by $1$), the sum converges. – robjohn Dec 22 '21 at 17:26
  • @robjohn Should it not be monotonically increasing then, in addition to being bounded above? This is the theorem I've always used. Well yes, intuitively being bounded above should be enough, but the theorem? – Ambica Govind Jan 23 '22 at 02:19
  • @AmbicaGovind: The sequence of partial sums is increasing since the sum consists of positive terms. – robjohn Jan 23 '22 at 04:40
4

All we use here is the power series expansion for $\log(1+z)$ with $|z|<1$:

$$ \log(1+z)=z-{z^2\over2}+{z^3\over3}-{z^4\over4}+\cdots\tag1 $$

It suffices to note that

$$ \log N=\log{N+1\over1}+\mathcal O\left(\frac1N\right)=\sum_{n\le N}\log{n+1\over n}+\mathcal O\left(\frac1N\right) $$

Thus, we have

$$ \sum_{n\le N}\frac1n-\log N=\sum_{n\le N}\left\{\frac1n-\log\left(1+\frac1n\right)\right\}+\mathcal O\left(\frac1N\right)\tag2 $$

As $n\to+\infty$, we know from (1) that

$$ \log\left(1+\frac1n\right)=\frac1n+\mathcal O\left(1\over n^2\right) $$

This is sufficient for us to show that the partial sum on the right hand side of (2) will converge when $N\to+\infty$. Thus, we know the left hand side of (2) converges too.
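As an illustrative check of the $\mathcal O(1/n^2)$ estimate (the constant $\frac12$ comes from the $z^2/2$ term of $(1)$; the code is a sketch, not part of the argument):

```python
import math

# From (1): 0 < 1/n - log(1 + 1/n) <= 1/(2n^2), so the series of
# error terms is dominated by a convergent p-series.
for n in range(1, 10001):
    err = 1.0 / n - math.log1p(1.0 / n)
    assert 0 < err <= 1.0 / (2 * n**2)
print("0 < 1/n - log(1+1/n) <= 1/(2n^2) holds for n <= 10000")
```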

TravorLZH
  • 6,718
2

Here is a proof which is slightly more elementary than the ones above.

Consider the following well-known inequality $$\frac1{n+1}<\log (n+1)-\log n<\frac1n\tag{$*$}$$ (it can be proved easily using the fact that $f_n=(1+1/n)^{n+1}$ and $e_n=(1+1/n)^n$ are decreasing and increasing, respectively, and both converge to $e$).

First, by $(*)$ it follows that $$\sum_{k=1}^n \frac1k>\sum_{k=1}^n (\log(k+1)-\log k)=\log(n+1)>\log n,$$ so $u_n=\sum_{k=1}^n \frac1k - \log n$ is bounded below by $0$. But on the other hand by $(*)$ it also follows that $$u_{n+1}-u_n =\frac1{n+1}-\log (n+1)+\log n<0,$$ so the sequence is strictly decreasing. Hence by the monotone convergence theorem $(u_n)_n$ is convergent, so we are done.
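The inequality $(*)$ itself is easy to test numerically (a quick sketch, not a proof):

```python
import math

# (*): 1/(n+1) < log(n+1) - log(n) < 1/n for every n >= 1
for n in range(1, 10001):
    gap = math.log(n + 1) - math.log(n)
    assert 1.0 / (n + 1) < gap < 1.0 / n
print("(*) verified for n <= 10000")
```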

Tashi
  • 491
1

Your proof is nice. Here is a simple proof using the graph:

By using Riemann sums of $y=\frac1x$: $$\sum_{k=2}^n\frac1k<\ln(n)<\sum_{k=1}^{n-1}\frac1k$$ $$-\sum_{k=1}^{n-1}\frac1k<-\ln(n)<-\sum_{k=2}^{n}\frac1k$$ $$\frac1n<u_n<1$$ (for $n\geq 2$). So, $(u_n)$ is bounded. On the other hand, $$u_{n}-u_{n+1}=\ln(n+1)-\ln(n)-\frac1{n+1}=\int_n^{n+1}\frac1x dx-\frac1{n+1}>\frac1{n+1}-\frac1{n+1}=0.$$ So, $(u_n)$ is a bounded decreasing sequence. We are done.
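The sandwich $\frac1n<u_n<1$ can be spot-checked numerically (illustrative sketch; note the strict bounds require $n\ge2$, since $u_1=1$):

```python
import math

def u(n):
    # u_n = H_n - log(n)
    return sum(1.0 / k for k in range(1, n + 1)) - math.log(n)

# strict sandwich from the Riemann-sum picture (n >= 2)
for n in range(2, 1001):
    assert 1.0 / n < u(n) < 1.0
print("1/n < u_n < 1 verified for 2 <= n <= 1000")
```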

Bob Dobbs
  • 10,988
-2

You can also prove it by drawing the curve $y=1/x$ and then drawing rectangles of height $1/k$ over the intervals with endpoints $x=1,2,\dots,n$ and shading them (they sit a little above the curve). The total area of the shaded rectangles is the sum $1+1/2+1/3+\dots+1/n$, and it is greater than the area underneath the curve, which is $\log(n+1)>\log n$. That way you can prove that the sequence is bounded below. Further, to decide whether it is decreasing or increasing, you can compute $s_{n+1}-s_n$ with the help of the same picture; it is less than $0$, so you can conclude that it is decreasing. After that you can easily prove that it converges, and its limit $\lim_{n\to\infty}\left(1+1/2+1/3+\dots+1/n-\log n\right)$ is Euler's constant.

(not an exact answer...just to clear some reasoning)