I can show that the following limit exists, but I am having difficulty finding its value. It is $$\lim_{n\to \infty} \sum_{k=1}^n \frac{k^n}{n^n}$$ Can someone please help me?
-
How could you show that the limit exists? – Mikasa Jun 28 '12 at 08:25
-
@Siminore: The sum is very similar to $\int_{0}^{1}x^x\,dx$. Isn't it? – Mikasa Jun 28 '12 at 08:40
-
Numerically, for $n=1000$, I get 1.58098. – Siminore Jun 28 '12 at 08:44
-
@Siminore: That happens to be only a little shy of the correct answer, $\dfrac{1}{1 - 1/e}$. – davidlowryduda Jun 28 '12 at 08:50
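(A quick numerical check of the values quoted in these comments; this is a minimal Python sketch added for reference, not part of the original discussion.)

```python
import math

def partial_sum(n):
    # sum_{k=1}^{n} (k/n)^n in floating point
    return sum((k / n) ** n for k in range(1, n + 1))

for n in (10, 100, 1000):
    print(n, partial_sum(n))          # n = 1000 gives about 1.58098

print("1/(1 - 1/e) =", 1 / (1 - 1 / math.e))   # about 1.5819767
```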
-
Probably. But the result is not $\int_0^1 x^x\, dx$. – Siminore Jun 28 '12 at 09:01
-
For future reference, let me also add duplicate/similar answers that are not previously linked to this: Nov 6 '13, Sep 11 '14. – Sangchul Lee Jul 05 '20 at 23:59
6 Answers
An asymptotic expansion can be obtained as below. More terms can be included by using more terms in the expansions of $\exp$ and $\log$. $$ \begin{align} \sum_{k=0}^n\frac{k^n}{n^n} &=\sum_{k=0}^n\left(1-\frac{k}{n}\right)^n\\ &=\sum_{k=0}^n\exp\left(n\log\left(1-\frac{k}{n}\right)\right)\\ &=\sum_{k=0}^{\sqrt{n}}\exp\left(n\log\left(1-\frac{k}{n}\right)\right)+O\left(ne^{-\sqrt{n}}\right)\\ &=\sum_{k=0}^{\sqrt{n}}\exp\left(-k-\frac{1}{2n}k^2+O\left(\frac{k^3}{n^2}\right)\right)+O\left(ne^{-\sqrt{n}}\right)\\ &=\sum_{k=0}^{\sqrt{n}}e^{-k}\exp\left(-\frac{1}{2n}k^2+O\left(\frac{k^3}{n^2}\right)\right)+O\left(ne^{-\sqrt{n}}\right)\\ &=\sum_{k=0}^{\sqrt{n}}e^{-k}\left(1-\frac{1}{2n}k^2+O\left(\frac{k^4}{n^2}\right)\right)+O\left(ne^{-\sqrt{n}}\right)\\ &=\sum_{k=0}^{\sqrt{n}}e^{-k}-\frac{1}{2n}\sum_{k=0}^{\sqrt{n}}k^2e^{-k}+O\left(\frac{1}{n^2}\right)\\ &=\frac{e}{e-1}-\frac{1}{2n}\frac{e(e+1)}{(e-1)^3}+O\left(\frac{1}{n^2}\right) \end{align} $$ Several steps use $$ \sum_{k=n}^\infty e^{-k}k^m=O(e^{-n}n^m) $$ which decays faster than any power of $n$.
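(A numerical sanity check of the two-term expansion above, added as a minimal Python sketch; the constants are the leading term $e/(e-1)$ and the $\tfrac1{2n}$ coefficient $e(e+1)/(e-1)^3$ from the last line.)

```python
import math

def S(n):
    # S(n) = sum_{k=0}^{n} (1 - k/n)^n, the sum being expanded above
    return sum((1 - k / n) ** n for k in range(0, n + 1))

e = math.e
leading = e / (e - 1)
coeff = e * (e + 1) / (e - 1) ** 3

for n in (100, 1000, 10000):
    two_term = leading - coeff / (2 * n)
    # the residual should shrink like 1/n^2
    print(n, S(n), two_term, S(n) - two_term)
```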

-
Why $$\sum\limits_{k=0}^n e^{-k}O\left(\frac{k^4}{n^2}\right)=O\left(\frac{1}{n^2}\right)$$ ? – Norbert Jun 28 '12 at 20:33
-
@Norbert: Because $$\sum_{k=0}^ne^{-k}O\left(\frac{k^4}{n^2}\right)\le\left(\sum_{k=0}^\infty e^{-k}k^4\right)O\left(\frac{1}{n^2}\right)$$! – robjohn Jun 28 '12 at 21:06
-
I personally think that $\ln(1-x)=-x-x^2/2+O(x^3)$ only holds for $x\prec1$, so you should break the summation into $0<k\le d(n)$ and $k>d(n)$, where $d(n)\prec n$. PS: $f(n)\prec g(n)\iff\lim_{n\to\infty}f(n)/g(n)=0$. – Yai0Phah Jun 29 '12 at 01:39
-
@FrankScience: Done. I did indeed forget the tail, which dies almost exponentially. – robjohn Jun 29 '12 at 02:33
-
This still isn't quite correct. You can't use a Taylor expansion of $\log(1-k/n)$, because the upper bound on $k/n$ does not go to 0 as $n\to\infty$. However if you split the tail at e.g. $\lfloor\sqrt n\rfloor$, this works. – Generic Human Jun 29 '12 at 02:56
-
@robjohn LOL. Omit the tail, just like the answer in Concrete Mathematics, which I misunderstood for a long time. – Yai0Phah Jun 29 '12 at 03:17
-
@FrankScience: Yes, I hadn't really read your comment since robjohn said he had addressed the problem (plus $d(n)\prec n$ looks a lot like $d(n)<n$, so it doesn't stand out as much as $d(n)=o(n)$). – Generic Human Jun 29 '12 at 03:27
-
@GenericHuman I thought about using $o(f(n))$, but eventually I used $\prec$ because having both $O$ and $o$ appear seemed ambiguous. – Yai0Phah Jun 29 '12 at 03:38
-
@GenericHuman: For $0\le x\le\frac12$, $\left|\log(1-x)+x+x^2/2\right|\le x^3$, so $\log(1-x)=-x-x^2/2+O(x^3)$. The Taylor series for $\log$ is fine. I did have to adjust the upper limit of the sum for $\exp\left(-\frac{1}{2n}k^2+O\left(\frac{k^3}{n^2}\right)\right)$ – robjohn Jun 29 '12 at 04:31
-
I really like this answer, Rob. I published a paper on this sum several years ago (see, for example, here), and I've kept my eye out for alternative ways of getting the limiting expression. Your answer here is one of my favorites. It does a really nice job of handling the error terms. – Mike Spivey Jan 09 '13 at 02:50
-
@MikeSpivey: Thanks! I am a big fan of the Euler-Maclaurin Sum Formula. For polynomials, it is exact (in fact, in this answer, I give a condition for the EMS Formula to converge). – robjohn Jan 09 '13 at 14:31
-
$$\begin{align}\sum_{k=\sqrt{n}+1}^n\exp\left(n\log\left(1-\frac{k}{n}\right)\right)&\le\overbrace{\quad\ \ n\quad\ \ \vphantom{\frac kn}}^{\le n\text{ terms}}\overbrace{\exp\left(n\left(-\frac kn\right)\right)}^{\log(1-x)\le -x}\\&\le ne^{-\sqrt{n}}\end{align}$$ since $k\ge\sqrt{n}$ – robjohn Oct 06 '22 at 15:54
Just for reference: with the aid of a fancy theorem, you can skip most of the hard analysis. As in other answers, we begin by writing
$$ \sum_{k=1}^{n} \left( \frac{k}{n}\right)^n \ \overset{k \to n-k}{=} \ \sum_{k=0}^{n-1} \left( 1 - \frac{k}{n}\right)^n \ = \ \sum_{k=0}^{\infty} \left( 1 - \frac{k}{n}\right)^n \mathbf{1}_{\{k < n\}}, $$
where $\mathbf{1}_{\{k < n\}}$ is the indicator function which takes value $1$ if $k < n$ and $0$ otherwise. Now for each $0 \leq k < n$, utilizing the inequality $\log(1-x) \leq -x$ which holds for all $x \in [0,1)$ shows that
$$ \left( 1 - \frac{k}{n}\right)^n = e^{n \log(1 - \frac{k}{n})} \leq e^{-k}. $$
Since $\sum_{k=0}^{\infty} e^{-k} < \infty$, by the dominated convergence theorem we can interchange the infinite sum and the limit:
$$ \lim_{n\to\infty} \sum_{k=1}^{n} \left( \frac{k}{n}\right)^n = \sum_{k=0}^{\infty} \lim_{n\to\infty} \left( 1 - \frac{k}{n}\right)^n \mathbf{1}_{\{k < n\}} = \sum_{k=0}^{\infty} e^{-k} = \frac{1}{1 - e^{-1}}. $$
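(A small Python illustration of the two ingredients of this argument, the dominating bound $(1-k/n)^n \le e^{-k}$ and the termwise limit $e^{-k}$; my own sketch, not part of the original answer.)

```python
import math

def term(k, n):
    # (1 - k/n)^n * 1_{k < n}
    return (1 - k / n) ** n if k < n else 0.0

# domination: every term is below the summable envelope e^{-k}
assert all(term(k, n) <= math.exp(-k) + 1e-12
           for n in (5, 50, 500) for k in range(0, 60))

# termwise convergence: for fixed k the terms approach e^{-k} as n grows
for k in range(5):
    print(k, [round(term(k, n), 6) for n in (10, 100, 10000)], round(math.exp(-k), 6))
```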

-
Maybe this is a basic question, but how did you get the first equality? – user372003 Sep 27 '17 at 05:34
-
@user372003, I added a bit of detail to my answer. But basically, I replaced the index $k$ by $n-k$. – Sangchul Lee Sep 27 '17 at 05:37
-
Hi Sangchul! I hope that you're doing well. (+1) Unsure as to why this is not the overwhelmingly most useful answer. It is clear and concise. – Mark Viola Mar 05 '24 at 16:14
Finally, I have worked through this proof. Consider the functions $$ f_n(x)=\left(1-\frac{\lfloor x\rfloor}{n}\right)^n\chi_{[0,n+1]}(x) $$ Note that $$ \int\limits_{[0,+\infty)} f_n(x)d\mu(x)=\sum\limits_{k=0}^n\int\limits_{[k,k+1)}\left(1-\frac{\lfloor x\rfloor}{n}\right)^nd\mu(x)= \sum\limits_{k=0}^n\left(1-\frac{k}{n}\right)^n $$ $$ \lim\limits_{n\to\infty}f_n(x)=\lim\limits_{n\to\infty}\left(1-\frac{\lfloor x\rfloor}{n}\right)^n\cdot \lim\limits_{n\to\infty}\chi_{[0,n+1]}(x)=e^{-\lfloor x\rfloor} $$ One may check that $\{f_n:n\in\mathbb{N}\}$ is a non-decreasing sequence of non-negative functions, so, using the monotone convergence theorem, we get $$ \lim\limits_{n\to\infty}\sum\limits_{k=0}^n\left(\frac{k}{n}\right)^n= \lim\limits_{n\to\infty}\sum\limits_{k=0}^n\left(1-\frac{k}{n}\right)^n= \lim\limits_{n\to\infty}\int\limits_{[0,+\infty)} f_n(x)d\mu(x)= $$ $$ \int\limits_{[0,+\infty)} \lim\limits_{n\to\infty}f_n(x)d\mu(x)= \int\limits_{[0,+\infty)} e^{-\lfloor x\rfloor}d\mu(x)= \sum\limits_{k=0}^\infty e^{-k}=\frac{1}{1-e^{-1}} $$
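(A minimal Python check, added for reference, of the monotonicity claim: for each fixed $k$ the values $(1-k/n)^n$, which $f_n$ takes on $[k,k+1)$, are non-decreasing in $n$ and approach $e^{-k}$.)

```python
import math

def f_value(n, k):
    # value of f_n on [k, k+1): (1 - k/n)^n for k <= n, and 0 outside the support
    return (1 - k / n) ** n if k <= n else 0.0

for k in range(6):
    values = [f_value(n, k) for n in range(k + 1, k + 60)]
    # non-decreasing in n, as required by the monotone convergence theorem
    assert all(a <= b + 1e-12 for a, b in zip(values, values[1:]))
    print(k, round(values[-1], 6), "->", round(math.exp(-k), 6))
```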

-
You must justify the interchange of the limit and the sum in the second equality. One way to do this is to note that $(1-k/m)^m<(1-k/n)^n$ if $k+1\leq m<n$ and then to invoke the Monotone Convergence Theorem. – Benji Jun 28 '12 at 10:09
-
You could just use the sequence version of the Monotone Convergence Theorem. – Benji Jun 28 '12 at 10:59
-
Could you compute the asymptotics, not just the limit? I'm very interested in such theory. – Yai0Phah Jun 28 '12 at 11:52
-
@Norbert Because yesterday I thought the Poisson summation formula is more powerful for this problem than the Euler-Maclaurin summation formula, where $\Theta_n(t)$ is a periodic function of $t$. – Yai0Phah Jun 29 '12 at 01:45
Let's notice a few things. All the terms are positive, bounded between $0$ and $1$, and there is a term that is exactly $1$. What about the next largest term?
So we ask ourselves what $\lim \limits_{n \to \infty} \left( \dfrac{n-1}{n} \right)^n$ is, and after a little calculation we see that this limit is $1/e$. The 'next' term involves $\lim \limits_{n \to \infty} \left( \dfrac{n-2}{n} \right)^n = e^{-2}$. So heuristically, we would expect the limit to be
$$1 + e^{-1} + e^{-2} + \dots = \frac{1}{1-\frac{1}{e}}$$
Working only a little harder, you can justify that this is the limit.
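(A quick numerical illustration of the heuristic, as a Python sketch added for reference: for large $n$ the $j$-th largest term is close to $e^{-j}$, and summing the geometric series gives the claimed value.)

```python
import math

n = 100000
for j in range(5):
    term = ((n - j) / n) ** n              # the (j+1)-th largest term of the sum
    print(j, round(term, 6), round(math.exp(-j), 6))

print("1 + 1/e + 1/e^2 + ... =", 1 / (1 - 1 / math.e))
```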

-
I wonder if it is possible to prove that $$\lim_{n \to +\infty} \sum_{k=1}^n \left( \frac{k^n}{n^n}-e^{-k} \right)=0.$$ – Siminore Jun 28 '12 at 08:58
-
@Siminore: Have you tried? It's really not bad at all, as long as you know the dominated convergence theorem and/or the monotone convergence theorem. In fact, that series is absolutely convergent, so you can do a whole lot of things to it – davidlowryduda Jun 28 '12 at 16:05
$\sum_{k=1}^n(k/n)^n=\sum_{0\le k<n}(1-k/n)^n$, and let $a_k(n)=(1-k/n)^n$. For $0\le k\le n^{1/3}$, we have $$\ln a_k(n)=n\ln\left(1-\frac kn\right)=-n\left(\frac kn+O\left(\frac{k^2}{n^2}\right)\right)=-k+O\left(\frac{k^2}n\right)$$ thus $$a_k(n)=e^{-k}\left(1+O\left(\frac{k^2}n\right)\right)$$ Let $b_k(n)=e^{-k}$, $c_k(n)=k^2e^{-k}/n$; then $a_k(n)=b_k(n)+O(c_k(n))$ over $0\le k\le n^{1/3}$. Thus, we have $$\sum_{0\le k<n}a_k(n)=\sum_{k\ge0}b_k(n)+O(\Sigma_a(n))+O(\Sigma_b(n))+O(\Sigma_c(n))$$ where $$\sum_{k\ge0}b_k(n)=\sum_{k\ge0}e^{-k}=\frac e{e-1}$$ and \begin{align*} \Sigma_b(n)&=\sum_{k>n^{1/3}}e^{-k}=O(e^{-n^{1/3}})\\ \Sigma_a(n)&=\sum_{n^{1/3}<k<n}\left(1-\frac kn\right)^n\le\sum_{n^{1/3}<k<n}e^{-k}=O(e^{-n^{1/3}})\\ \Sigma_c(n)&=\sum_{0\le k\le n^{1/3}}e^{-k}k^2/n\le\sum_{k\ge0}e^{-k}k^2/n=O\left(\frac 1n\right) \end{align*} Hence, we have $\sum_{0\le k<n}(1-k/n)^n=e/(e-1)+O(1/n)$.
Can anybody give a more accurate approximation? The key to the approximation is to find the asymptotics for $\sum_{k>0}\exp(-k-k^2/2n)$, like the Bell sum $\sum_{k>0}e^{-k^2/n}$.
Edit: anon pointed out that it is a theta function $\sum_ke^{-(k+t)^2/n}$, so the Fourier series works pretty well for the asymptotics: $$\Theta_n(t)=\sqrt{\pi n}\left(1+2e^{-\pi^2 n}\cos2\pi t+2e^{-4\pi^2 n}\cos4\pi t+2e^{-9\pi^2 n}\cos6\pi t+\cdots\right)$$ But I have no idea about Fourier series because I know very little about calculus!
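(For the record, a small Python check, added here as a sketch, of the quoted Fourier/Poisson-summation expansion: it compares the direct sum $\Theta_n(t)=\sum_{k\in\mathbb Z}e^{-(k+t)^2/n}$ with the series $\sqrt{\pi n}\,\bigl(1+2\sum_{m\ge1}e^{-\pi^2m^2n}\cos 2\pi mt\bigr)$ for a few small $n$, where the correction terms are still visible.)

```python
import math

def theta_direct(n, t, K=200):
    # Theta_n(t) = sum over integers k of exp(-(k + t)^2 / n), truncated at |k| <= K
    return sum(math.exp(-((k + t) ** 2) / n) for k in range(-K, K + 1))

def theta_fourier(n, t, M=10):
    # sqrt(pi*n) * (1 + 2 * sum_{m>=1} exp(-pi^2 m^2 n) cos(2 pi m t))
    s = 1.0 + 2.0 * sum(math.exp(-math.pi ** 2 * m * m * n) * math.cos(2 * math.pi * m * t)
                        for m in range(1, M + 1))
    return math.sqrt(math.pi * n) * s

for n in (0.5, 1.0, 2.0):
    for t in (0.0, 0.25, 0.5):
        print(n, t, theta_direct(n, t), theta_fourier(n, t))
```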

-
In my answer, I show how to get more accuracy by including more terms in the series for $\log$ and $\exp$. – robjohn Jun 28 '12 at 18:06
First, we can rewrite the formula:
$$\frac{\sum_{i=1}^{n}i^n}{n^n} = \sum_{i=1}^{n}\frac{i^n}{n^n} = \sum_{i=0}^{n-1}\frac{(n-i)^n}{n^n} = \sum_{i=0}^{n-1}\left(1-\frac{i}{n} \right)^n$$
We can easily prove the following inequality relations:
$$\frac{1}{\text{e}^i} - \frac{i^2}{n\text{e}^i} \le \frac{1}{\text{e}^i}\left(1-\frac{i}{n} \right)^i\le \left(1-\frac{i}{n} \right)^n \le \frac{1}{\text{e}^i} $$
Therefore, we can get
$$\sum_{i=0}^{n-1}\frac{1}{\text{e}^i}-\frac{1}{n}\sum_{i=0}^{n-1}\frac{i^2}{\text{e}^i}\le\sum_{i=0}^{n-1}\left(1-\frac{i}{n} \right)^n \le \sum_{i=0}^{n-1}\frac{1}{\text{e}^i}$$
So
$$\lim_{n\to \infty}\sum_{i=0}^{n-1}\frac{1}{\text{e}^i}-\lim_{n\to \infty}\frac{1}{n}\sum_{i=0}^{n-1}\frac{i^2}{\text{e}^i}\le\lim_{n\to \infty}\sum_{i=0}^{n-1}\left(1-\frac{i}{n} \right)^n \le \lim_{n\to \infty}\sum_{i=0}^{n-1}\frac{1}{\text{e}^i}$$
We can easily know that
$$\lim_{n\to \infty}\sum_{i=0}^{n-1}\frac{1}{\text{e}^i} = \frac{\text{e}}{\text{e}-1}$$
$$\lim_{n\to \infty}\frac{1}{n}\sum_{i=0}^{n-1}\frac{i^2}{\text{e}^i} = \left(\lim_{n\to \infty}\frac{1}{n}\right)\cdot \left(\lim_{n\to \infty}\sum_{i=0}^{n-1}\frac{i^2}{\text{e}^i}\right) = 0 \cdot \frac{\text{e}(\text{e}+1)}{(\text{e}-1)^3} = 0$$
Finally, we can get the answer: $$\lim_{n\to \infty} \frac{\sum_{i=1}^{n}i^n}{n^n} = \lim_{n\to \infty}\sum_{i=0}^{n-1}\left(1-\frac{i}{n} \right)^n = \frac{\text{e}}{\text{e}-1}$$
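(A brief Python check of the sandwich above, added as a sketch: for each $n$ the middle sum sits between the two bounding sums, and all three approach $\mathrm e/(\mathrm e-1)$.)

```python
import math

def middle(n):
    return sum((1 - i / n) ** n for i in range(0, n))

def upper(n):
    return sum(math.exp(-i) for i in range(0, n))

def lower(n):
    return upper(n) - sum(i * i * math.exp(-i) for i in range(0, n)) / n

for n in (10, 100, 1000):
    lo, mid, hi = lower(n), middle(n), upper(n)
    assert lo <= mid <= hi
    print(n, lo, mid, hi)

print("e/(e-1) =", math.e / (math.e - 1))
```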
For more details, you can watch my video on Bilibili, if you understand Chinese.
Here is my video link: https://www.bilibili.com/video/BV1yG411g7K1/
