5

The sum is:

$$ S = 1 + \frac{1}{2} + \frac{(n-1)(n-2)}{3n^2} + \frac{(n-1)(n-2)(n-3)}{4n^3} + \ldots + \frac{(n-1)!}{n \times n^{n-1}} = \frac{3}{2} + \sum_{k=3}^{n} \frac{n!}{k(n-k)!n^k} $$

Can we get an asymptotic lower bound of $S$?

I guess it's $\Omega(\frac{1}{2} \log n)$, but I'm not sure how to get it. $k$ starts from 3 because I'm actually computing the expected number of cycles in a graph, and at least 3 nodes are needed to form a cycle. But since what I need is an asymptotic lower bound, I guess where $k$ starts doesn't really matter.
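As a quick sanity check on the guess, the sum can be evaluated directly and compared against $\frac{1}{2}\log n$ (a Python sketch added for illustration, not part of the original post):

```python
import math

def S(n):
    """S(n) = 3/2 + sum_{k=3}^{n} n!/(k*(n-k)!*n^k), via a running product."""
    total = 1.5                           # the 1 + 1/2 terms
    prod = (1 - 1/n) * (1 - 2/n)          # prod_{j=1}^{k-1} (1 - j/n) at k = 3
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n
        if prod < 1e-18:                  # remaining terms are negligible
            break
    return total

for n in (10**3, 10**5, 10**7):
    print(n, S(n) - 0.5 * math.log(n))    # the difference appears to level off
```

The difference $S(n) - \frac{1}{2}\log n$ settles toward a constant, which is consistent with the guess.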

  • The first two terms don't seem to fit a simple pattern, and why are there two $(n-2)$ factors in the second numerator? Is the general pattern supposed to be something like $S(n)=\sum_{k=1}^{n} \frac{n!}{k(n-k)!n^k}$? – hmakholm left over Monica Oct 10 '11 at 02:19
  • Why isn't the second term $(n-1)/(2n)$? Should the fourth term have two $(n-2)$'s or is the second of these supposed to be $(n-3)$? Lastly, I think one should write $S(n)$ since it's a function of $n$ (and they aren't exactly partial sums because the terms themselves vary with $n$). – anon Oct 10 '11 at 02:20
  • If this is $S(n) =\sum_{k=1}^{n-1} \frac{n!}{(n-k)!k n^k}$, the second term should be $\frac{n-1}{2n}$, not $\frac{1}{2}$, and the fourth should be $\frac{(n-1)(n-2)(n-3)}{4n^4}$. – Robert Israel Oct 10 '11 at 02:36
  • The two $(n-2)$'s were a typo; I just fixed it. – NonalcoholicBeer Oct 10 '11 at 02:40

2 Answers

5

For a complete asymptotic formula, up to $o(1)$, I get $$S(n) = \frac{1}{2} \log n + \frac{\gamma + \log 2}{2} + o(1).$$

Your sum $S(n)$ is very close to the sum $Q(n) = 1 + \frac{n-1}{n} + \frac{(n-1)(n-2)}{n^2} + \cdots = \sum_{k \geq 1} \frac{n^{\underline{k}}}{n^k}$ considered in Problem 9.56 in Concrete Mathematics and elsewhere in Knuth's work, so I've adapted some of the arguments I found there.

Let's consider the sum $S'(n) = \sum_{k \geq 1} \frac{n^{\underline{k}}}{k n^k} = S(n) - \frac{1}{2n} = S(n) + o(1).$ In the answer to Problem 9.56 in Concrete Mathematics the authors indicate that Stirling's approximation can be used to show that if $k \leq n^{1/2+\epsilon}$ then

$$\frac{n^{\underline{k}}}{k n^k} = e^{-k^2/2n} \left(\frac{1}{k} + \frac{1}{2n} - \frac{2}{3} \frac{k^2}{(2n)^2} + O(n^{-1+4 \epsilon})\right).$$ Then, Knuth and Pittel, in "A Recurrence Related to Trees," (Proceedings of the AMS 105(2) 1989, pp. 335-349) indicate this means that $\frac{n^{\underline{k}}}{k \, n^k}$ is exponentially small when $k \geq n^{1/2+\epsilon}$ and so can be replaced with other exponentially small terms to get $$S'(n) = T_{2n}(-1) + \left(\frac{1}{2n} + O(n^{-1 + 4 \epsilon})\right) T_{2n}(0) - \frac{1}{6n^2} T_{2n}(2),$$ where $T_n(x) = \sum_{k \geq 1} k^x e^{-k^2/n}$.

Lemma 1 in the Knuth and Pittel paper then states that if $x > -1$ then $$T_n(x) = \frac{1}{2} \Gamma\left(\frac{x+1}{2}\right) n^{(x+1)/2} + O(1).$$ They also mention that a derivation of $$T_n(-1) = \frac{1}{2} \log n + \frac{\gamma}{2} + O(n^{-1})$$ is in Knuth's Art of Computer Programming, Vol. 3, Exercise 5.2.2-4, as part of the analysis of bubblesort.
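Both Lemma 1 and the bubblesort estimate for $T_n(-1)$ are easy to check numerically by truncating the series (a Python sketch; the truncation heuristic is my own):

```python
import math

def T(n, x, tol=1e-18):
    """T_n(x) = sum_{k>=1} k^x * exp(-k^2/n), truncated once terms are tiny."""
    total, k = 0.0, 1
    while True:
        term = k**x * math.exp(-k * k / n)
        total += term
        if k * k > n and term < tol:      # the tail is exponentially small
            break
        k += 1
    return total

gamma = 0.5772156649015329                # Euler-Mascheroni constant
n = 10**5
print(T(n, -1), 0.5 * math.log(n) + gamma / 2)        # bubblesort estimate
print(T(n, 0), 0.5 * math.gamma(0.5) * math.sqrt(n))  # Lemma 1 with x = 0
```

The two printed pairs agree to within the stated $O(n^{-1})$ and $O(1)$ error terms.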

Putting this all together gives us $S(n) = \frac{1}{2} \log (2n) + \frac{\gamma}{2} + o(1) = \frac{1}{2} \log n + \frac{\gamma + \log 2}{2} + o(1).$
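The constant can be checked by evaluating the sum directly (a Python sketch I added; not part of the original answer):

```python
import math

def S(n):
    """S(n) = 3/2 + sum_{k=3}^{n} n!/(k*(n-k)!*n^k), via a running product."""
    total, prod = 1.5, (1 - 1/n) * (1 - 2/n)
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n
        if prod < 1e-18:                  # the tail is negligible
            break
    return total

gamma = 0.5772156649015329                # Euler-Mascheroni constant
for n in (10**3, 10**5, 10**7):
    asym = 0.5 * math.log(n) + (gamma + math.log(2)) / 2
    print(n, S(n) - asym)                 # shrinks toward 0
```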

For more on $Q(n)$ and related functions and their asymptotics, see The Art of Computer Programming, Vol. 1 (3rd ed.), Section 1.2.11.3.

Mike Spivey
  • 55,550
  • I did a computer simulation, and $S(n)$ indeed grows like $\frac{1}{2}\log n$. But I'm afraid your proof is difficult for me to understand. Are there any simpler proofs? – NonalcoholicBeer Oct 12 '11 at 23:57
  • @ablmf: Unfortunately, I don't know of a simpler one. I see that you have asked for a simpler proof as a separate question, though, so maybe somebody else on the site can think of one. However, if Knuth and Pittel felt the need to go through work like this to solve a similar problem, my guess is that finding a simpler proof won't be easy. – Mike Spivey Oct 13 '11 at 01:42
3

The term $a_k(n) = \frac{1}{k} \prod_{j=1}^{k-1} (1 - j/n)$ satisfies $a_k(n) < 1/k$, so $S(n) < 3/2 + \sum_{k=3}^n 1/k \approx \ln(n)$. On the other hand, take any $p$ with $0 < p < 1/2$. For $k < n^p$ we have $\prod_{j=1}^{k-1} (1 - j/n) > (1 - n^{p-1})^{n^{p}}$, which has limit $1$ as $n \to \infty$. So for any $\epsilon > 0$, if $n$ is sufficiently large we have $$S(n) > \sum_{k=3}^{n^{p}} \frac{1-\epsilon}{k} \approx (1-\epsilon) \ln(n^p) = (1-\epsilon) p \ln(n).$$ In particular $S(n) = \Omega(\log n)$, which together with the upper bound gives $S(n) = \Theta(\log n)$.
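The sandwich $(1-\epsilon)\, p \ln n < S(n) < 3/2 + \sum_{k=3}^n 1/k$ can be illustrated numerically (a Python sketch with my own choice of $n$, $p$, $\epsilon$):

```python
import math

def S(n):
    """S(n) = 3/2 + sum_{k=3}^{n} a_k(n), a_k(n) = (1/k) prod_{j<k} (1 - j/n)."""
    total, prod = 1.5, (1 - 1/n) * (1 - 2/n)
    for k in range(3, n + 1):
        total += prod / k
        prod *= 1 - k / n
    return total

n, p, eps = 10**5, 0.4, 0.1
upper = 1.5 + sum(1/k for k in range(3, n + 1))
lower = (1 - eps) * sum(1/k for k in range(3, int(n**p) + 1))
print(lower, S(n), upper)                 # lower < S(n) < upper
```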

Robert Israel
  • 448,999