
I'm practicing complexity analysis for a problem and came across:

$$\sum_{i = 1}^{n} {\log(i!)}$$

I know that $\log(i!) = \Theta(i\log(i))$, so I further simplified the sum to:

$$\sum_{i = 1}^{n} {i\log(i)}$$

Breaking this out gives me:

$$1\log(1) + 2\log(2) + 3\log(3) + \dots + n\log(n)$$

Since we have $1 + 2 + 3 + \dots + n$ terms, I believe there will be $\frac{n(n+1)}{2}$ terms, thus simplifying the sum to:

$$\frac{n(n+1)}{2}\sum_{i = 1}^{n} {\log(i)}$$

and from my research $$\sum_{i = 1}^{n} {\log(i)} = \log(n!)$$

thus giving me the sum total:

$$\frac{n(n+1)}{2}\log(n!) = O(n^2\log(n!))$$

However, given my other question on SO, it appears this calculation is incorrect?

Could someone clear this up?
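
For concreteness, here is a quick numerical comparison (a rough Python sketch, not part of my derivation; it uses `math.lgamma(i + 1)` as a stand-in for $\log(i!)$, with natural logs throughout, which shouldn't affect the comparison):

```python
import math

def sum_log_factorials(n):
    # Exact value of sum_{i=1}^{n} log(i!), using lgamma(i + 1) = log(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

def claimed_bound(n):
    # The expression I derived: n(n+1)/2 * log(n!)
    return n * (n + 1) / 2 * math.lgamma(n + 1)

for n in (10, 100, 1000):
    exact = sum_log_factorials(n)
    print(n, exact, claimed_bound(n), claimed_bound(n) / exact)
```

The last column (claimed over exact) seems to grow roughly like $n$, which is part of what makes me doubt the calculation.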

  • @AOrtiz my simplification logic comes from this post http://stackoverflow.com/questions/2095395/is-logn-%CE%98n-logn – Dominic Farolino Aug 23 '16 at 19:05
  • Which makes me think that sum log(i!) = sum i*log(i) or at least is asymptotically equiv – Dominic Farolino Aug 23 '16 at 19:05
  • $\sum i \log i$ is NOT equal to $\sum i \cdot \sum \log i$. – Alex M. Aug 23 '16 at 19:43
  • @Dom Farolino: re your original approximation. See the footnote at the end of my answer (edited to add it). – Clement C. Aug 23 '16 at 19:44
  • @AlexM. yeah for some reason I was thinking it was...then I proved that it wasn't by showing $\sum_{i=1}^{n} {i} \cdot \sum_{i=1}^{n} {i} \neq \frac{n(n+1)(2n+1)}{2}$ – Dominic Farolino Aug 23 '16 at 21:56
  • https://en.wikipedia.org/wiki/Barnes_G-function $$\begin{align} \log G(z+1) &= \frac{1}{12}-\log A+\frac{z}{2}\log 2\pi+\left(\frac{z^2}{2} -\frac{1}{12}\right)\log z \\ &\quad-\frac{3z^2}{4}+\sum_{k=1}^{N}\frac{B_{2k + 2}}{4k\left(k + 1\right)z^{2k}}~+~O\left(\frac{1}{z^{2N + 2}}\right).\end{align}$$ – Count Iblis Aug 24 '16 at 00:18

3 Answers


Your reasoning goes awry when you say "I believe there will be $\frac{n(n+1)}{2}$ terms" (actually even before, as we will see). You only have $n$ terms, and you cannot take that number "outside" the sum.

The $i$th term is $i\log i$; it is not $\log i$ or anything else. At that point, there is no "simplifying and taking outside the sum" to do: you have to deal with the summands as they are.


Now, for the original question, and more on your approximation $\log(i!) = \Theta(i\log i)$: you would have to justify it. As stated, it is rather handwavy, and it is better to make it precise.

You have $$ \sum_{i=1}^n \ln(i!) = \sum_{i=1}^n \sum_{j=1}^i \ln j = \sum_{j=1}^n \sum_{i=j}^n \ln j = \sum_{j=1}^n (n-j+1) \ln j = (n+1)\sum_{j=1}^n \ln j - \sum_{j=1}^n j\ln j \tag{1} $$ which is not at first glance the same thing...$^{(\dagger)}$ Let's deal with each term separately.

  • As we know (see for instance this other question), $ \sum\limits_{j=1}^n \ln j = n \ln n + O(n) $, so the first term is $$ (n+1)\sum_{j=1}^n \ln j = n^2\ln n + O(n^2). \tag{2} $$

  • By a series/integral comparison (since $f$ defined by $f(x) = x\ln x$ is monotone and nice; see again this other question for more details on the method if you are not familiar with it), it is not hard to show that $$ \sum_{j=1}^n j \ln j = \int_1^n x\ln x \,dx + O(n\ln n) = \frac{1}{2}n^2 \ln n + O(n^2). \tag{3} $$

Combining (2) and (3) into (1), we get $$ \sum_{i=1}^n \ln(i!) = n^2\ln n - \frac{1}{2}n^2 \ln n + O(n^2) = \frac{1}{2}n^2 \ln n + O(n^2). $$
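
(For a quick numerical sanity check of this estimate, here is a minimal sketch; it is not part of the argument, and it simply uses Python's `math.lgamma(i + 1)` to evaluate $\ln(i!)$.)

```python
import math

def S(n):
    # S(n) = sum_{i=1}^{n} ln(i!), using lgamma(i + 1) = ln(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

for n in (100, 1000, 10000):
    main_term = 0.5 * n * n * math.log(n)
    # The estimate says S(n) - (1/2) n^2 ln n = O(n^2), so this quotient should stay bounded.
    print(n, (S(n) - main_term) / n**2)
```

The quotient indeed stays bounded (numerically it appears to settle around $-3/4$), consistent with the $O(n^2)$ term.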


$(\dagger)$ Actually, your original approximation is true, because of the theorem below. However, you should cite or justify such approximations before using them; if you do not know how or why they hold, you will very likely end up making mistakes.

Theorem. Let $(a_n)_n, (b_n)_n$ be two non-negative sequences such that $a_n\sim_{n\to\infty} b_n$. Then the series $\sum\limits_n a_n$ converges iff the series $\sum\limits_n b_n$ converges; moreover:

  • If the series converge, then $\sum\limits_{n=N}^\infty a_n \sim_{N\to\infty} \sum\limits_{n=N}^\infty b_n$ (the remainders are equivalent);
  • If the series diverge, then $\sum\limits_{n=1}^N a_n \sim_{N\to\infty} \sum\limits_{n=1}^N b_n$ (the partial sums are equivalent).
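
As an illustration of the divergent case with the sums at hand, take $a_i = \ln(i!)$ and $b_i = i\ln i$, which satisfy $a_i \sim b_i$. A small numerical sketch (again relying on Python's `math.lgamma` only for convenience):

```python
import math

def partial_sum_a(N):
    # sum_{i=1}^{N} ln(i!), using lgamma(i + 1) = ln(i!)
    return sum(math.lgamma(i + 1) for i in range(1, N + 1))

def partial_sum_b(N):
    # sum_{i=1}^{N} i * ln(i); the i = 1 term vanishes
    return sum(i * math.log(i) for i in range(2, N + 1))

for N in (10**2, 10**4, 10**6):
    # The theorem predicts this ratio tends to 1 as N grows.
    print(N, partial_sum_a(N) / partial_sum_b(N))
```

The ratio does tend to $1$, though slowly (the relative error decays roughly like $1/\ln N$), which is exactly the equivalence of partial sums in the divergent case.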
Clement C.
  • Do you happen to know where I could find a proof for the theorem at the end of your answer? Hopefully it would help me to solve https://math.stackexchange.com/q/2603506/24547. – Julius Jan 13 '18 at 14:08
  • @Julius It's a rather standard theorem; off the top of my head and early in the morning, I could find a proof in French (see proof of Theorem 6 here), I'm sure finding a proof in English or parsing this one shouldn't be too hard. – Clement C. Jan 13 '18 at 15:23

Your estimate is wrong: there is no reason to go from $$\sum_{i=1}^n i \log i$$ to $$\frac{n(n+1)}{2}\sum_{i=1}^n \log i,$$ since these two functions are not asymptotically equivalent. A good idea is the following: approximate the sum with an integral, $$\sum_{i=1}^n i \log i = \int_1^n x \log x \ \mathrm{d}x + O(n \log n).$$ The integral can be computed exactly by integrating by parts, $$\int_1^n x \log x \ \mathrm{d}x = \frac{1}{2} n^2 \log n -\frac{1}{4} (n^2-1), $$ so the final estimate is $$\sum_{i=1}^n i \log i \sim \frac{1}{2} n^2 \log n = O( n^2 \log n).$$
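
If you want to check this numerically, here is a short sketch (plain Python, nothing beyond the standard library; it is only a sanity check of the two displayed estimates):

```python
import math

def sum_i_log_i(n):
    # sum_{i=1}^{n} i * log(i); the i = 1 term is zero
    return sum(i * math.log(i) for i in range(2, n + 1))

def integral(n):
    # integral_1^n x log x dx = (1/2) n^2 log n - (1/4)(n^2 - 1)
    return 0.5 * n * n * math.log(n) - 0.25 * (n * n - 1)

for n in (100, 1000, 10000):
    diff = sum_i_log_i(n) - integral(n)
    # The claim is that the difference is O(n log n), so this quotient should stay bounded.
    print(n, diff / (n * math.log(n)))
```

The quotient stays bounded (numerically it hovers around $1/2$), consistent with the $O(n\log n)$ error term.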

Crostul
  • I would never have guessed to do an integral on my own, but I guess I don't have enough experience with these yet... I believe you that I made an illegal transformation, since the two functions are not asymptotically equivalent, but I'm trying to figure out why exactly they are not. It seems there would be $\frac{n(n+1)}{2}$ total terms; can I not pull the $i$ out as $\frac{n(n+1)}{2}$ because not all of the terms are equivalent (e.g. some are $\log(3)$, some are $\log(4)$, some are $\log(n-1)$, and so on)? – Dominic Farolino Aug 23 '16 at 19:32
  • @DomFarolino You are basically trying to write $\sum_i a_i b_i = \sum_i a_i \sum_i b_i$. This is not true in general (or even most of the time)! Look at one extreme example: $$n = \sum_{i=1}^n 1 = \sum_{i=1}^n \frac{i}{i} \neq \sum_{i=1}^n \frac{1}{i}\sum_{i=1}^n i = H_n\frac{n(n+1)}{2} \sim_{n\to\infty} \frac{1}{2}n^2\ln n.$$ – Clement C. Aug 23 '16 at 19:49

$$\sum_{i=1}^{n}\log(i!)=\sum_{i=1}^{n}\sum_{j=1}^{i}\log(j) = \sum_{j=1}^{n}(n+1-j)\log(j)=\sum_{k=1}^{n}k\log(n+1-k) \tag{1}$$ is straightforward to study through Abel's summation formula:

$$\sum_{k=1}^{n}k\log(n+1-k) = \int_{1}^{n}\frac{\lfloor x\rfloor(\lfloor x\rfloor+1)}{2(n+1-x)}\,dx \tag{2} $$ from which $$\boxed{ \sum_{i=1}^{n}\log(i!) = \frac{n^2}{2}\log(n)-\frac{3n^2}{4}+O(n\log n)}\tag{3}$$ easily follows.
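
A quick numerical check of $(3)$ (a sketch only; it uses Python's `math.lgamma(i + 1)` to evaluate $\log(i!)$, with natural logarithms throughout):

```python
import math

def S(n):
    # S(n) = sum_{i=1}^{n} log(i!), using lgamma(i + 1) = log(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

for n in (100, 1000, 10000):
    main = 0.5 * n * n * math.log(n) - 0.75 * n * n
    # (3) claims S(n) - main is O(n log n), so this quotient should stay bounded.
    print(n, (S(n) - main) / (n * math.log(n)))
```

The quotient stays bounded (close to $1$ for these values), as the $O(n\log n)$ error term predicts.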

Jack D'Aurizio