Are there really good upper and lower bounds for $\binom{n}{cn}$ when $c$ is a constant with $0 < c < 1$? I know that $\left(\frac{1}{c}\right)^{cn} \leq \binom{n}{cn} \leq \left(\frac{e}{c}\right)^{cn}$.
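For reference, here is a quick numerical spot check of these elementary bounds (a minimal Python sketch; the pairs $(n, cn)$ below are arbitrary examples, nothing special):

```python
# Spot check of (1/c)^{cn} <= C(n, cn) <= (e/c)^{cn} for a few example pairs.
from math import comb, e

for n, k in [(10, 2), (60, 15), (300, 100)]:
    c = k / n
    assert (1 / c)**k <= comb(n, k) <= (e / c)**k
print("elementary bounds hold for these examples")
```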
3 Answers
Using Stirling's formula, $$ \begin{align}{n\choose cn}&\approx\frac{n^n e^{-n}\sqrt{2\pi n}}{(cn)^{cn}e^{-cn}\sqrt{2\pi cn}\cdot ((1-c)n)^{(1-c)n}e^{-(1-c)n}\sqrt{2\pi (1-c)n}}\\&=\frac{1}{c^{cn}(1-c)^{(1-c)n}\sqrt{2\pi c(1-c)n}}.\end{align}$$ You can turn $\approx$ into good upper and lower bounds by filling in the details of the error term in Stirling's formula.
Edit: For every $n\geqslant1$ and every $c$ in $(0,1)$ such that $cn$ is an integer, $$ \frac{2\pi/\mathrm e^2}{c^{cn}(1-c)^{(1-c)n}\sqrt{2\pi c(1-c)n}}\leqslant{n\choose cn}\leqslant\frac{\mathrm e/\sqrt{2\pi}}{c^{cn}(1-c)^{(1-c)n}\sqrt{2\pi c(1-c)n}}. $$ The ratio between the upper and lower bounds is $\mathrm e^3/(2\pi)^{3/2}\lt1.28$.
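A minimal numerical sketch of the central estimate and the bounds in the Edit (the helper name `stirling_estimate` and the test values are illustrative choices only, not part of the answer):

```python
# Compare C(n, cn) with the central estimate and the absolute bounds above.
from math import comb, sqrt, pi, e

def stirling_estimate(n, k):
    """Central estimate 1 / (c^{cn} (1-c)^{(1-c)n} sqrt(2 pi c (1-c) n)) with c = k/n."""
    c = k / n
    return 1.0 / (c**k * (1 - c)**(n - k) * sqrt(2 * pi * c * (1 - c) * n))

for n, k in [(10, 3), (50, 25), (200, 40)]:
    est = stirling_estimate(n, k)
    lower = (2 * pi / e**2) * est
    upper = (e / sqrt(2 * pi)) * est
    exact = comb(n, k)
    assert lower <= exact <= upper
    print(f"n={n}, cn={k}: lower={lower:.4g}, exact={exact:.4g}, upper={upper:.4g}")
```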

- The page @Hagen linked to explains that the ratio between any factorial and its Stirling equivalent is always between $1$ and $a=\mathrm e/\sqrt{2\pi}\lt1.09$. Since a binomial coefficient is the ratio of one factorial to the product of two others, multiplying the equivalent by $1/a^2$ and $a$ yields rigorous lower and upper bounds. Better estimates are also on the same WP page (did you read it?). – Did Sep 18 '13 at 11:01
- The link is in your comment. I did not suggest using convergent series, only the rigorous bounds, valid for every $n$, which are on the WP page. – Did Sep 18 '13 at 11:22
- @Hagen I took the liberty to edit your answer, adding an expansion of the hint you give at the end, since this seems to be what the OP is after. Of course, if you object, please revert to the previous version (and accept my apologies). – Did Sep 18 '13 at 11:39
- @Anush You are welcome. Whether the absolute bounds in the Edit or the moving bounds based on the $\exp(1+1/(12n))$ factor explained on the WP page are best, really depends on the setting. – Did Sep 18 '13 at 13:56
- @Did: I don't see any link in Hagen's answer. I assume you refer to the link in your comment. – robjohn Sep 18 '13 at 22:42
- @robjohn This phrase is answering a now-deleted comment, where the link appeared. As a mod, do you see deleted comments? – Did Sep 19 '13 at 05:25
Stirling's Asymptotic Expansion, derived here, is $$ n!=\sqrt{2\pi n}\,n^ne^{-n}\left(1+\frac1{12n}+\frac1{288n^2}-\frac{139}{51840n^3}-\frac{571}{2488320n^4}+O\left(\frac1{n^5}\right)\right) $$ From which we get $$ \begin{align} &\frac{n!}{(cn)!((1-c)n)!}\\[6pt] &=\frac{\left(c^c(1-c)^{1-c}\right)^{-n}}{\sqrt{2\pi c(1-c)n}}\small\left(1-\frac{1-c+c^2}{12c(1-c)}\frac1n+\frac{1-2c+3c^2-2c^3+c^4}{288c^2(1-c)^2}\frac1{n^2}+O\left(\frac1{n^3}\right)\right) \end{align} $$
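As a rough numerical check of this expansion (a sketch only; the fixed choice $c=1/4$ and the values of $n$ are arbitrary), the relative error of the two-term approximation should shrink like $1/n^3$:

```python
# Compare C(n, cn) with the two-term expansion quoted above, for c = 1/4.
from math import comb, sqrt, pi

c = 0.25
for n in [8, 40, 200, 1000]:
    k = int(c * n)                     # c*n is an integer for these n
    main = (c**c * (1 - c)**(1 - c))**(-n) / sqrt(2 * pi * c * (1 - c) * n)
    corr = (1
            - (1 - c + c**2) / (12 * c * (1 - c)) / n
            + (1 - 2*c + 3*c**2 - 2*c**3 + c**4) / (288 * c**2 * (1 - c)**2) / n**2)
    print(f"n={n}: relative error = {abs(main * corr / comb(n, k) - 1):.2e}")
```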
Absolute Bounds
For $n\ge1$, $$ \sqrt{2\pi n}\,n^ne^{-n}\left(1+\frac1{12n}\right)\le n!\le\sqrt{2\pi n}\,n^ne^{-n}\left(1+\frac1{12n}+\frac1{288n^2}\right) $$ which gives $$ \sqrt{2\pi n}\,n^ne^{-n}\le n!\le\frac{313}{288}\sqrt{2\pi n}\,n^ne^{-n} $$ From which we get, for $1\le cn\le n-1$, $$ \left(\frac{288}{313}\right)^2\frac{\left(c^c(1-c)^{1-c}\right)^{-n}}{\sqrt{2\pi c(1-c)n}} \le\binom{n}{cn} \le\frac{313}{288}\frac{\left(c^c(1-c)^{1-c}\right)^{-n}}{\sqrt{2\pi c(1-c)n}} $$ Note that $$ \frac{e}{\sqrt{2\pi}}\doteq1.0844\lt1.0868\doteq\frac{313}{288} $$ so this is close to Hagen von Eitzen's answer.
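A small sketch spot-checking both the factorial bounds and the resulting sandwich for $\binom{n}{cn}$ (the test values are arbitrary; this is an illustration, not a proof):

```python
# Spot check the n! bounds and the 288/313 sandwich for C(n, cn).
from math import factorial, comb, sqrt, pi, e

def stirling(n):
    return sqrt(2 * pi * n) * n**n * e**(-n)

# factorial bounds for small n
for n in range(1, 20):
    assert stirling(n) * (1 + 1/(12*n)) <= factorial(n) <= stirling(n) * (1 + 1/(12*n) + 1/(288*n**2))

# binomial sandwich
for n, k in [(6, 1), (30, 9), (120, 60)]:
    c = k / n
    est = (c**c * (1 - c)**(1 - c))**(-n) / sqrt(2 * pi * c * (1 - c) * n)
    assert (288/313)**2 * est <= comb(n, k) <= (313/288) * est
print("all spot checks passed")
```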
The bound mentioned in Hagen's answer is based on the fact that the ratio $$ \frac{n!}{\sqrt{2\pi n}\,n^ne^{-n}} $$ is decreasing in $n$. Thus, the greatest ratio is attained at $n=1$ and therefore, $$ \sqrt{2\pi n}\,n^ne^{-n}\le n!\le\frac{e}{\sqrt{2\pi}}\sqrt{2\pi n}\,n^ne^{-n} $$
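A tiny numerical illustration of that monotonicity (a sketch; the range of $n$ is arbitrary):

```python
# The ratio n! / (sqrt(2 pi n) n^n e^{-n}) decreases in n; its maximum over
# n >= 1 is therefore at n = 1, where it equals e / sqrt(2 pi) ~ 1.0844.
from math import factorial, sqrt, pi, e

ratios = [factorial(n) / (sqrt(2 * pi * n) * n**n * e**(-n)) for n in range(1, 30)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))   # strictly decreasing here
print(ratios[0], e / sqrt(2 * pi))                      # both are about 1.0844
```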
- This probably uses the postfactor $1+c/(12n)+\cdots$ instead of $1+1/(12cn)+\cdots$ for $(cn)!$ (and similarly for $((1-c)n)!$). – Did Sep 18 '13 at 11:20
- I love the upvotes... By the way, using asymptotic expansions seems off-topic if the OP is interested in non-asymptotic bounds. – Did Sep 18 '13 at 11:22
- Thanks for noticing the error. However, as for the asymptotic expansion being off-topic, I did not notice that the OP was not interested in asymptotic bounds. I was simply supplying what "error terms" I could. – robjohn Sep 18 '13 at 11:35
- That looked to be in response to a pretty obscure answer, since $H(c)$ is not defined. I appreciate your comments as to how others feel about my answer. – robjohn Sep 18 '13 at 12:02
- @ArashBeh: Do you mean this entropy function? If so, then $$ \begin{align} e^{nH(c)} &=e^{-nc\log_2(c)-n(1-c)\log_2(1-c)}\\ &=\left(c^c(1-c)^{1-c}\right)^{-n/\log(2)} \end{align} $$ which is wrong by a factor of $\log(2)$ in the exponent. Supposing that we use $\log_e$ instead of $\log_2$ in the definition of $H$, then there is a missing factor of $\sqrt{2\pi c(1-c)n}$. However, it is interesting that $e^{nH(c)\log(2)}$ is as close as it is for such a simple expression. – robjohn Sep 18 '13 at 12:28
- @ArashBeh sorry, I was in the process of writing the last comment when you posted your response. Ignore the part before using $\log_e$. – robjohn Sep 18 '13 at 12:30
- @robjohn, such bounds can be obtained from an information-theoretic perspective and can sometimes be very interesting. – Arash Sep 18 '13 at 12:51
- "I appreciate your comments as to how others feel about my answer"... What do you mean? I do not understand. – Did Sep 18 '13 at 13:53
- Yes, and? A new example that at least some users do not really look at the answers (but just check it is more or less written in mathematese, and, probably, have a look at the author's name). No big deal. – Did Sep 18 '13 at 15:41
- @Did: Hagen's answer did not even mention any bound; he only mentioned Stirling's Formula. I see that you added the actual bounds to that answer after my answer (and Hagen had 7 votes). I see no need to belittle my votes in this case. – robjohn Sep 18 '13 at 22:53
- No. Hagen explicitly mentioned and precisely described, from the start, how to get rigorous bounds using the error term in Stirling's formula; all I did was to write down his directions. Anyway, when I wrote the side remark about upvotes, your answer was wrong and got 2. That makes two reasons why I am surprised to see you nitpick at the remark. A third is that it obviously addressed the general phenomenon of uninformed upvotes on MSE. Believe it or not, I do not care about the upvotes this specific answer receives. (But I do resent being forced to explain such trivialities.) – Did Sep 19 '13 at 05:22
- @Did: He said "error term", which is not clear; there are several error terms for Stirling's approximation, one of which I incorporated (incorrectly, but then corrected) in my answer. Once it was pointed out that the OP clarified what he wanted in a comment, I used my error term to get a bound $\frac14\%$ looser than yours. Don't get me wrong, I appreciate the notice that I had used $\frac cn$ instead of $\frac1{cn}$. However, I was suspicious of the formula soon after I posted (it seemed too nice), so I was checking it. – robjohn Sep 19 '13 at 07:00
- Wrong again, on several counts. (1) "Error term" is crystal clear. (2) The OP "clarified" much earlier than you posted. Yeah, deleted comments. // Enough said. – Did Sep 19 '13 at 07:13
- @Did: (1) Obviously you think so, but when there are more than one, I don't think so. (2) I didn't say that the OP did not clarify earlier than I posted. I mentioned earlier that I had not read that comment since it "looked to be in response to a pretty obscure answer". – robjohn Sep 19 '13 at 07:44
Hint: as $n$ goes to infinity, you can approximate $\binom{n}{cn}$ using the entropy function: $$ \binom{n}{cn}\approx e^{nH(c)}, $$ where $H(c)=-c\log(c)-(1-c)\log(1-c)$.
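A small sketch comparing $e^{nH(c)}$ (with natural logarithms, as discussed in the comments on the previous answer) to the exact binomial coefficient; the remaining gap is roughly the polynomial factor $\sqrt{2\pi c(1-c)n}$. The values of $c$ and $n$ are arbitrary examples:

```python
# Compare e^{n H(c)} (natural-log entropy) with C(n, cn) for c = 1/4.
from math import comb, exp, log, sqrt, pi

def H(c):
    """Entropy in nats: -c ln c - (1 - c) ln(1 - c)."""
    return -c * log(c) - (1 - c) * log(1 - c)

c = 0.25
for n in [20, 100, 400]:
    k = int(c * n)
    ratio = exp(n * H(c)) / comb(n, k)
    print(f"n={n}: e^(nH(c)) / C(n, cn) = {ratio:.3f},"
          f"  sqrt(2 pi c (1-c) n) = {sqrt(2 * pi * c * (1 - c) * n):.3f}")
```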
