7

I am trying to understand how to prove that a polynomial will always grow faster than a logarithm.

$\log n = o(n^\epsilon)$ for every $\epsilon>0$

Intuitively, it is obvious, and plugging in a few numbers always yields true, but how can I prove this?

Maybe this can be done inductively (I would prefer this method if someone would explain it), but I attempted to prove it using derivatives and L'Hôpital's rule, namely:

$$\lim_{n\rightarrow\infty}\frac{\log n}{n^\epsilon} = \lim_{n\rightarrow\infty}\frac{\frac{1}{n}}{\epsilon n^{\epsilon-1}} = \lim_{n\rightarrow\infty}\frac{1}{\epsilon n^{\epsilon}} = 0$$

Is this getting me in the right direction to prove that $\log n$ is eventually strictly less than $n^\epsilon$?

Rodman Huey
  • 71
  • 1
  • 2

3 Answers

3

Here is a simple proof which avoids L'Hôpital's rule.

We start with the observation $$ \log n = \int_1^n \frac{dx}{x} \leq \int_1^n dx \leq n, $$ and so $$ \frac{\log n}{n^2} \leq \frac{1}{n} \xrightarrow{n\to\infty} 0. $$

Given $\epsilon > 0$, using the fact that $\lim_{n\to\infty} n^{\epsilon/2} = \infty$, we see that $$ 0 = \lim_{n\to\infty} \frac{\log (n^{\epsilon/2})}{(n^{\epsilon/2})^2} = \lim_{n\to\infty} \frac{\frac{\epsilon}{2} \log n}{n^\epsilon}. $$ It follows that $$ \lim_{n\to\infty} \frac{\log n}{n^\epsilon} = 0. $$
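As a sanity check (not part of the proof), here is a small Python sketch showing the ratio $\frac{\log n}{n^\epsilon}$ shrinking toward $0$, with $\epsilon = 0.5$ chosen for illustration:

```python
import math

# Numerical illustration of the limit above:
# log(n) / n**eps shrinks toward 0 as n grows (eps = 0.5 here).
eps = 0.5
for n in [10, 10**3, 10**6, 10**9, 10**12]:
    ratio = math.log(n) / n**eps
    print(f"n = {n:>13}: log(n)/n^eps = {ratio:.6f}")
```

The printed ratios decrease monotonically on this sample and are already below $10^{-4}$ by $n = 10^{12}$.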

Yuval Filmus
  • 276,994
  • 27
  • 311
  • 503
1

Let us first prove a probably more familiar fact: any growing exponential function grows faster than any monomial. (You can skip this part if it is already known to you.)

For any $\alpha>0$, $\beta > 1$, we can choose an integer $c\gt \alpha$. Applying L'Hôpital's rule $c$ times we get

$$0\le\lim_{x\to\infty}\frac{x^\alpha}{\beta^x}\le\lim_{x\to\infty}\frac{x^c}{\beta^x}=\lim_{x\to\infty}\frac{cx^{c-1}}{(\log\beta)\beta^x}=\lim_{x\to\infty}\frac{c(c-1)x^{c-2}}{(\log\beta)^2\beta^x}\\ =\cdots=\lim_{x\to\infty}\frac{c(c-1)\cdots2\cdot1}{(\log\beta)^c\beta^x}=0$$ where the last equality holds since $\beta^x = (1+(\beta-1))^x\gt1+x(\beta-1)\to\infty$ as $x\to\infty$, by Bernoulli's inequality.

So, $x^\alpha = o(\beta^x)$ for any $\alpha>0$, $\beta > 1$.


Let $\alpha = 1, \beta=e^\epsilon$. By a change of variable $x=\log n$, we can see that $$\log n = o(e^{\epsilon\log n})=o(n^\epsilon)$$ Similarly, we have $$\log\log n = o((\log n)^\epsilon) \quad\text{ for any }\epsilon\gt 0 $$ $$e^n = o(\lambda^{e^n}) \quad\text{ for any }\lambda\gt 1 $$
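Spelling out the change of variable in the first claim (with $x=\log n$, so that $n^\epsilon = e^{\epsilon\log n} = (e^\epsilon)^x$):

$$\frac{\log n}{n^\epsilon} = \frac{x}{e^{\epsilon x}} = \frac{x}{(e^\epsilon)^x}\xrightarrow{x\to\infty} 0,$$

which is exactly $x^\alpha = o(\beta^x)$ with $\alpha = 1$ and $\beta = e^\epsilon > 1$.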

John L.
  • 38,985
  • 4
  • 33
  • 90
0

Your solution is one tiny step away from proving that the limit is zero.

But obviously $\log n$ has no upper bound. And if $\epsilon$ is small, it will take a very large $n$ to make $n^\epsilon > \log n$.

I recommend that you use a spreadsheet and try this. If $\epsilon = 0.001$, then $n$ must be as large as $10^{1000}$ just to make $n^\epsilon = 10$.
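In place of a spreadsheet, a short Python sketch makes the same point. Since $10^{1000}$ overflows a float, it works with the exponent $k$ (for $n = 10^k$), which is purely a computational convenience:

```python
import math

# Compare log(n) with n**eps for n = 10**k, working with the
# exponent k to avoid float overflow at astronomically large n.
eps = 0.001
for k in [2, 6, 100, 1000, 10000]:
    log_n = k * math.log(10)   # log(10**k)
    n_eps = 10 ** (k * eps)    # (10**k)**eps
    print(f"n = 10^{k}: log n = {log_n:.1f}, n^eps = {n_eps:.4g}")
```

At $n = 10^{1000}$ we get $n^\epsilon = 10$ while $\log n \approx 2303$, so the logarithm is still far ahead; only around $n = 10^{10000}$ does $n^\epsilon$ finally overtake it.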

gnasher729
  • 29,996
  • 34
  • 54
  • 1
    I'm confused. What role does a spreadsheet have in proving that $\log n < n^{\varepsilon}$ for all $\varepsilon > 0$ and all large enough $n$? – David Richerby Oct 11 '16 at 13:56
  • 1
    Well, it helps you get your head out of the clouds. And realising that "large enough" is very, very, very large if $\varepsilon$ is small. – gnasher729 Oct 11 '16 at 18:36