5

Let $O(n)$ denote "Big-O" of $n$ and $o(n)$ denote "little-o" of $n$.

It is a well-known fact that $O(n \log{n}) \subset O(n^{1 + \epsilon})$ for any $\epsilon > 0$. Can we omit the $\epsilon$ and just write $O(n \log{n}) \subset O(n^{1 + o(1)})$?
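The well-known containment can be sanity-checked numerically (a sketch added for illustration, not part of the original question): for any fixed $\epsilon > 0$, the ratio $(n \log n) / n^{1+\epsilon} = \log(n) / n^{\epsilon}$ eventually decreases toward $0$.

```python
import math

# For a fixed eps > 0, (n log n) / n^(1+eps) = log(n) / n^eps tends to 0,
# illustrating O(n log n) ⊂ O(n^(1+eps)).
eps = 0.1
for n in [10**5, 10**8, 10**11, 10**14]:
    ratio = (n * math.log(n)) / n ** (1 + eps)
    print(f"n=10^{round(math.log10(n))}: ratio={ratio:.3f}")
```

(The ratio peaks around $n = e^{1/\epsilon}$ and decreases thereafter, which is why the sample points start at $10^5$.)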

Juho
  • Are you certain about the meaning of nested Landau symbols? What do you mean by "the tightest bounds" when you give a whole class of bounds? Can there really be a single tightest upper bound (that is not the function itself) when we talk about real functions? – Raphael Oct 18 '13 at 20:57
  • Sure, the notation is somewhat ambiguous. But I have seen such notation in research papers. For example, a popular paper on algorithms for polynomial factorization over finite fields (http://people.csail.mit.edu/dmoshkov/courses/codes/poly-factorization.pdf) uses that sort of notation. – Piotr Semenov Oct 18 '13 at 21:06
  • 2
    The question was not whether people use them but whether you are clear about what you/they mean. ;) – Raphael Oct 18 '13 at 21:08

2 Answers

6

Consider the function $f(n) = n^{1 + \frac{1}{n}}$ which I guess you'd say is in "$O(n^{1 + o(1)})$". I'm not sure you can call it polynomial, though.

Now, compute

$\qquad \displaystyle \lim_{n \to \infty} \frac{n^{1 + \frac{1}{n}}}{n \log n} = 0$,

so you have in fact that $f \in o(n \log n)$. Therefore, functions in this funky class of yours are not even (all) asymptotic upper bounds for $n \log n$.

The difference is that $\varepsilon$ keeps the exponent a non-vanishing distance away from $1$, whereas $o(1)$ allows exponents that converge to $1$, so functions of the form $n^{1 + o(1)}$ can in fact come down to $n \in o(n \log n)$.
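The limit above can be checked numerically (a sketch for illustration, not part of the original answer): the ratio simplifies to $n^{1/n} / \log n$, where $n^{1/n} \to 1$ while $\log n \to \infty$.

```python
import math

# n^(1 + 1/n) / (n log n) simplifies to n^(1/n) / log(n);
# n^(1/n) -> 1 and log(n) -> infinity, so the ratio tends to 0,
# backing up the claim that f(n) = n^(1 + 1/n) is in o(n log n).
for n in [10, 10**3, 10**6, 10**9]:
    ratio = n ** (1 + 1 / n) / (n * math.log(n))
    print(f"n={n}: ratio={ratio:.4f}")
```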

Raphael
  • Thanks a lot! Can you kindly explain why the authors do not just write $O(n \log{n})$ instead of $O(n^{1 + o(1)} \log{n})$? (For example, in the paper I mentioned above.) It is clear that $O(n \log{n}) = O(n^{1 + o(1)} \log{n})$. – Piotr Semenov Oct 18 '13 at 21:17
  • 2
    It may happen as the result of a longer derivation; then $o(1)$ is often used as an error term, implying that the authors give all contributions of $\Omega(1)$ exactly. For instance, the paper contains $n^{1 + \beta + o(1)}$; saying just $n^{1 + \beta}$ immediately would hide the fact that additional factors had to be shown insignificant; note how different $n^{1 + \beta + O(1)}$ would be! (It's still abuse of notation, but well.) – Raphael Oct 18 '13 at 21:44
  • I don't get the part "$n^n \in \omega(e^n)$, so $\sqrt[n]{n} \in \mathcal{o}(\log n)$", could you elaborate on that? – G. Bach Oct 19 '13 at 01:06
  • @Raphael Thanks a lot! Now it makes sense to me. – Piotr Semenov Oct 19 '13 at 07:20
  • 1
    @PiotrSemenov I just remember another use of such notation: it indicates convergence speed. Compare, for instance, two cases of $f \sim n$: you could show $f = n + \Theta(\log n)$ or $f = n + o(1)$. Both make the same statement about the limit, but different ones about convergence speed. – Raphael Oct 19 '13 at 16:27
  • @G.Bach My line of thought was $f \in \omega(g) \implies f^{-1} \in o(g^{-1})$, but of course $\sqrt[n]{n}$ is not the inverse of $n^n$. Oops. – Raphael Oct 19 '13 at 16:34
  • Huh, never seen that implication. Doesn't seem hard to prove though. – G. Bach Oct 19 '13 at 17:23
  • @G.Bach: I don't remember seeing it in a textbook with a proof, but it seems to be correct intuitively (hence my formulation then). – Raphael Oct 19 '13 at 17:31
6

Common usage would have $n\log n = n^{1 + o(1)}$. This is a shortcut for $n\log n = n^{1 + f(n)}$ for some function $f(n)$ satisfying $f(n) = o(1)$. In our case, this function is $f(n) = \log \log n / \log n$, which indeed satisfies $f(n) = o(1)$.
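The identity can be verified numerically (a sketch for illustration, using natural logarithms throughout, not part of the original answer): $n^{1 + f(n)} = n \cdot e^{f(n) \ln n} = n \cdot e^{\ln \ln n} = n \ln n$.

```python
import math

# Verify n*log(n) = n^(1 + f(n)) with f(n) = log(log n) / log(n),
# all logarithms natural. Note also that f(n) -> 0, i.e. f(n) = o(1).
for n in [100, 10**4, 10**6]:
    f = math.log(math.log(n)) / math.log(n)
    lhs = n ** (1 + f)
    rhs = n * math.log(n)
    print(f"n={n}: f(n)={f:.4f}, n^(1+f)={lhs:.2f}, n*log(n)={rhs:.2f}")
```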

Yuval Filmus
  • Good point! To me, notation like $n^{1 + o(1)}$ now seems problematic: both $\frac1{n} \in o(1)$ and $\frac{\log{\log{n}}}{\log{n}} \in o(1)$, but only the latter is the right one here. So we have no way to "unzip" the shortcut $o(1)$ without extra information from the author. I do not like this notation for complexity analysis :) Use it only for convergence analysis, as Raphael mentioned above. – Piotr Semenov Oct 20 '13 at 10:23