15

What does $\log^{O(1)}n$ mean?

I am aware of big-O notation, but this notation makes no sense to me. I can't find anything about it either, because there is no way a search engine interprets this correctly.

For a bit of context, the sentence where I found it reads "[...] we call a function [efficient] if it uses $O(\log n)$ space and at most time $\log^{O(1)}n$ per item."

Raphael
Oebele
  • I agree that one should not write things like this, unless one is very clear about what it is to mean (and tells the reader what that is) and uses the same rules consistently. – Raphael Oct 21 '15 at 16:35
  • Yes, one should instead write it as $(\log(n))^{O(1)}$. –  Oct 21 '15 at 17:16
  • @RickyDemer That's not the point that Raphael is making. $\log^{\mathrm{blah}}n$ means exactly $(\log n)^{\mathrm{blah}}$. – David Richerby Oct 21 '15 at 19:35
  • @Raphael This is standard notation in the field. Anyone in the know would know what it means. – Yuval Filmus Oct 21 '15 at 19:45
  • @YuvalFilmus And yet we get question after question about these things, and one (arguably) wrong answer. Yes, the syntax is standard, but not the understanding of what it is to mean. – Raphael Oct 22 '15 at 07:27
  • @YuvalFilmus I think the variety of disagreeing answers is conclusive proof that your claim is false, and that one should indeed refrain from using such notation. – Raphael Oct 27 '15 at 09:07
  • @Raphael Not at all. It just means that in different areas it has different meanings. In algorithms and computational complexity its meaning is fixed. There are more ambiguous notations that are commonly used, for example $\tilde{O}$. It all depends on the context. – Yuval Filmus Oct 27 '15 at 09:13

4 Answers

16

You need to ignore for a moment the strong feeling that the "$O$" is in the wrong place and plough on with the definition regardless. $f(n) = \log^{O(1)}n$ means that there exist constants $k$ and $n_0$ such that, for all $n\geq n_0$, $f(n) \leq \log^{k\cdot 1}n = \log^k n$.

Note that $\log^k n$ means $(\log n)^k$. Functions of the form $\log^{O(1)}n$ are often called polylogarithmic and you might hear people say, "$f$ is polylog $n$."

You'll notice that it's easy to prove that $2n=O(n)$, since $2n\leq k n$ for all $n\geq 0$, where $k=2$. You might be wondering if $2\log n = \log^{O(1)}n$. The answer is yes since, for large enough $n$, $\log n\geq 2$, so $2\log n \leq \log^2n$ for large enough $n$.
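To spell out the algebra behind that last claim (a sketch; the base of the logarithm only affects the threshold $n_0$, not the conclusion): $$2\log n \;\leq\; (\log n)\cdot(\log n) \;=\; \log^2 n \qquad\text{whenever } \log n\geq 2,$$ so taking $k=2$ and any $n_0$ with $\log n_0\geq 2$ witnesses $2\log n = \log^{O(1)}n$ under the definition above.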

On a related note, you'll often see polynomials written as $n^{O(1)}$: same idea.

David Richerby
  • This is not supported by the common placeholder convention. – Raphael Oct 21 '15 at 16:29
  • I retract my comment: you write $\leq$ in all the important places, which is sufficient. – Raphael Oct 22 '15 at 07:27
  • @Raphael OK. I hadn't had time to check it yet but my feeling was you might be ordering quantifiers differently from the way I am. I'm not actually sure we're defining the same class of functions. – David Richerby Oct 22 '15 at 07:34
  • I think you are defining my (2), and Tom defines $\bigcup_{c \in \mathbb{R}_{>0}} \{ \log^c n \}$. – Raphael Oct 22 '15 at 07:39
9

This is an abuse of notation that can be made sense of by the generally accepted placeholder convention: whenever you find a Landau term $O(f)$, replace it (in your mind, or on the paper) by an arbitrary function $g \in O(f)$.

So if you find

$\qquad f(n) = \log^{O(1)} n$

you are to read

$\qquad f(n) = \log^{g(n)} n$ for some $g \in O(1). \hspace{5cm} (1)$

Note the difference from saying "$\log$ to the power of some constant": $g = n \mapsto 1/n$ is a distinct possibility.

Warning: The author may be employing even more abuse of notation and want you to read

$\qquad f(n) \in O(\log^{g(n)} n)$ for some $g \in O(1). \hspace{4.3cm} (2)$

Note the difference between (1) and (2); while it works out to define the same set of positive-valued functions here, this does not always work. Do not move $O$ around in expressions without care!
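As an illustration of why the two readings agree for a typical cost function, take $f(n) = 3\log^2 n$ (a sketch; I take $\log$ to be the natural logarithm, so that $\log^{x} n = e^{x \log\log n}$). Solving $\log^{g(n)} n = 3\log^2 n$ gives $$g(n) = 2 + \frac{\log 3}{\log\log n} \in O(1),$$ so $f$ satisfies (1); it also satisfies (2), since $3\log^2 n \in O(\log^2 n)$.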

Gilles 'SO- stop being evil'
Raphael
  • I think what makes it tick is that $x \mapsto \log^x(n)$ is monotonic and sufficiently surjective for each fixed $n$. Monotonic makes the position of the $O$ equivalent and gives you (2) ⇒ (1); going the other way requires $g$ to exist, which could fail if $f(n)$ is outside the range of the function. If you want to point out that moving $O$ around is dangerous and doesn't cover “wild” functions, fine, but in this specific case it's ok for the kind of functions that represent costs. – Gilles 'SO- stop being evil' Oct 24 '15 at 02:24
  • @Gilles I weakened the statement to a general warning. – Raphael Oct 25 '15 at 19:11
  • This answer has been heavily edited, and now I am confused: do you now claim that (1) and (2) are effectively the same? – Oebele Oct 26 '15 at 09:54
  • @Oebele As far as I can tell, they are not in general, but here. – Raphael Oct 26 '15 at 10:15
  • But, something like $3 \log^2 n$ does not match (1) but does match (2), right? Or am I just being silly now? – Oebele Oct 26 '15 at 14:33
  • @Oebele: If $g(n) = 2 + \log(3)/\log(\log(n)) \in O(1)$, then $\log(n)^{g(n)} = 3 \log(n)^2$. The edge case is when a function is too small to be $\log(n)^{O(1)}$, such as $\log(n)^{-n}$. –  Oct 26 '15 at 17:12
  • I repeat my comment that the author is not employing "even more abuse of notation", because the relevant quote explicitly prefixed it with "at most". –  Oct 26 '15 at 17:16
  • @Hurkyl 1) If there are functions "too small" to be in $O(_)$, something is horribly broken. 2) You can repeat that comment, but that does not make it more valid. – Raphael Oct 27 '15 at 08:59
  • Nobody said anything about something being too small to be in $O(_)$. If that was meant to include more general uses of the notation, a more obvious example is that if $n$ is not too small to be in $n^2 + O(n)$, then something is horribly broken. –  Oct 27 '15 at 09:42
  • @Hurkyl 1) "when a function is too small to be $log(n)^{O(1)}$" -- your words, not mine. 2) $n^2 + O(n)$ is an entirely different beast, at least the way I have seen it introduced rigorously. In particular, many functions are $O(n^2)$ without being $n^2 + O(n)$. This has nothing to do with how to read a one-term $O$, which is the topic here. – Raphael Oct 27 '15 at 11:01
6

It means that the function grows at most as fast as $\log$ raised to some constant power, e.g. $\log^2(n)$ or $\log^5(n)$ or $\log^{99999}(n)$...

Gilles 'SO- stop being evil'
Tom van der Zanden
  • This can be used when the function growth is known to be bounded by some constant power of the $\log$, but the particular constant is unknown or left unspecified. –  Oct 21 '15 at 13:47
  • This is not supported by the common placeholder convention. – Raphael Oct 21 '15 at 16:29
2

"At most $\log^{O(1)} n$" means that there is a constant $c$ such that what is being measured is $O(\log^c n)$.

In a more general context, $f(n) \in \log^{O(1)} n$ is equivalent to the statement that there exist (possibly negative) constants $a$ and $b$ such that $f(n) \in O(\log^a n)$ and $f(n) \in \Omega(\log^b n)$.

It is easy to overlook the $\Omega(\log^b n)$ lower bound. In a setting where that would matter (which would be very uncommon if you're exclusively interested in studying asymptotic growth), you shouldn't have complete confidence that the author actually meant the lower bound, and would have to rely on the context to make sure.


The literal meaning of the notation $\log^{O(1)} n$ is doing arithmetic on a family of functions, resulting in the family of all functions $\log^{g(n)} n$, where $g(n) \in O(1)$. This works in pretty much the same way as how multiplying $O(g(n))$ by $h(n)$ results in $O(g(n) h(n))$, except that you get a result that isn't expressed so simply.


Since the details of the lower bound are probably unfamiliar territory, it's worth looking at some counterexamples. Recall that any $g(n) \in O(1)$ is bounded in magnitude: there is a constant $c$ such that, for all sufficiently large $n$, $|g(n)| < c$.

When looking at asymptotic growth, usually only the upper bound $g(n) < c$ matters, since, e.g., you already know the function is positive. However, in full generality you have to pay attention to the lower bound $g(n) > -c$.

This means, contrary to more typical uses of big-oh notation, functions that decrease too rapidly can fail to be in $\log^{O(1)} n$; for example, $$ \frac{1}{n} = \log^{-(\log n) / (\log \log n)} n \notin \log^{O(1)} n$$ because $$ -\frac{\log n}{\log \log n} \notin O(1) $$ The exponent here grows in magnitude too rapidly to be bounded by $O(1)$.
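For completeness, here is the one-line check of that identity (a sketch, again reading $\log^x n$ as $e^{x\log\log n}$ with natural logarithms): $$\log^{-(\log n)/(\log\log n)} n = \exp\!\left(-\frac{\log n}{\log\log n}\cdot \log\log n\right) = \exp(-\log n) = \frac{1}{n}.$$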

A counterexample of a somewhat different sort is that $-1 \notin \log^{O(1)} n$.

  • Can't I just take $b=0$ and make your claimed lower bound go away? – David Richerby Oct 26 '15 at 22:02
  • @DavidRicherby No, $b=0$ still says that $f$ is bounded below. Hurkyl: why isn't $f(n) = 1/n$ in $\log^{O(1)} n$? – Gilles 'SO- stop being evil' Oct 27 '15 at 00:27
  • @Gilles: More content added! –  Oct 27 '15 at 00:57
  • @Gilles OK, sure, it's bounded below by 1. Which is no bound at all for "most" applications of Landau notation in CS. – David Richerby Oct 27 '15 at 07:51
  • Your "move around $O$" rule does not always work, and I don't think "at most" usually has that meaning; it's just redundant. 2) Never does $O$ imply a lower bound. That's when you use $\Theta$. 3) If and how negative functions are dealt with by a given definition of $O$ (even without abuse of notation) is not universally clear. Most definitions (in analysis of algorithms) exclude them. You seem to assume a definition that bounds the absolute value, which is fine.
  • – Raphael Oct 27 '15 at 09:00
  • [ctd] I don't think it's to be used like you do when $O(_)$ is used inside a larger term, though. One more reason to not do it, to avoid confusion. – Raphael Oct 27 '15 at 09:03
  • @Raphael: (1) There is no "move around $O$" rule involved in my post. I'm pretty sure "at most" means "is less than or equal to the following thing". It's only redundant in the situations where it's redundant. (2) $O$ always implies a lower bound on the value, by virtue of implying an upper bound on magnitude. It may be redundant in situations you personally care about and are willing to use the notation, but that doesn't mean it's not there. $\Theta$ implies something entirely different. –  Oct 27 '15 at 09:44
  • @Raphael: And (3) I'm pretty sure your concern is entirely due to lack of familiarity with things appearing in the exponent: I would be extremely surprised if you maintained your objection if we were to consider things like $n^2 + O(n)$ or $\sqrt{n + O(1)}$ instead. –  Oct 27 '15 at 10:04
  • (1) "less than or equal to" a set of functions does not make sense. (2) That's only true for some definitions. And never are bounds like you state in your second paragraphs implied or meant, as far as I know. (3) As stated above, using $O$ as an "error" term is another use case entirely. (4) No, I object to every careless use of Landau notation, on principle. The main reasons are that a) everybody means different things, b) nobody ever gives formal definitions for these abuses, and c) novice readers get confused. – Raphael Oct 27 '15 at 11:05
  • @Raphael: (1) It seems pretty clear to me that there is only one reasonable interpretation (and one obviously degenerate interpretation that isn't intended). I'll grant maybe I'm not imaginative enough, but with current evidence I'll stick to my belief that it's sufficiently meaningful for reasonable conversation. (2) Don't make the mistake of just "moving the $O$ around"; the equivalence is not merely substituting in the definition of $O$, but is instead a corollary of the bound that $g(n) \in O(1)$ implies $|g(n)| < c$. –  Oct 27 '15 at 11:24
  • (3) No use case is implied here, and even if it was a different use case, it's still the same concept. (and quite frankly, such examples are relevant to the asymptotic analysis of algorithms, which is presumably the "use case" you refer to) (4) I do not use the notation carelessly. –  Oct 27 '15 at 11:26