
So I couldn't figure this one out: what do I do when I have multiple nested logs in the running time of an algorithm?

For example:

$f(n) = \log\log n$, $g(n) = \log\log\log n$

Is $f = \Omega(g)$ in this case, i.e., does $g$ grow much more slowly than $f$?

1 Answer


The starting point is the limit $$ \lim_{n\to\infty} \frac{\log n}{n} = 0, $$ which can also be stated succinctly as $\log n = o(n)$. If $f(n)$ is any function such that $\lim_{n\to\infty} f(n) = \infty$, it then follows that $$ \lim_{n\to\infty} \frac{\log f(n)}{f(n)} = 0, $$ or more succinctly, $\log f(n) = o(f(n))$. In particular, taking $f(n) = \log n$ we get $\log\log n = o(\log n)$, taking $f(n) = \log\log n$ we get $\log\log\log n = o(\log\log n)$, and so on.
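As a numerical sketch (illustrative only, not a proof), we can watch the ratio $\log f(n) / f(n)$ with $f(n) = \log n$ shrink as $n$ grows, matching the claim $\log\log n = o(\log n)$:

```python
import math

# For f(n) = log n, the ratio log(f(n)) / f(n) = (log log n) / (log n)
# should tend to 0 as n -> infinity.
ratios = []
for exponent in (3, 6, 12, 24, 48):
    n = 10 ** exponent
    f = math.log(n)            # f(n) = log n
    ratio = math.log(f) / f    # (log log n) / (log n)
    ratios.append(ratio)
    print(f"n = 10^{exponent}: ratio = {ratio:.4f}")
```

The printed ratios decrease steadily, consistent with the limit being $0$; of course, no finite table substitutes for the limit argument above.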

More generally, if $a(n) = o(b(n))$ and $f(n) \to \infty$, then $a(f(n)) = o(b(f(n)))$. (The case above was $a(n) = \log n$ and $b(n) = n$.) There are similar rules for big O, which I leave for the reader to discover.
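Applying the composition rule to the original question, with $a(n) = \log n$, $b(n) = n$, and $f(n) = \log\log n$, we get $\log\log\log n = o(\log\log n)$, so indeed $f = \Omega(g)$ (in fact $f = \omega(g)$). A quick numerical sketch of the shrinking ratio (illustrative, not a proof; the helper names are mine):

```python
import math

def loglog(n):
    """log log n, for n large enough that log n > 1."""
    return math.log(math.log(n))

def logloglog(n):
    """log log log n, for n large enough that log log n > 1."""
    return math.log(loglog(n))

# The ratio (log log log n) / (log log n) should tend to 0,
# albeit very slowly -- triple logs crawl.
for exponent in (10, 100, 1000, 10000):
    n = 10 ** exponent
    print(f"n = 10^{exponent}: ratio = {logloglog(n) / loglog(n):.4f}")
```

Note how slowly the ratio falls: iterated logarithms grow so sluggishly that the asymptotics only become visible at astronomically large $n$.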

Yuval Filmus