
Does the rule that $n^a$ dominates $n^b$ if $a > b$ apply here as well?

My understanding is that $n \log n$ will be dominated by $n \log^2 n$ because the $\log$ factor is raised to the power of $2$.
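To make that intuition concrete, here is a quick numeric check (a minimal Python sketch; the base-$2$ logarithm is chosen only for illustration):

```python
import math

# Ratio of the two growth rates: (n * log^2 n) / (n * log n) = log n,
# which keeps growing, so n log n is eventually dominated by n log^2 n.
for n in [2**10, 2**20, 2**40]:
    ratio = (n * math.log2(n) ** 2) / (n * math.log2(n))
    print(n, ratio)  # prints 10.0, 20.0, 40.0 -- the ratio grows without bound
```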

xskxzr

2 Answers


Little-oh proof

An equivalent but more straightforward question would be why $\lg n$ is dominated by $\lg^2 n$, that is why $\lg n \in o(\lg^2 n)$.

Then based on the definition of little-oh we need to show that for any choice of constant $ c > 0 $, we can find a constant $ n_0 $ such that the inequality $ \lg n < c \lg^2n $ holds for all $ n > n_0 $.

We prove that if we pick $ n_0 = \sqrt[c]{b} $ where $ b $ is the base of $ \lg $, then the definition above holds.

If $ n > n_0 = \sqrt[c]{b} $, then $ n > b^{\frac{1}{c}} $. Since $ \lg $ with base $ b > 1 $ is an increasing function, $ \lg n > \lg b^{\frac{1}{c}} = \frac{1}{c} $. Multiplying both sides by $ c > 0 $ gives $ c \lg n > 1 $. We may now multiply both sides by $ \lg n $, which is positive: since $ c > 0 $ and $ b > 1 $, we have $ n_0 = b^{\frac{1}{c}} > 1 $, so $ n > 1 $ and hence $ \lg n > 0 $. This yields $$ c \lg^2 n > \lg n . $$ This is what we needed to show.
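As a numerical sanity check of this choice of $ n_0 $ (a minimal sketch, assuming base $ b = 2 $ and an arbitrary small constant $ c $, both picked only for illustration):

```python
import math

# Check the little-oh argument numerically: for a fixed c > 0 and base b,
# lg n < c * (lg n)**2 should hold for every n > n0 = b**(1/c).
b = 2.0   # base of the logarithm (an assumption for this sketch)
c = 0.1   # an arbitrary small positive constant
n0 = b ** (1 / c)

def lg(x):
    return math.log(x, b)

for n in [n0 * 1.01, n0 * 2, n0 * 10, n0 * 1000]:
    assert lg(n) < c * lg(n) ** 2, f"inequality fails at n = {n}"
print(f"lg n < c * lg^2 n held at all sampled n > n0 = {n0}")
```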


Big-Oh proof sketch

A slightly different but related question could be why $\lg n$ is bounded above by $\lg^2 n$, that is why $\lg n \in O(\lg^2 n)$.

The answer is that $\lg n > 1$ for $n$ larger than the base of $\lg$, so raising $\lg n$ to any power greater than $1$ (including the power $2$) yields something larger than $\lg n$ itself.

(Alternatively, since we proved that $ \lg n \in o( \lg^2 n) $, it follows from $ o(f) \subset O(f) $ that $ \lg n \in O( \lg^2 n) $.)


Note

So yes, the rule that $n^a > n^b$ if $a > b $ and $ n > 1 $ does apply here (with $\lg n$ playing the role of $n$). Note that this rule is equivalent to the statement that any exponential function with a base greater than $1$ is an increasing function.
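As a small illustration of this note (a sketch, with base $2$ assumed so that $\lg n > 1$ whenever $n > 2$):

```python
import math

# With x = lg n playing the role of n in the rule x**a > x**b for a > b,
# the inequality (lg n)**2 > lg n holds as soon as lg n > 1, i.e. n > 2.
a, b = 2, 1
for n in [3, 10, 1_000, 10**6]:
    x = math.log2(n)           # x > 1 for every n > 2
    assert x ** a > x ** b     # (lg n)^2 > lg n
print("(lg n)^2 > lg n for all sampled n > 2")
```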

Pooya

Assume $O(n \log n) = O(n \log ^2 n)$. This implies that for any function $f(n) \in O(n \log ^2 n)$, there exist a positive constant $k$ and a constant $n'$ such that $f(n) < k (n \log n)$ for all $n \geq n'$.

Take $f(n) = n \log ^2 n$. Clearly $f(n) \in O(n \log ^2 n)$.

So there exist values of $k$ and $n'$ such that $n \log ^2 n < k (n \log n)$ for all $n \geq n'$.

Noting that $n \log ^2 n = (n \log n) \log n$, the inequality becomes $\log n < k$ for all $n \geq n'$, and we arrive at a contradiction: whatever constant value we choose for $k$, we have $\log n > k$ for all sufficiently large $n$.
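To see the contradiction numerically (a small sketch using the natural logarithm and a few arbitrary choices of $k$):

```python
import math

# However large the constant k, log n eventually exceeds it, so
# n * log(n)**2 < k * (n * log(n)) cannot hold for all large n.
for k in [5, 20, 100]:
    n = 2 * math.ceil(math.exp(k))   # any n with log n > k will do
    assert n * math.log(n) ** 2 > k * (n * math.log(n))
print("for every sampled k, the claimed bound fails once log n > k")
```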


In simpler terms, were $O(n \log n)$ the same as $O(n \log ^2 n)$, it would mean that $n \log ^2 n$ grows no faster than some constant factor of $n \log n$. But this is clearly not the case, as $n \log ^2 n$ is $\log n$ times $n \log n$, and $\log n$ grows beyond any constant.

badroit