How is an algorithm with complexity $O(n \log n)$ also in $O(n^2)$? I'm not sure exactly what this is saying. I suspect it has something to do with the fact that big-O means "less than or equal to", but I'm not certain. Does anyone have any ideas? Thanks.
Check the definition of $O$, or this question. – Raphael Jan 12 '13 at 16:36
This might be relevant too. – Juho Jan 15 '13 at 01:32
1 Answer
The $O(\cdot)$ notation only gives an upper bound on the complexity. An algorithm has running time $O(n^2)$ if its running time can be bounded by $cn^2$ for some constant $c > 0$ and all sufficiently large $n$. If its running time is, say, $n$, then it is certainly bounded by $cn^2$, and so it is $O(n^2)$, though that is not the tightest bound.
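To make that concrete, here is a minimal derivation (a standard argument, not spelled out in the original answer) showing that a running time of $n \log n$ meets the definition of $O(n^2)$: since $\log n \leq n$ for all $n \geq 1$,
$$n \log n \;\leq\; n \cdot n \;=\; n^2 \quad \text{for all } n \geq 1,$$
so the definition is satisfied with $c = 1$. Hence $n \log n \in O(n^2)$, even though the bound is far from tight.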
A complement to $O(\cdot)$ is $\Theta(\cdot)$. An algorithm has running time $\Theta(n^2)$ if its (worst-case) running time lies between $c_1 n^2$ and $c_2 n^2$, for some constants $0 < c_1 \leq c_2$ and all sufficiently large $n$. If an algorithm is $\Theta(n \log n)$ then it is not $\Theta(n^2)$ (though it is still $O(n^2)$). In fact, even knowing only that it is $O(n \log n)$ rules out $\Theta(n^2)$.
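To see why, here is a one-line limit sketch (my own argument, not part of the original answer): a $\Theta(n^2)$ running time would require $c_1 n^2 \leq n \log n$ for all large $n$, but
$$\lim_{n \to \infty} \frac{n \log n}{n^2} \;=\; \lim_{n \to \infty} \frac{\log n}{n} \;=\; 0,$$
so no constant $c_1 > 0$ can satisfy that lower bound eventually.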
Summarizing, $\Theta$ gives the "exact" complexity (up to constant multiples), while $O$ is only an upper bound. There is also $\Omega$ for lower bounds: an algorithm is $\Omega(n^2)$ if its (worst-case) running time is at least $cn^2$ for some constant $c > 0$ and all sufficiently large $n$.
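The three notations fit together neatly (a standard identity, not stated explicitly above):
$$\Theta(f(n)) \;=\; O(f(n)) \cap \Omega(f(n)),$$
that is, a running time is $\Theta(f)$ exactly when it is both $O(f)$ (upper bound) and $\Omega(f)$ (lower bound). This is why knowing $O(n \log n)$ alone already rules out $\Theta(n^2)$: it makes an $\Omega(n^2)$ lower bound impossible.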
