Why is Big O most commonly associated with the worst-case and average-case complexities of a function?
-
As opposed to what? – Jeremy May 11 '11 at 20:03
-
Because that's the definition of it? – FrustratedWithFormsDesigner May 11 '11 at 20:05
-
O(1), O(N), O(N^2), O(N!), O(log N) ? – Independent May 11 '11 at 20:09
-
as opposed to best-case analysis – user23871 May 11 '11 at 20:21
-
@user23871: Did you mean as opposed to other notations? http://en.wikipedia.org/wiki/Big_O_notation#Related_asymptotic_notations – FrustratedWithFormsDesigner May 11 '11 at 20:37
-
Because that's what most people are interested in? – David Thornley May 11 '11 at 21:20
-
@Jonas: I wouldn't call O(N!) common--that usually means "Run away, screaming". – Loren Pechtel May 12 '11 at 05:20
-
@Loren :) I have to quote a user I once read on Stack: "O(n!) means you fail at programming and should perhaps try your hand at a different career. – BlairHippo" – Independent May 12 '11 at 06:01
5 Answers
In mathematics, Big O is the notation for an asymptotic upper bound. The Big-O function (with the unspecified constants filled in with appropriate values) is always at least as large as the "real" function.
There are also notations for the asymptotic lower bound, and for a tight bound. In the tight bound case, with different selections of constants, the same asymptotic formula is both an upper bound and a lower bound of the "real" function. This is known as being "within a constant factor".
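As a sketch, the three bounds mentioned above can be written out formally (here c, c1, c2, and n0 are the "unspecified constants" that get filled in):

```latex
f(n) = O(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } 0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0

f(n) = \Omega(g(n)) \iff \exists\, c, n_0 > 0 \text{ such that } 0 \le c\, g(n) \le f(n) \text{ for all } n \ge n_0

f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2, n_0 > 0 \text{ such that } c_1\, g(n) \le f(n) \le c_2\, g(n) \text{ for all } n \ge n_0
```

The Θ (tight bound) case is exactly the "within a constant factor" situation: the same g(n) works as both an upper and a lower bound, just with different constants.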
The complete set of asymptotic notations is listed here...
http://en.wikipedia.org/wiki/Big_O_notation#Family_of_Bachmann.E2.80.93Landau_notations
The upper bound of a function naturally matches the worst-case memory or time requirements. The lower bound of the worst-case performance function is a stranger object: most inputs do better than the worst case, so it tends to give lots of better-than-worst-case results (a kind of best-cases-of-the-worst-cases mismatch). Similarly, the lower bound naturally matches the best-case memory or time requirements, though for algorithms we're less often interested in that.
Some algorithms text books (notably Cormen et al.) use tight bounds a lot.
It turns out that the lower bound of the worst case can be useful, though. I asked this question recently...
Why would I care about the asymptotic growth of the lower bound of the worst case time/space?
In mathematics, computer science, and related fields, big-O notation describes the limiting behavior of a function when the argument tends towards a particular value or infinity, usually in terms of simpler functions.
That is the name given to this way of describing a function's limiting behavior. You can read more about it on the wiki.

Mostly because worst case and average case are the ones most people care about -- what can I really expect, and what's the worst that could happen?
The best case does have some real uses though -- for example, I've seen people waste a lot of time trying to find significantly better algorithms in cases where, if they'd spent 10 minutes thinking about the best case, they'd have realized that what they had was already (at least from an asymptotic complexity perspective) as good as they could hope for.

Because it is much much much more common to care about the worst case than the best case.
Generally, when you're thinking about the complexity of your algorithm, you're not in a situation where you can make too many assumptions about the input. Therefore, you will have to consider the average case (to estimate performance over large numbers of runs with arbitrary input) or the worst case (to estimate just how long it might take to process arbitrary input). Best case? It's rare to care that your program might run really quickly if you're lucky.
Of course, one can imagine different situations. For example, the best case for the infamous Bubblesort is O(N) if the list was already sorted. If you have a list which you know will be almost sorted (e.g., a previously sorted list that has had some small changes made to some elements), you can be confident that Bubblesorting it will be close to best case performance. But you would never use a Bubblesort for a general sort that was going to take an arbitrary number of arbitrarily shuffled elements.
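A minimal sketch in Python of the early-exit Bubblesort variant that produces that O(N) best case (the `swapped` flag detects a pass that did no work; on already-sorted input that happens on the very first pass, so the sort stops after N-1 comparisons):

```python
def bubble_sort(items):
    """Sort items in place and return them.

    Worst case O(N^2) (reverse-sorted input), best case O(N)
    (already-sorted input, thanks to the early-exit check).
    """
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # Each pass bubbles the largest remaining element to the end.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            # No swaps in a full pass: the list is already sorted.
            break
    return items
```

On a nearly-sorted list, only a few passes are needed before the no-swap pass triggers the early exit, which is why the answer's "almost sorted" scenario stays close to the O(N) best case.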
