It means "opening" (i.e. unrolling) the recursion.
For simplicity, denote the $O(n)$ term as $c \cdot n$:
$$ \begin{align}
T(n) &= 2T(n/2) + cn \\
&= 2(2T(n/4) + cn/2) + cn \\
&= 2(2(2T(n/8) + cn/4) + cn/2) + cn \\
&\;\;\vdots \\
&= 2^k\,T(n/2^k) + k \cdot cn
\end{align}$$
After $k = \log_2 n$ levels the subproblems reach size 1, giving $T(n) = n\,T(1) + cn\log_2 n$.
It might give you intuition, but it is NOT a proof. To prove it, you will need mathematical induction or the master theorem.
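Before the proof, here is a small Python sketch (my own illustration, assuming $T(1) = 1$, $c = 1$, and $n$ a power of two) that makes the unrolling concrete: it expands the recurrence one level at a time and shows that every level contributes exactly $c \cdot n$, with $\log_2 n$ levels in total.

```python
# A minimal sketch of unrolling T(n) = 2*T(n/2) + c*n level by level.
# Assumptions (not from the original recurrence): T(1) = 1, c = 1, n a power of two.
def unroll(n, c=1):
    level = 0
    subproblems = 1   # number of subproblems at the current level
    size = n          # size of each subproblem at the current level
    total = 0         # accumulated non-recursive work
    while size > 1:
        level_cost = subproblems * c * size  # each level contributes exactly c*n
        total += level_cost
        print(f"level {level}: {subproblems} subproblem(s) of size {size}, cost {level_cost}")
        subproblems *= 2
        size //= 2
        level += 1
    total += subproblems * 1  # the leaves: n copies of T(1) = 1
    print(f"T({n}) = {total}; n*log2(n) + n = {n * level + n}")

unroll(16)  # 4 levels, each costing 16, plus 16 leaves -> T(16) = 80
```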
Proving with induction (assuming the $O(n)$ component is simply $n$, for simplicity):
Claim: $T(n) \le n\log_2 n + n$ (taking $\log$ to base 2).
Base case: $T(1) = 1$ (by assumption), and indeed $1 \le 1 \cdot \log_2 1 + 1 = 1$.
Induction hypothesis: the claim holds for every $k < n$.
Inductive step:
$$ \begin{align}
T(n) &= 2T(n/2) + n \\
&\le 2\left(\frac{n}{2}\log_2\frac{n}{2} + \frac{n}{2}\right) + n && \text{(by the induction hypothesis)} \\
&= n\log_2\frac{n}{2} + 2n \\
&= n(\log_2 n - 1) + 2n \\
&= n\log_2 n + n
\end{align}$$
QED
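As a quick sanity check (not a substitute for the proof), the sketch below evaluates the recurrence directly under the same assumptions ($T(1) = 1$, the $O(n)$ term equal to $n$, and $n$ a power of two) and compares it with the bound $n\log_2 n + n$; under these assumptions the bound is in fact met with equality.

```python
import math

# Sanity check (not a proof): evaluate T(n) = 2*T(n/2) + n with T(1) = 1
# and compare against the claimed bound n*log2(n) + n, for powers of two.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    t = T(n)
    bound = n * math.log2(n) + n
    assert t <= bound
    print(f"n = {n:5d}: T(n) = {t:7d}, n*log2(n) + n = {bound:9.0f}")
```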
If you've learned your algorithmic theory well, it usually takes only a few extra seconds of thought to deduce the complexity of a candidate solution. In a CPU-bound program, that can save you from having to throw the solution away and find a whole new one after the fact.
Of course, you also have to consider whether your input is large enough for the asymptotic behaviour captured by Big O notation to actually dominate.