I'm reading Cormen's Introduction to Algorithms, 3rd edition, and among the examples of solving recurrences with the master method he gives these two:

  1. $3T( \frac{n}{4} ) + n\log(n)$
  2. $2T( \frac{n}{2} ) + n\log(n)$

For the first example we have $a=3$ and $b=4$, so $n^{\log_4 (3)} \approx n^{0.793}$, and Cormen says that if we choose $\epsilon = 0.207$ then $f(n) = n\log(n) = \Omega(n^{\log_4(3) + \epsilon})$.

How? As I understand it, if $\epsilon = 0.207$ then $\Omega(n^{\log_4(3) + \epsilon}) = \Omega(n)$, so we would have $n\log(n) = \Omega(n)$, which I thought is not true; yet he proves that $n\log(n) = \Omega( n^{\log_4(3) + \epsilon} )$.

And then he shows, in the same way, that the third case of the master method does not apply to the second recurrence with $n\log(n)$.

So could somebody explain to me in detail why the third case of the master theorem applies to $3T( \frac{n}{4} ) + n \log(n)$ but not to $2T( \frac{n}{2} ) + n\log(n)$?


2 Answers

First of all, please check what $O$, $\Omega$ and the other Landau symbols mean; that should remove some of the confusion.

Note that $n \log n \in \omega(n)$, that is $n \log n$ grows properly faster than $n$, asymptotically. This can be seen by

$\qquad \lim_{n \to \infty} \frac{n \log n}{n} = \lim_{n \to \infty} \log n = \infty$.

By the same reasoning, though, $n \log n \in o(n^{1 + \varepsilon})$ for every $\varepsilon \in (0,\infty)$.
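These two limits can be watched numerically (not part of the proof, just an illustration; the sample values of $n$ and the choice $\varepsilon = 0.1$ are arbitrary, and note that $\log n / n^{\varepsilon}$ is not monotone for small $n$, so large samples are used):

```python
import math

# n*log(n) / n = log(n): grows without bound, so n log n is in omega(n).
# n*log(n) / n^(1+eps) = log(n) / n^eps: tends to 0, so n log n is in o(n^(1+eps)).
eps = 0.1
for n in (10**4, 10**16, 10**64):
    ratio_omega = n * math.log(n) / n          # equals log(n), diverges
    ratio_o = n * math.log(n) / n**(1 + eps)   # equals log(n) / n^eps, vanishes
    print(f"n = 10^{round(math.log10(n))}: "
          f"nlogn/n = {ratio_omega:.2f}, nlogn/n^(1+eps) = {ratio_o:.2e}")
```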

Therefore, $n \log n \in \Omega(n^\alpha)$ for all $\alpha \in [0,1]$, so in the example you can choose any $\varepsilon$ with $\varepsilon + \log_4(3) \leq 1$. The authors pick (up to dropped decimal places) the largest one.

As has been noted, that does not conclude the proof that case three applies; you still have to check the regularity condition, that is $a f\left( \frac{n}{b} \right) \le c f(n)$ for some constant $c<1$ and all $n>n_0$, $n_0$ some natural number.
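For the first recurrence this condition can be spot-checked numerically (a sketch; the constant $c = 3/4$ is my choice, justified by $\log\frac{n}{4} \le \log n$, and the sampled values of $n$ are arbitrary):

```python
import math

def f(n):
    """The driving function f(n) = n log n of the first recurrence."""
    return n * math.log(n)

a, b, c = 3, 4, 0.75  # recurrence T(n) = 3 T(n/4) + n log n; c < 1 as case 3 requires
# Regularity condition: a * f(n/b) <= c * f(n) for all sufficiently large n.
for n in (16, 1024, 10**6, 10**9):
    assert a * f(n / b) <= c * f(n), f"condition fails at n = {n}"
print("a*f(n/b) <= 0.75*f(n) at all sampled points")
```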

As for the second example, my explanation above shows that there is no $\varepsilon > 0$ such that $n \log n \in \Omega(n^{\log_2(2) + \varepsilon})$, so case three cannot apply here. Note that $o(f) \cap \Omega(f) = \emptyset$ for every $f$.

Raphael
  • Thanks for the answer. I understand everything except how you get the $n \log n \in \Omega(n^\alpha)$ equation. If it is possible, could you explain it? – Vahagn Babajanyan Feb 02 '13 at 19:56
  • @VahagnBabajanyan: It's a corollary from $n \log n \in \omega(n)$, using only the definitions of $\omega$ and $\Omega$ as well as the (almost trivial) fact that $n^\alpha \in o(n^\beta)$ for all $\alpha < \beta$. – Raphael Feb 02 '13 at 20:12
  • I mean for $\Omega$, $\lim_{n \to \infty} \frac{n \log n}{n^{\alpha}} \neq \infty$. Because for $\omega$ it should be infinity, but for $\Omega$ it shouldn't, or am I mistaken? – Vahagn Babajanyan Feb 02 '13 at 20:18
  • I mean, how can we prove that it is a tight bound? – Vahagn Babajanyan Feb 02 '13 at 20:45
  • Yes, you are mistaken. $\omega(f) \subseteq \Omega(f)$. I don't get what you mean by "tight bound", and what "it" refers to. – Raphael Feb 02 '13 at 21:24
  • Cormen says: The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. The bound $2n^{2} = O(n^{2})$ is asymptotically tight, but the bound $2n = O(n^{2})$ is not. We use o-notation to denote an upper bound that is not asymptotically tight. That is the infinite limit. I thought that Big O and Omega should always be asymptotically tight, so the limit should not be equal to infinity, and only now I understand what he means. Thanks a lot, you helped me to understand it. – Vahagn Babajanyan Feb 02 '13 at 23:27

I think you applied the method incorrectly. In your case you should use case 3 of the master theorem, which uses $\Omega$-notation, an asymptotic lower bound. So the solution in the book is correct.

You used $O$-notation, which is an asymptotic upper bound. Be careful about which case of the master theorem you are using.

To satisfy the third case you must also check the regularity condition: $af(n/b) \leq cf(n)$ for some $c < 1$ and all sufficiently large $n$. In your case $f(n) = n \log(n)$, $a=3$, $b=4$, so you should show that

$\qquad 3(n/4 \cdot \log(n/4)) \leq c n \log(n)$.

For large $n$ you can select any $c$ with $0.75 \leq c < 1$, which satisfies the inequality (note that the master theorem requires $c < 1$, so $c = 1$ is not allowed).
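To see where the constant $\frac{3}{4}$ comes from, one can expand the left-hand side (same symbols as above):

$\qquad 3 \cdot \frac{n}{4} \log\frac{n}{4} = \frac{3}{4} n \log n - \frac{3}{4} n \log 4 \leq \frac{3}{4} n \log n$,

so $c = \frac{3}{4}$ already works for all $n \geq 1$.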

Reza
  • Thanks, I didn't notice that it is Big-Omega instead of Big-O. But again, how can we prove that for $n\log(n)$ we can find such an epsilon that $n\log(n) = \Omega(n^{\log_4(3) + \epsilon})$? Because when we put 0.2 in place of epsilon, $\Omega(n^{\log_4(3)+\epsilon}) = \Omega(n^{0.793+0.2}) \approx \Omega(n^1)$, so we need to show that $n\log(n) = \Omega(n^1)$, i.e. we need to find such $c$ and $k$ that for all $n \geq k$, $0 \leq c \cdot n \leq n \log(n)$, which means $0 \leq c \leq \log(n)$. But for Big-Omega we also need $\lim(c / \log(n)) \neq 0$, and that isn't right; there is no such $c$. I know that the book is right; where did I make a mistake? – Vahagn Babajanyan Feb 01 '13 at 18:59
  • @VahagnBabajanyan: please see the completed answer. – Reza Feb 01 '13 at 19:44