I cannot find an appropriate change of variable to handle the second term, $n^2\mathrm{log}^2n$.
-
Your question may be that you do not know which case of the master theorem to apply here. Is that it? – Thinh D. Nguyen Oct 10 '18 at 15:25
-
No, I want to solve it with the substitution method, but I can't find a good guess. @ThinhD.Nguyen – siaVash Oct 10 '18 at 15:29
-
The master theorem is the substitution method. It tells you what result you would get. – Yuval Filmus Oct 10 '18 at 15:31
-
Are you sure the master theorem works for it? @YuvalFilmus – siaVash Oct 10 '18 at 15:34
-
The Wikipedia version does. – Yuval Filmus Oct 10 '18 at 15:38
-
This will help you https://cs.stackexchange.com/questions/97439/masters-theorem/97443#97443 – Navjot Singh Oct 10 '18 at 18:54
-
What is the initial condition? – kelalaka Oct 11 '18 at 12:02
3 Answers
You want to use the Master theorem. Writing $T(n)=aT(n/b)+f(n)$, we get $a=4$, $b=2$, and $f(n)=n^2\log^2 n$, so $\log_b a = 2$. This matches the exponent of $n$ in $f(n)$, except for the extra $\log^2 n$ factor, which is exactly case 2 of the version on the Wikipedia page: the solution gains one more logarithm. So we get $n^2$ with an extra $\log^3 n$, for an overall result of $T(n)=O(n^2\log^3 n)$.
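For reference, the case being invoked (case 2 in the Wikipedia statement, with $k \ge 0$) says: if $f(n) = \Theta\left(n^{\log_b a} \log^k n\right)$, then $T(n) = \Theta\left(n^{\log_b a} \log^{k+1} n\right)$. Plugging in the values above:

\begin{align} f(n) &= n^2 \log^2 n = \Theta\left(n^{\log_2 4} \log^2 n\right) \quad\Rightarrow\quad k = 2, \\ T(n) &= \Theta\left(n^{\log_2 4} \log^{3} n\right) = \Theta\left(n^2 \log^3 n\right). \end{align}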

No, but there should be a similar version. I just don't have the book to reference, so I had to reference something else. It will be similar – HackerBoss Oct 12 '18 at 02:35
$T(n)= a T(n/b)+ f(n) $
- A generalization that usually works; set $n= b^k$.
Using the backward substitution method:
$T(n)=4T(n/2)+n^2 \log^2 n $
- Take $n = 2^k$:
\begin{align} T(2^k) & = 4 T(2^{k-1}) + (2^k)^2 \log^2 2^k \\ & = 4 T(2^{k-1}) + 2^{2k} k^2 \\ \text{the next instances of the recurrence:} & \\ T(2^{k-1}) & = 4 T(2^{k-2}) + 2^{2(k-1)} (k-1)^2 \\ T(2^{k-2}) & = 4 T(2^{k-3}) + 2^{2(k-2)} (k-2)^2 \\ T(2^{k-3}) & = 4 T(2^{k-4}) + 2^{2(k-3)} (k-3)^2 \\ \text{substituting back:} & \\ T(2^k) & = 4 [4 T(2^{k-2}) + 2^{2(k-1)} (k-1)^2 ] + 2^{2k} k^2 \\ & = 4^2 T(2^{k-2}) + 2^{2k} (k-1)^2 + 2^{2k} k^2 \\ & = 4^2 [ 4 T(2^{k-3}) + 2^{2(k-2)} (k-2)^2] + 2^{2k} (k-1)^2 + 2^{2k} k^2 \\ & = 4^3 T(2^{k-3}) + 2^{2k} (k-2)^2 + 2^{2k} (k-1)^2 + 2^{2k} k^2 \\ & = 4^3 [4 T(2^{k-4}) + 2^{2(k-3)} (k-3)^2] + 2^{2k} (k-2)^2 + 2^{2k} (k-1)^2 + 2^{2k} k^2 \\ & = 4^4 T(2^{k-4}) + 2^{2k} (k-3)^2 + 2^{2k} (k-2)^2 + 2^{2k} (k-1)^2 + 2^{2k} k^2 \\ & \;\;\vdots \\ & = 4^{i} T(2^{k-i}) + 2^{2k} \left( (k-i+1)^2 + \cdots + (k-1)^2 + k^2 \right) \\ \text{set } i=k: & \\ & = 4^{k} T(1) + 2^{2k} \left( 1^2 + \cdots + (k-1)^2 + k^2 \right) \end{align}
\begin{align} ( 1^2+ \cdots + (k-1)^2 + k^2) &= \sum_{i=1}^{k} i^2 \\ &= \frac{ k (k + 1) (2 k + 1)}{6} \end{align}
Put back $2^k = n$ and $k = \log_2 n$; the first term is $4^k T(1) = n^2\, T(1) \in \mathcal{O}(n^2)$, so it is dominated:
$$T(n) = n^2\, T(1) + n^2\, \frac{ k (k + 1) (2 k + 1)}{6} \in \mathcal{O}( n^2 \cdot \log^3 n)$$
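As a quick numerical sanity check (a minimal sketch, assuming the base case $T(1)=1$), the ratio $T(n)/(n^2 (\log_2 n)^3)$ should slowly approach the constant $1/3$ predicted by $k(k+1)(2k+1)/6 \approx k^3/3$:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 4 T(n/2) + n^2 log2(n)^2, with the assumed base case T(1) = 1.
    if n <= 1:
        return 1.0
    return 4 * T(n // 2) + n ** 2 * math.log2(n) ** 2

# The ratio should creep toward 1/3 as k grows.
for k in (8, 16, 32, 64):
    n = 2 ** k
    print(k, T(n) / (n ** 2 * math.log2(n) ** 3))
```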

Just like in the textbook proof of $O(n\mathrm{log}(n))$ complexity for divide-and-conquer quicksort, one only needs to show that the total cost at each level of the recursion tree never exceeds the cost at the root (i.e. the value of $f(n)$).
For a simple function like $n$, this is trivial.
For a more complicated function like $n^2\mathrm{log}^2(n)$, you need to do some work.
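Concretely, the recursion tree for $T(n)=4T(n/2)+n^2\log^2 n$ has $4^j$ subproblems of size $n/2^j$ at depth $j$, so the total cost of level $j$ is

$$4^j \left(\frac{n}{2^j}\right)^2 \log^2\frac{n}{2^j} \;=\; n^2 \log^2\frac{n}{2^j} \;\le\; n^2 \log^2 n,$$

and since there are $\log_2 n + 1$ levels, the total is $O(n^2\log^2 n \cdot \log n) = O(n^2\log^3 n)$.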
-
Hint: $n = a + b$ with $a, b > 0$ implies $n^2 > a^2 + b^2$; the nasty $\log^2$ factors can all be pushed up to equal $\log^2(n)$, so one does not need to care about them any more. – Oct 12 '18 at 07:35