
Given the recurrence $T(n) = T(\sqrt{n}) + \Theta(\lg \lg n)$, provide an asymptotically tight bound on its running time.

My solution was to let $m = 2\sqrt{n}$, which leads to the recurrence $S(m) = S(m/2) + \Theta(\lg(2\lg m + \lg 4)) = S(m/2) + \Theta(\lg \lg m)$.

By case 3 of the master theorem, this means that $S(m) \in \Theta(\lg \lg m)$, because $\lg \lg m \in \Omega(m^{\log_2 1})$.

Since $\lg \lg m = \lg(\lg 2 + 0.5 \lg n) \in \Theta(\lg \lg n)$, we have that $T(n) \in \Theta(\lg \lg n)$.

The provided solution to this problem says that $T(n) \in \Theta((\lg \lg n)^2)$. Could someone help me find my mistake?

The solution is on page 2 of the PDF below, problem 1-2(d): https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/assignments/ps1sol.pdf

Raphael
japata
    What do you mean by "runtime of the recurrence"? The running time needed to evaluate it? Or are we to assume that $T$ is a running-time cost function of some algorithm, specified by this schematic recurrence? – Raphael Jul 10 '17 at 17:25
    When you substitute $m = 2\sqrt{n}$, you get $S(m^2/4) = S(m/2) + \cdots$. – Yuval Filmus Jul 10 '17 at 18:19
  • By the way: \Theta and \lg/\log. ;) – Raphael Jul 10 '17 at 18:50

1 Answer


First let's see how we arrive at the solution. Let's try expanding it: $$\begin{align} T(n) & = T(n^{\frac{1}{2}}) + \Theta(\lg \lg n)\\ & = T(n^{\frac{1}{4}}) + \Theta(\lg \lg n^{\frac{1}{2}}) + \Theta( \lg \lg n)\\ & = T(n^{\frac{1}{4}}) + \Theta(\lg 2^{-1} \lg n) + \Theta( \lg \lg n)\\ & = T(n^{\frac{1}{4}}) + \Theta(\lg \lg n - 1) + \Theta( \lg \lg n)\\ & = T(n^{\frac{1}{4}}) + 2 \cdot \Theta( \lg \lg n)\\ & = T(n^{\frac{1}{8}}) + 3 \cdot \Theta( \lg \lg n)\\ & \vdots\\ \end{align}$$ At this point you should see that every time we recurse, we do $\Theta(\lg \lg n)$ work. So if we recurse $k$ times, the total time is $\Theta(k \lg \lg n)$. Now we just need to find this $k$.
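The per-level costs in the expansion are easy to see numerically. A minimal Python sketch (assumptions: the $\Theta(\lg \lg n)$ cost at each level is modeled as exactly $\lg \lg n$, the base case is $n \le 2$, and the sample size $n = 2^{2^6}$ is arbitrary):

```python
import math

# Per-level cost in T(n) = T(sqrt(n)) + lg lg n: at level i the
# argument is n**(1 / 2**i), so the cost is lg lg n - i, which is
# Theta(lg lg n) for the first ~lg lg n levels.
n = 2.0 ** (2.0 ** 6)  # lg lg n = 6 (hypothetical sample size)
costs = []
while n > 2.0:                           # assumed base case: n <= 2
    costs.append(math.log2(math.log2(n)))  # cost at this level
    n = math.sqrt(n)                       # recurse on sqrt(n)
print(costs)  # -> [6.0, 5.0, 4.0, 3.0, 2.0, 1.0]
```

Each level costs one less than the previous, but all of the first $\Theta(\lg \lg n)$ levels still cost $\Theta(\lg \lg n)$, which is why the expansion collapses to $k \cdot \Theta(\lg \lg n)$.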

$k$ equals the number of times we can take the square root before reaching the base case. Assume the base case is $2$, and that $n$ has the form $$n = 2^{2^k}.$$ We can then take the square root of $n$ exactly $k$ times before reaching $2$, because each square root cuts the exponent in half, e.g. $\sqrt{2^{2^k}} = 2^{2^{k-1}}$. So we take the square root $k$ times, or more formally $\log_2 \log_2 n$ times. Therefore we reach the conclusion that the total time is: $$ T(n) = \Theta((\lg \lg n)^2)$$
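This total can be checked with a small Python sketch. Assumptions: the per-level $\Theta(\lg \lg n)$ cost is modeled as exactly $\lg \lg n$, the base case is $n \le 2$, and `total_work` is a hypothetical helper name:

```python
import math

def total_work(n, base=2.0):
    """Total cost of T(n) = T(sqrt(n)) + lg lg n, summed over all
    levels down to an assumed constant base case of 2."""
    work = 0.0
    while n > base:
        work += math.log2(math.log2(n))  # Theta(lg lg n) work this level
        n = math.sqrt(n)                 # recurse on sqrt(n)
    return work

# With n = 2**(2**k) there are k levels costing k, k-1, ..., 1,
# so the total is k*(k+1)/2, i.e. Theta(k^2) = Theta((lg lg n)^2).
print(total_work(2.0 ** (2.0 ** 9)))  # -> 45.0  (k = 9, 9*10/2)
```

All intermediate values here are exact powers of two, so the floating-point sums come out exact and match $k(k+1)/2$.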

With that said, the issue seems to be with your domain transformation. Going from $T(n)$ to $S(m)$, you're essentially claiming that a recurrence of the form $T(n) = T(\sqrt{n}) + \Theta(f)$ is equivalent to one of the form $S(m) = S(\frac{m}{2}) + \Theta(f)$, because after the domain transformation you can somewhat abuse $\Theta$-notation to discard values that are not actually negligible.
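For contrast, here is a sketch of the substitution the provided solution uses, with $m = \lg n$ (as discussed in the comments). Define $S(m) = T(2^m)$. Then $$T(\sqrt{n}) = T(2^{m/2}) = S(m/2) \quad\text{and}\quad \lg \lg n = \lg m,$$ so the recurrence becomes $$S(m) = S(m/2) + \Theta(\lg m).$$ Here $a = 1$, $b = 2$, and $f(m) = \Theta(m^{\log_2 1} \lg m)$, so the extended case 2 of the master theorem gives $S(m) = \Theta(\lg^2 m)$, and hence $T(n) = \Theta((\lg \lg n)^2)$.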

For more on domain transformations for solving recurrences, see the notes, Section 5.2, here.

ryan
  • Thanks for the help! This approach makes perfect sense to me. Is the approach of the provided solution correct? To me, their substitution would imply that $m/2 = \lg(n)/2 = \sqrt{n}$, which is clearly not true. – japata Jul 10 '17 at 18:53
  • Yes their approach is okay. For $m = \lg n$, the implication is not that $S(m) = T(\lg n)$, if that were the case there would be no need to differentiate $S$ from $T$. $S(m)$ is a recurrence establishing how $m$ changes at each level of the recurrence. Like I had mentioned above, every time we take the square root, we cut the exponent in half. So you can think of $m$ as $2^k$ in my example, e.g. $n = 2^m = 2^{2^k}$. This is why the recurrence "cuts" $m$ in half at each step. That's how they conclude $S(m) = \Theta(\lg^2 m) \implies T(n) = \Theta((\lg \lg n)^2)$. – ryan Jul 11 '17 at 01:30
  • As an alternative example of domain transformation. Let's say $k = \lg \lg n$. We can then define a recurrence of how $k$ changes w.r.t $T(n)$ as: $$R(k) = R(k-1) + \Theta(k)$$ Then we see that $R(k) = \Theta(k^2)$ and thus implies $T(n) = \Theta((\lg \lg n)^2)$. – ryan Jul 11 '17 at 01:34
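The alternative transformation in that last comment is also easy to check numerically. A minimal Python sketch (assumption: the $\Theta(k)$ term is modeled as exactly $k$, with an assumed base case $R(0) = 0$):

```python
def R(k):
    """R(k) = R(k-1) + k, with R(0) = 0 assumed as the base case."""
    return 0 if k <= 0 else R(k - 1) + k

# R(k) = k*(k+1)/2, which is Theta(k^2); with k = lg lg n this
# matches T(n) = Theta((lg lg n)^2).
print(R(10))  # -> 55
```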