
I want to prove that the time complexity of an algorithm is polylogarithmic in the input size.

The recurrence relation of this algorithm is $T(2n) \leq T(n) + T(n^a)$, where $a\in(0,1)$.

It seems that $T(n) \leq \log^{\beta}{n}$ for some $\beta$ depending on $a$, but I can't prove this inequality. How can I solve this recurrence relation?

I just want to get an upper bound polylogarithmic in $n$.

Qiang Li

1 Answer


Your guess is wrong. In fact, it is not hard to show that, assuming $T(n) > 0$ for all $n > 0$, we have $T(n) = \Omega(\log^k n)$ for all $k$. Indeed, for this to hold we need that, for large enough $n$, $$ (1+a^k) \log^k n = \log^k n + \log^k n^a \geq \log^k (2n), $$ or equivalently $\sqrt[k]{1+a^k} \log n \geq \log n + \log 2$, which holds as long as $[\sqrt[k]{1+a^k}-1]\log n \geq \log 2$, and in particular for large enough $n$.
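As a quick numerical sanity check (a sketch with illustrative values $a = 1/2$, $k = 3$, not values from the question), the inequality $(1+a^k)\log^k n \geq \log^k(2n)$ indeed kicks in exactly once $[\sqrt[k]{1+a^k}-1]\log n \geq \log 2$:

```python
import math

def polylog_ineq_holds(n, a=0.5, k=3):
    """Check (1 + a^k) * log^k(n) >= log^k(2n),
    i.e. whether log^k n is consistent with the recurrence at this n."""
    return (1 + a**k) * math.log(n)**k >= math.log(2 * n)**k

a, k = 0.5, 3
# Threshold from the simplified form [ (1+a^k)^(1/k) - 1 ] * log n >= log 2.
threshold = math.exp(math.log(2) / ((1 + a**k)**(1 / k) - 1))
print(f"inequality holds from roughly n = {threshold:.3g}")  # about 3e7

print(polylog_ineq_holds(10**6))  # below the threshold: False
print(polylog_ineq_holds(10**8))  # above the threshold: True
```

For these parameters the threshold is around $n \approx 3\cdot 10^7$, so the inequality genuinely needs "large enough $n$" rather than holding everywhere.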

What is the correct order of growth of $T(n)$? To try and find out, write $S(n) = T(2^n)$. The recurrence now becomes $$ S(n+1) = S(n) + S(an). $$ For large $n$, we would expect $S(n+1) - S(n)$ to be very close to $S'(n)$, and so heuristically we would expect $S$ to satisfy $S'(n) = S(an)$. This equation seems a bit hard to solve, but an approximate solution is $S(n) = n^{\Theta(\log n)}$. Substituting back, we deduce that the order of growth of $T(n)$ should be something like $(\log n)^{\Theta(\log \log n)}$.
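You can also watch this heuristic numerically (a sketch; the base case $S(0)=S(1)=1$ and the choice $a = 1/2$ below are arbitrary). Iterating the integer version $S(n) = S(n-1) + S(\lfloor a(n-1) \rfloor)$, the ratio $\log S(n) / \log^2 n$ stays bounded, consistent with $S(n) = n^{\Theta(\log n)}$:

```python
import math

def simulate_S(N, a=0.5):
    """Iterate S(n) = S(n-1) + S(floor(a*(n-1))) with the arbitrary
    base case S(0) = S(1) = 1; return the table S[0..N]."""
    S = [1, 1]
    for n in range(2, N + 1):
        S.append(S[n - 1] + S[int(a * (n - 1))])
    return S

S = simulate_S(10**4)
for n in (10, 100, 1000, 10**4):
    # If S(n) = n^{Theta(log n)}, this ratio should be Theta(1).
    print(n, math.log(S[n]) / math.log(n)**2)
```

Note that $S(10^4)$ already exceeds $10^{10}$, so $S$ is clearly superpolynomial in $\log$ of the original input, matching the lower-bound argument above.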

Yuval Filmus