
Consider the following code:

int g(int x);   /* forward declaration, since f calls g */

int f(int x)
{
  if (x < 1) return 1;
  else return f(x - 1) + g(x);
}

int g(int x)
{
  if (x < 2) return 1;
  else return f(x - 1) + g(x / 2);
}

Questions:

How do I find the growth of f(x), given that it involves a recursive call to another function?

Are the growth and the time complexity of a function the same thing? Are they the same for f(x)?

user1917769
  • What do you think? What have you tried, to try to find f(x)? What techniques do you know for finding the running time of a function? Do you know about recurrence relations? Please edit the question to show us what you've tried and where you got stuck. This is not a "homework help" site where you can just copy-paste your exercise and have us solve it for you; we expect you to make a serious effort on your own. However, if you show us what you've tried and have a specific question about it, we might be able to help. – D.W. Nov 24 '13 at 05:25
  • P.S. Regarding your second question, this question might help: http://cs.stackexchange.com/q/192/755. They are probably asking you to compute the running time of f(x), as a function of x (which is just another way to say, they are asking you to find the time complexity of f(x)). – D.W. Nov 24 '13 at 05:26
  • Has anyone noticed that there is mutual recursion going on? This is not your average question. – Yuval Filmus Nov 26 '13 at 09:05
    "Are growth and time complexity of a a function same thing?" In general, they are different. The growth of the function is about how big the value of the function gets as its input increases; the complexity is the amount of resources (typically time or memory) required to compute it. – David Richerby Nov 26 '13 at 09:29

1 Answer


Growth and time complexity are not the same, though in this case they probably have similar values (up to a multiplicative constant). The recurrences for the time complexity are $$ \begin{align*} F(x) &= \begin{cases} \Theta(1) & x < 1, \\ F(x-1) + G(x) + \Theta(1) & x \geq 1, \end{cases} \\ G(x) &= \begin{cases} \Theta(1) & x < 2, \\ F(x-1) + G(\lfloor x/2 \rfloor) + \Theta(1) & x \geq 2. \end{cases} \end{align*} $$ If you're uncomfortable with the notation $\Theta(1)$, mentally replace it with $1$. The difference between these recurrences and the ones for $f,g$ is the additional $\Theta(1)$ term in the inductive case (second line of each).
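To see the two notions side by side, here is a rough sketch (the call counter and the driver are my additions, not part of the question's code): the returned value illustrates the growth of $f$, while the counter illustrates the running time $F$.

#include <stdio.h>

int calls = 0;              /* one increment per call: a stand-in for the Theta(1) work */

int g(int x);               /* forward declaration, since f calls g */

int f(int x)
{
  calls++;
  if (x < 1) return 1;
  else return f(x - 1) + g(x);
}

int g(int x)
{
  calls++;
  if (x < 2) return 1;
  else return f(x - 1) + g(x / 2);
}

int main(void)
{
  for (int n = 1; n <= 20; n++) {
    calls = 0;
    int value = f(n);       /* value tracks the growth f(n) */
    printf("n=%2d  f(n)=%8d  calls=%8d\n", n, value, calls);  /* calls tracks F(n) */
  }
  return 0;
}

Both columns grow roughly like $2^n$, which is why growth and time complexity happen to be of the same order here.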

Ignoring the floors for the moment, we have $$ g(x) = f(x-1) + f(x/2-1) + f(x/4-1) + \cdots, $$ the sum ending when the argument drops below $1$ (up to a final constant from $g$'s base case). Therefore $$ f(x) = f(x-1) + g(x) = 2f(x-1) + f(x/2-1) + f(x/4-1) + \cdots. $$ The recurrence $h(x) = 2h(x-1)$ with base case $h(0) = 1$ has the solution $h(x) = 2^x$, so we know that $f(x) = \Omega(2^x)$. Since $2^x$ grows so fast, we know that $f(x/2-1) + f(x/4-1) + \cdots \ll 2f(x-1)$, and this shows for example that $f(x) = O((2+\epsilon)^x)$ for all $\epsilon > 0$.
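To make the lower bound explicit (my phrasing, using nothing beyond the recurrences above): by induction on the integer $x \geq 0$ we have $f(x) \geq 2^x$, because $f(0) = 1 = 2^0$ and, for $x \geq 1$, $$ g(x) \geq f(x-1) \qquad\text{and}\qquad f(x) = f(x-1) + g(x) \geq 2f(x-1) \geq 2 \cdot 2^{x-1} = 2^x, $$ where $g(x) \geq f(x-1)$ holds since $g(1) = 1 = f(0)$ and $g(x) = f(x-1) + g(\lfloor x/2 \rfloor) \geq f(x-1)$ for $x \geq 2$.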

At this point one can probably guess and prove an asymptotic formula for $f$, but I'll leave that for others to do. Given $f$, it should be easy to estimate $g$. Indeed, it seems probable that $g = \Theta(f)$.

Update: Experiments show that $$ \begin{align*} \frac{f(n)}{2^n} &\longrightarrow 1.9635473337171197318620139352\ldots\\ \frac{g(n)}{2^n} &\longrightarrow 0.98177366685855986593100696758\ldots \end{align*} $$ Let $\alpha = \lim_{n \to \infty} f(n)/2^n$. Even more accurately, we get (for $n$ divisible by $4$) $$ \begin{align*} f(n) &= \alpha \left( 2^n - 2^{n/2} + \frac{11}{28} 2^{n/4} + O(2^{n/8}) \right), \\ g(n) &= \alpha \left( \frac{1}{2} 2^n - \frac{1}{4} 2^{n/2} + \frac{1}{14} 2^{n/4} + O(2^{n/8}) \right). \end{align*} $$ The curious reader can try to prove these expansions, up to the exact value of the constant $\alpha$.
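For reference, here is a rough sketch of how such an experiment can be run (the memoized rewrite, the table size, and the choice of unsigned long long are my assumptions, not part of the original code); it prints ratios approaching the limits quoted above:

#include <stdio.h>

#define N 60   /* by n = 60 the correction term Theta(2^{-n/2}) is negligible */

/* Memoization tables (my addition): the naive recursion takes Theta(2^n) time,
   but only the arguments 0..N ever occur, so caching makes the computation fast.
   f(60) is about 1.96 * 2^60, which still fits in 64 bits. */
unsigned long long fmemo[N + 1], gmemo[N + 1];
int fdone[N + 1], gdone[N + 1];

unsigned long long G(int x);

unsigned long long F(int x)
{
  if (x < 1) return 1;
  if (!fdone[x]) { fmemo[x] = F(x - 1) + G(x); fdone[x] = 1; }
  return fmemo[x];
}

unsigned long long G(int x)
{
  if (x < 2) return 1;
  if (!gdone[x]) { gmemo[x] = F(x - 1) + G(x / 2); gdone[x] = 1; }
  return gmemo[x];
}

int main(void)
{
  for (int n = 10; n <= N; n += 10) {
    double pow2n = (double)(1ULL << n);
    printf("n=%2d  f(n)/2^n = %.12f  g(n)/2^n = %.12f\n",
           n, (double)F(n) / pow2n, (double)G(n) / pow2n);
  }
  return 0;
}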

Indeed, let $f'(x) = f(x)/2^x$. Then $$ f'(x) = f'(x-1) + 2^{\lfloor x/2 \rfloor - 1 - x} f'(\lfloor x/2 \rfloor - 1) + \cdots. $$ Easy induction shows that $f'(x) = O(x)$ and so $$ f'(x) = f'(x-1) + O\left(\frac{x\log x}{2^{x/2}}\right). $$ Since the series $\sum_{x \geq 1} (x\log x)/2^{x/2}$ converges and the sequence $f'(x)$ is increasing, $f'(x)$ tends to some limit. Using the estimate $f'(x) = \Theta(1)$ and being more careful, we obtain $$ f'(x) = f'(x-1) + \Theta(2^{-x/2} + 2^{-3x/4} + \cdots) = f'(x-1) + \Theta(2^{-x/2}). $$ Let $C = \lim_{t \to \infty} f'(t)$. Then $$ f'(x) = C - \Theta\left(\sum_{t=x+1}^\infty 2^{-t/2}\right) = C - \Theta(2^{-x/2}). $$ This shows that $$ f(x) = C \cdot 2^x - \Theta(2^{x/2}), $$ which in turn implies that $$ g(x) = f(x) - f(x-1) = \frac{C}{2} 2^x + O(2^{x/2}). $$ The rest of the terms in the asymptotic expansion depend on the least-significant bits of $x$, and are left to the interested reader.
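To spell out the convergence step above: since each increment $f'(t) - f'(t-1)$ is nonnegative, $$ f'(x) = f'(0) + \sum_{t=1}^{x} \bigl(f'(t) - f'(t-1)\bigr) \leq f'(0) + \sum_{t \geq 1} O\!\left(\frac{t\log t}{2^{t/2}}\right) < \infty, $$ so the nondecreasing sequence $f'(x)$ is bounded above and hence tends to a finite limit, namely the $C$ used above.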

Yuval Filmus