Growth and time complexity are not the same, though in this case they probably have similar values (up to a multiplicative constant). The recurrences for the time complexity are
$$
\begin{align*}
F(x) &= \begin{cases} \Theta(1) & x < 1, \\ F(x-1) + G(x) + \Theta(1) & x \geq 1. \end{cases} \\
G(x) &= \begin{cases} \Theta(1) & x < 2, \\ F(x-1) + G(\lfloor x/2 \rfloor) + \Theta(1) & x \geq 2. \end{cases}
\end{align*}
$$
If you're uncomfortable with the notation $\Theta(1)$, mentally replace it with $1$. The difference between these recurrences and the ones for $f,g$ is the additional $\Theta(1)$ term in the inductive case (the second line of each).
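To make the recurrences concrete, here is a small sketch that tabulates the operation counts bottom-up, with every $\Theta(1)$ replaced by $1$ and integer arguments assumed (the original code for $f$ and $g$ is not shown in this excerpt, so the base values are an assumption):

```python
def step_counts(n):
    """Tabulate F(x) and G(x) for x = 0..n, with every Theta(1) taken as 1.

    F(x) = 1 for x < 1, else F(x-1) + G(x) + 1
    G(x) = 1 for x < 2, else F(x-1) + G(x // 2) + 1

    Note that G recurses on F(x-1); recursing on F(x) instead would make
    the pair of recurrences circular.
    """
    F = [0] * (n + 1)
    G = [0] * (n + 1)
    F[0] = 1              # base case: x < 1
    G[0] = 1              # base cases: x < 2
    if n >= 1:
        G[1] = 1
    for x in range(1, n + 1):
        if x >= 2:
            G[x] = F[x - 1] + G[x // 2] + 1
        F[x] = F[x - 1] + G[x] + 1
    return F, G

F, G = step_counts(20)
# F more than doubles at every step: F[:6] == [1, 3, 9, 21, 49, 105]
```

Since each $G(x)$ adds at least $F(x-1)$ on top of $F(x-1)$ itself, $F$ at least doubles with every increment of $x$, which previews the $2^x$ growth derived below.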
Ignoring the floors for the moment, we have
$$ g(x) = f(x-1) + f(x/2-1) + f(x/4-1) + \cdots, $$
the sum ending when the argument is below $1$. Therefore
$$ f(x) = f(x-1) + g(x) = 2f(x-1) + f(x/2-1) + f(x/4-1) + \cdots. $$
The recurrence $f'(x) = 2f'(x-1)$ with a base case $f'(0) = 1$ already has the solution $f'(x) = 2^x$, so we know that $f(x) = \Omega(2^x)$. Since $2^x$ grows so fast, we know that $f(x/2-1) + f(x/4-1) + \cdots \ll 2f(x-1)$, and this shows for example that $f(x) = O((2+\epsilon)^x)$ for all $\epsilon > 0$.
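Both bounds are easy to check numerically with the growth recurrences for $f, g$ (taking the hypothetical base value $1$, since the true base values are not given in this excerpt):

```python
def growth(n):
    """Tabulate the growth recurrences (assumed base value 1):
    f(x) = 1 for x < 1;  g(x) = 1 for x < 2;
    g(x) = f(x-1) + g(x // 2);  f(x) = f(x-1) + g(x)."""
    f = [0] * (n + 1)
    g = [0] * (n + 1)
    f[0] = 1
    g[0] = 1
    if n >= 1:
        g[1] = 1
    for x in range(1, n + 1):
        if x >= 2:
            g[x] = f[x - 1] + g[x // 2]
        f[x] = f[x - 1] + g[x]
    return f, g

f, g = growth(30)
# Lower bound: f(x) >= 2 f(x-1) gives f(n) >= 2^n.
# Upper bound: f(n)/f(n-1) exceeds 2 only by the tiny contribution
# of f(n/2 - 1) + f(n/4 - 1) + ..., so the ratio tends to 2 from above.
```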
At this point one can probably guess and prove an asymptotic formula for $f$, but I'll leave that for others to do. Given $f$, it should be easy to estimate $g$. Indeed, it seems probable that $g = \Theta(f)$.
Update: Experiments show that
$$
\begin{align*}
\frac{f(n)}{2^n} &\longrightarrow 1.9635473337171197318620139352\ldots\\
\frac{g(n)}{2^n} &\longrightarrow 0.98177366685855986593100696758\ldots
\end{align*}
$$
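These limits are straightforward to reproduce. A sketch, again assuming base value $1$ for both recurrences (the reported digits depend on the base cases, which are not shown in this excerpt):

```python
def ratios(n):
    """Return f(n)/2^n and g(n)/2^n for the recurrences
    f(x) = f(x-1) + g(x), g(x) = f(x-1) + g(x // 2), assumed base value 1."""
    f = [0] * (n + 1)
    g = [0] * (n + 1)
    f[0] = g[0] = 1
    g[1] = 1
    for x in range(1, n + 1):
        if x >= 2:
            g[x] = f[x - 1] + g[x // 2]
        f[x] = f[x - 1] + g[x]
    return f[n] / 2 ** n, g[n] / 2 ** n

fr, gr = ratios(200)
# fr approaches ~1.96354733..., and gr is half of that to double precision,
# consistent with g(x) = f(x) - f(x-1).
```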
Let $\alpha = \lim_{n \to \infty} f(n)/2^n$.
Even more accurately, we get (for $n$ divisible by $4$)
$$
\begin{align*}
f(n) &= \alpha \left( 2^n - 2^{n/2} + \frac{11}{28} 2^{n/4} + O(2^{n/8}) \right), \\
g(n) &= \alpha \left( \frac{1}{2} 2^n - \frac{1}{4} 2^{n/2} + \frac{1}{14} 2^{n/4} + O(2^{n/8}) \right).
\end{align*}
$$
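A numeric sanity check of the first expansion (same assumed recurrences and base value $1$ as before; $\alpha$ is estimated from a large index rather than known in closed form):

```python
def table(n):
    """f(x) = f(x-1) + g(x), g(x) = f(x-1) + g(x // 2), assumed base value 1."""
    f = [0] * (n + 1)
    g = [0] * (n + 1)
    f[0] = g[0] = g[1] = 1
    for x in range(1, n + 1):
        if x >= 2:
            g[x] = f[x - 1] + g[x // 2]
        f[x] = f[x - 1] + g[x]
    return f

f = table(200)
alpha = f[200] / 2 ** 200   # numeric estimate of the limit f(n)/2^n
n = 40                      # chosen divisible by 4, as the expansion requires
predicted = alpha * (2 ** n - 2 ** (n // 2) + 11 / 28 * 2 ** (n // 4))
# predicted agrees with f[40] up to the O(2^{n/8}) = O(2^5) error term
```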
The curious reader can try to prove these expansions, up to the exact value of the constant $\alpha$.
Indeed, let $f'(x) = f(x)/2^x$ (redefining $f'$; this is unrelated to the $f'$ used earlier). Then
$$ f'(x) = f'(x-1) + 2^{\lfloor x/2 \rfloor - 1 - x} f'(\lfloor x/2 \rfloor - 1) + \cdots. $$
Easy induction shows that $f'(x) = O(x)$ and so
$$ f'(x) = f'(x-1) + O\left(\frac{x\log x}{2^{x/2}}\right). $$
Since the series $\sum_{x \geq 1} (x\log x)/2^{x/2}$ converges and the sequence $f'(x)$ is increasing, $f'(x)$ tends to some limit. Using the estimate $f'(x) = \Theta(1)$ and being more careful, we obtain
$$ f'(x) = f'(x-1) + \Theta(2^{-x/2} + 2^{-3x/4} + \cdots) = f'(x-1) + \Theta(2^{-x/2}). $$
Let $C = \lim_{t \to \infty} f'(t)$. Then
$$ f'(x) = C - \Theta\left(\sum_{t=x+1}^\infty 2^{-t/2}\right) = C - \Theta(2^{-x/2}). $$
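Numerically, the normalized deviation $(C - f'(x))\,2^{x/2}$ does stay bounded, as the last display predicts. A sketch with the same assumed base value $1$, estimating $C$ from a large index:

```python
def fprime(n):
    """Return the list f'(x) = f(x) / 2^x for x = 0..n, where
    f(x) = f(x-1) + g(x), g(x) = f(x-1) + g(x // 2), assumed base value 1."""
    f = [0] * (n + 1)
    g = [0] * (n + 1)
    f[0] = g[0] = g[1] = 1
    for x in range(1, n + 1):
        if x >= 2:
            g[x] = f[x - 1] + g[x // 2]
        f[x] = f[x - 1] + g[x]
    return [f[x] / 2 ** x for x in range(n + 1)]

fp = fprime(200)
C = fp[200]   # numeric estimate of the limit C = lim f'(t)
# Sample the deviation at a few indices divisible by 4 (the constants for
# other residues differ, per the remark on the LSBs of x below):
dev = [(C - fp[x]) * 2 ** (x / 2) for x in (40, 60, 80)]
# each sampled deviation is Theta(1): bounded away from 0 and infinity
```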
This shows that
$$ f(x) = C 2^x - \Theta(2^{x/2}), $$
which in turn implies that
$$ g(x) = f(x) - f(x-1) = \frac{C}{2} 2^x + O(2^{x/2}). $$
The rest of the terms in the asymptotic expansion depend on the LSBs of $x$, and are left to the interested reader.
Comments:

What techniques do you know for finding the running time of a function? Do you know about recurrence relations? Please edit the question to show us what you've tried and where you got stuck. This is not a "homework help" site where you can just copy-paste your exercise and have us solve it for you; we expect you to make a serious effort on your own. However, if you show us what you've tried and have a specific question about it, we might be able to help. – D.W. Nov 24 '13 at 05:25

… f(x), as a function of x (which is just another way to say, they are asking you to find the time complexity of f(x)). – D.W. Nov 24 '13 at 05:26