Added: Here's a more concise way of phrasing the argument in (II.) and (III.), that is, of completing the proof once it is established that $f$ must be eventually monotonic (see Paramanand's argument, which I replicate in I). If $f$ is eventually monotonic and satisfies $(1)$ but not $(2)$, then $f$ must eventually be bounded away from $0$ (if $f$ tended to $0$, then $(1)$ would force $f'\to0$ as well and $(2)$ would hold), and $f'$ must eventually have constant sign. In that case, $(1)$ implies that $\lim_{x\to\infty} f'(x)/|f(x)|^{b/a}$ exists and is equal to either $1$ or $-1$ (a computation spelled out below). Since $b/a>1$, this can't be true; otherwise, for sufficiently large $A$,
\begin{align*}
\limsup_{x\to\infty} \left|{1\over|f(x)|^{b/a-1}} - {1\over|f(A)|^{b/a-1}}\right| &= \limsup_{x\to\infty} \left|(-b/a+1)\int_A^x {f'(t)\over |f(t)|^{b/a}}\,dt\right|\\
& = |-b/a+1|\int_A^\infty {|f'(t)|\over |f(t)|^{b/a}}\,dt \\
& \geq |-b/a + 1| \int_A^\infty {1\over2}\,dt,
\end{align*}
say. The last quantity is infinite, but the first must be finite since $f$ is bounded away from $0$, and this is our contradiction. (Like I said, this is basically a rephrasing of the argument below.)
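To spell out the limit claim used above: once $|f(x)|\geq c > 0$ for all $x\geq A$, condition $(1)$ gives
\begin{align*}
{|f'(x)|\over|f(x)|^{b/a}} = \left({|f'(x)|^a\over|f(x)|^{b}}\right)^{1/a} = \left(1 + {|f'(x)|^a - |f(x)|^b\over|f(x)|^{b}}\right)^{1/a} \to 1 \quad\text{as } x\to\infty,
\end{align*}
since the numerator of the inner fraction tends to $0$ while its denominator is at least $c^b$. Because $f'$ eventually has constant sign, $f'(x)/|f(x)|^{b/a}$ itself then tends to $1$ or $-1$; in particular $|f'(t)|/|f(t)|^{b/a}\geq 1/2$ for all sufficiently large $t$, which is the bound used in the last line of the display.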
Here is, I think, the (or at least a) proper generalization, together with a complete proof (borrowing, in places, from Paramanand's answer): If $f$ is differentiable on $[0,\infty)$ and there exist $a$ and $b$ with $0<a<b$ such that
\begin{align*}
\lim_{x\to\infty}\left(|f'(x)|^a - |f(x)|^b\right) = 0, \tag{1}
\end{align*}
then
\begin{align*}
\lim_{x\to\infty}f'(x) = \lim_{x\to\infty} f(x) = 0. \tag{2}
\end{align*}
This resolves China Math's problem, since $\left|f'(x)^2 + f(x)^3\right|\geq \left||f'(x)|^2-|f(x)|^3\right|$, and so we can take $a = 2$ and $b = 3$. I suspect that the statement is sharp in the sense that, whenever $0< b\leq a$, there exist functions that satisfy $(1)$ but not $(2)$, though I haven't really attempted to construct them. (mike's answer suggests that this is true for $b<a$, and the exponential function shows that it is certainly true when $a = b$; on the other hand, $(2)$ is satisfied when $a = b = 1$ if the quantity in $(1)$ is replaced with $f'(x) + f(x)$.) I suppose you could also ask what happens when $a$ and $b$ are permitted to be negative, but I'm tired.
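For the case $a = b$, the exponential example is immediate: with $f(x) = e^x$,
\begin{align*}
|f'(x)|^a - |f(x)|^a = e^{ax} - e^{ax} = 0 \quad\text{for every $x$},
\end{align*}
so $(1)$ holds for any $a = b > 0$, while $f(x)\to\infty$ and $(2)$ fails.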
Anyway, to prove this, I'll follow Paramanand's lead to show (I.) that a function satisfying $(1)$ but not $(2)$ must be eventually monotonic; (II.) that such a function must be unbounded; and (III.) that no function satisfying $(1)$ can be unbounded. This last step—that no unbounded, monotonic function can satisfy $(1)$ but not $(2)$—is the only thing that Paramanand didn't get around to proving, and it basically boils down to the proposition that there is no unbounded, monotonic function $f$ eventually satisfying $|f'|>|f|^\nu$ for some $\nu>1$. Apologies for the recitation of Paramanand's arguments toward the beginning, but I wanted to be complete.
I. If $f$ is not eventually monotonic, then, as Paramanand has pointed out, the set of points, call it $E$, at which it attains a local extremum is unbounded. Since $f'$ vanishes at each point of $E$, equation $(1)$ reduces along $E$ to $-|f(x)|^b\to0$, and so $f(x)\to0$ as $x\to\infty$ through $E$. But $\limsup_{x\to\infty}|f(x)| = \limsup_{x\in E,\, x\to\infty} |f(x)|$, and so it must be that, if $f$ is not eventually monotonic, $f(x)\to0$ as $x\to\infty$; equation $(1)$ then gives $f'(x)\to0$ as well, so $(2)$ holds.
II. We may now suppose $f$ to be monotonic. If it is bounded, then $f(x)$ does tend to a limit, call it $L$, as $x\to\infty$. The equation $(1)$ then implies that $|f'(x)|^a = |f(x)|^b + o(1)\to|L|^b$, so that $|f'(x)|$ converges to $|L|^{b/a}$ as $x\to\infty$. A variation on Paramanand's mean value argument then establishes that $L = 0$:
\begin{align*}
\text{$|f(x+1) - f(x)| = |f'(\xi)|$ for some $\xi\in(x,x+1)$};
\end{align*}
as $x\to\infty$, the left side of the equation vanishes while the right side necessarily tends to $|L|^{b/a}$. Thus $L = 0$; and since $(1)$ then forces $f'(x)\to0$ as well, we are done in case $f$ is bounded.
III. Finally, assume that $f$ is unbounded. (We also retain, of course, the assumption that $f$ is monotonic.) Then $|f(x)|\to\infty$ and $f$ does not vanish for sufficiently large $x$; I'm going to show that this cannot happen, the point being that $(1)$ forces $|f'|$ to exceed a power of $|f|$ strictly greater than the first. Since $0<a<b$, we may choose $\nu$ with $1<\nu < b/a$. I claim now that $|f'(x)|>|f(x)|^\nu$ for all sufficiently large $x$. Otherwise, the set of $x$ for which
\begin{align*}
|f'(x)|^a - |f(x)|^b \leq |f(x)|^{a\nu} - |f(x)|^b
\end{align*}
would be unbounded, and that is not so because the left hand side is supposed to vanish in the limit $x\to\infty$, while the right hand side tends to $-\infty$ by virtue of the facts that $|f(x)|\to\infty$ with $x$ and $0 < a\nu < b$. My claim is justified.
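Spelled out, the divergence of the right-hand side is just
\begin{align*}
|f(x)|^{a\nu} - |f(x)|^{b} = -|f(x)|^{a\nu}\left(|f(x)|^{b-a\nu} - 1\right) \to -\infty \quad\text{as } x\to\infty,
\end{align*}
since $b - a\nu > 0$ and $|f(x)|\to\infty$.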
But now consider the function $g(x) = |f(x)|^{-\nu+1}$. Because $f(x)$ doesn't vanish for sufficiently large $x$, $g$ is differentiable for sufficiently large $x$, and because $\nu>1$ and $|f(x)|\to\infty$ with $x$, we can be sure that $g(x)\to0$ as $x\to\infty$. We can also be sure, however, that for $x$ sufficiently large (large enough that $g$ is differentiable at $x$ and $|f'(x)|>|f(x)|^\nu$)
\begin{align*}
|g'(x)| = \left|(-\nu+1)\,|f(x)|^{-\nu}\operatorname{sgn}(f(x))\,f'(x)\right| = |-\nu+1|\,{|f'(x)|\over|f(x)|^{\nu}} > |-\nu + 1|.
\end{align*}
This, together with the mean value theorem, raises a problem: if $y$ and $x$ are sufficiently large, as above, and $y < x$, then, for some $\xi\in (y,x)$,
\begin{align*}
|g(x) - g(y)| & = |g'(\xi)(x-y)| \\
& > |-\nu + 1||x-y|.
\end{align*}
Since $|-\nu+1|>0$, the right hand side tends to infinity with $x$ if we hold $y$ fixed. That is the desired contradiction, because we should have $\lim_{x\to\infty}g(x) = 0$ and hence the left hand side should tend in the limit $x\to\infty$ to the finite quantity $|g(y)|$.
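For orientation, here is a model computation behind the proposition quoted in the outline (that no unbounded, monotonic function can eventually satisfy $|f'|>|f|^\nu$ with $\nu>1$): in the extremal case $f' = f^\nu$ with $f>0$, separating variables gives
\begin{align*}
f'(x) = f(x)^\nu \quad\Longrightarrow\quad f(x) = \bigl((\nu-1)(C-x)\bigr)^{-1/(\nu-1)}
\end{align*}
for some constant $C$, which blows up as $x\to C^-$ and so cannot exist on all of $[0,\infty)$; heuristically, a larger $|f'|$ only accelerates the blow-up, and the mean value argument above is one way of making that precise.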