I didn't find this result in the literature, so I would like to ask you to review my proof of it:
Let $f: \,]a,\infty[\, \to \mathbb{R}$ be of class $C^1(]a,\infty[)$ and assume that $\lim_{t \to +\infty} f(t) = L \in \mathbb{R}$.
Claim: there exists a sequence $t_n \to +\infty$ such that $f'(t_n) \to 0$.
Proof:
Since $\lim_{t \to +\infty} f(t) = L \in \mathbb{R}$, for every $\epsilon > 0$ there exists $M > 0$ such that $t > M \implies |f(t) - L| < \epsilon$.
Taking $\epsilon_n = \frac{1}{n}$ we get a corresponding $M_n$. Applying the mean value theorem to $f$ on $[M_n+1, M_n+2]$, there is a point $\xi_n \in \,]M_n+1, M_n+2[$ such that $$|f'(\xi_n)| = |f'(\xi_n)||(M_n+1)-(M_n+2)| = |f(M_n+1) - f(M_n+2)| = $$ $$= |f(M_n+1) - L + L - f(M_n+2)| \le |f(M_n+1) - L| + |L - f(M_n+2)| < \frac{2}{n},$$
and clearly $f'(\xi_n) \to 0$.
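As a sanity check (this example is not part of the proof, just an illustration of why only a sequence, and not $f'(t) \to 0$ itself, can be expected): for $f(t) = \frac{\sin(t^2)}{t}$ on $]0,\infty[$ one has $f(t) \to 0$, but $$f'(t) = 2\cos(t^2) - \frac{\sin(t^2)}{t^2}$$ has no limit as $t \to +\infty$; nevertheless, along $t_n = \sqrt{\left(n+\tfrac{1}{2}\right)\pi}$ the cosine term vanishes, so $f'(t_n) \to 0$, as the claim predicts.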
It remains to guarantee that $\xi_n \to \infty$. To do so, I think a strategy would be to redefine the sequence as $M_0' = M_0$ and $M_n' = \max\{M_{n-1}'+1, M_n\}$, and to pick $\xi_n$ in $]M_n'+1, M_n'+2[$ instead.
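If I am not mistaken, with this choice the estimate above still holds, since $M_n' \ge M_n$, and in addition $$M_n' \ge M_{n-1}' + 1 \ge \dots \ge M_0 + n \to \infty,$$ so $\xi_n > M_n' + 1$ forces $\xi_n \to +\infty$, and $t_n := \xi_n$ would then be the required sequence.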
Please let me know whether this proof is right, or whether there are any improvements that could be made to it.