Here's how I would prove this. Since $\lim_{x \to \infty}f(x)=a$, it follows that $\lim_{x \to \infty}\bigl(f(x + 1) - f(x)\bigr)=0$. By the mean value theorem we then have $\lim_{x \to \infty}\frac{f(x)-f(x-1)}{x-(x-1)}=\lim_{c \to \infty}f'(c)=0$. QED. However, a quick search shows that $f(x)=\sin(x^a)/x$ is a counterexample. What gives?
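For concreteness, here is a quick check of that counterexample with the limit value written in explicitly, i.e. taking $f(x) = a + \frac{\sin(x^2)}{x}$ (this particular form is my assumption about what was intended; see also the comments below). Differentiating,
$$f'(x) = 2\cos(x^2) - \frac{\sin(x^2)}{x^2},$$
so $f(x) \to a$ as $x \to \infty$, yet $f'(x)$ keeps oscillating between values close to $\pm 2$ and has no limit.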
- You only have a sequence of $c$ values for which $f'(c)$ approaches $0$. You certainly don't have your limit statement. – Ted Shifrin Aug 13 '14 at 00:02
- The counter-example does not have $f(x)\to a$. I think you mean $f(x) = a + \frac{\sin(x^{k})}{x}$ for some $k\geq 2$. – Winther Aug 13 '14 at 00:04
- @Winther I think $a$ in the original function denotes some arbitrary constant; as such, the first condition is satisfied since $f \to 0$. – ThisIsNotAnId Aug 13 '14 at 00:21
- The change of variables is not justified in your proof. Indeed, $\lim_{x \to \infty}\frac{f(x)-f(x-1)}{x-(x-1)} \not = \lim_{c \to \infty}f'(c)$. Moreover, the proof will be valid only for a single variable; in your counterexample, $a$ is to be considered a separate variable even though it denotes a constant. – ThisIsNotAnId Aug 13 '14 at 00:25
- There is a simpler proof; consider what you may claim about $f$ given $\lim_{x \to \infty} f(x) = 0$. You will have to assume the function is differentiable, and therefore continuous. – ThisIsNotAnId Aug 13 '14 at 00:29
1 Answer
Suppose that $a,b\in \mathbb{R}$ with $a\neq b$.
Then the mean value theorem states that if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then there is a $c\in (a,b)$ such that $$f(b)-f(a)=f'(c)(b-a).$$ You are using the mean value theorem the wrong way.
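To spell out how the theorem would actually be applied here (my own elaboration of the point above): on each interval $[x-1,x]$ the mean value theorem only gives some point $c_x \in (x-1,x)$ with
$$f(x)-f(x-1)=f'(c_x),$$
so $f'(c_x) \to 0$ along that particular family of points $c_x \to \infty$. That is weaker than $\lim_{c \to \infty} f'(c) = 0$, which would require $f'(c)$ to be small for all sufficiently large $c$.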
There is an elegant answer to this using the l'Hopital trick. Look at this post: Limit of the derivative of a function as x goes to infinity
Edit: The proof in the above link assumes that $\lim_{x\to \infty} f'(x)$ exists.
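For readers who do not follow the link, the trick (as I understand it from that post, and under the extra hypothesis that $\lim_{x\to \infty}\bigl(f(x)+f'(x)\bigr)$ exists) runs roughly as follows:
$$\lim_{x\to \infty} f(x) = \lim_{x\to \infty} \frac{e^x f(x)}{e^x} = \lim_{x\to \infty} \frac{e^x\bigl(f(x)+f'(x)\bigr)}{e^x} = \lim_{x\to \infty}\bigl(f(x)+f'(x)\bigr),$$
where the middle equality is l'Hopital applied to the $\frac{\infty}{\infty}$ form. Since $\lim f(x)$ exists, subtracting it from both sides gives $\lim_{x\to \infty} f'(x) = 0$. Without that extra hypothesis the conclusion can fail, which is exactly the point discussed in the comments below.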
- But note that in the linked problem there is an extra assumption, namely that $\lim f'(x)$ exists. – David Aug 13 '14 at 00:31
- @David Yes, but in the solution $\lim_{x\rightarrow \infty }f'(x)$ need not exist. – Alistair Aug 13 '14 at 00:38
- I only looked at the first three answers, but they all say near the start "assume $\lim f'(x)=L$" or something similar. Besides, if you don't assume the limit exists then the counterexample suggested in Winther's comment on this question is correct, isn't it? – David Aug 13 '14 at 00:45
- @David Look at the answer with the l'Hopital trick (Bill Dubuque's answer) and the comments to that answer. – Alistair Aug 13 '14 at 00:52
- OK, very nice, but he is still assuming that $f+f'$ has a limit; that means it does not apply to the present problem, in which it is given only that $f$ has a limit. And Winther's counterexample is still good, I think. – David Aug 13 '14 at 01:02
- @David Yes, Winther's counterexample shows that $\lim\limits_{x\to\infty} f'(x)$ need not exist if the limit of $f(x)$ exists. You think right. – Daniel Fischer Aug 13 '14 at 01:12
- @David Now you mention it, I took a closer look. I think they are talking about the equality $\lim(f+f')=\lim f$ when they mention the non-existence of $f'$. This table (http://i.stack.imgur.com/qS98q.jpg), which is linked from the comments, summarises the situation very clearly: if $\lim f'$ fails to exist then $\lim(f+f')$ fails to exist, and if $\lim f'$ exists then it must be $0$. I think you are right: $\lim f'$ must exist. – Alistair Aug 13 '14 at 01:14
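To make the last point explicit (my own addition, not from the thread): since limits add, if $\lim_{x\to \infty} f(x)$ and $\lim_{x\to \infty}\bigl(f(x)+f'(x)\bigr)$ both exist, then
$$\lim_{x\to \infty} f'(x) = \lim_{x\to \infty}\bigl(f(x)+f'(x)\bigr) - \lim_{x\to \infty} f(x)$$
also exists. So assuming a limit for $f+f'$ (as in the linked answer), together with the given limit for $f$, already forces $\lim f'$ to exist, consistent with the table mentioned above.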