What I'm trying to prove is probably simple, but I want to know whether this particular method is valid. The goal is to show that $$\lim_{x \rightarrow \infty} f(x)=0 \ \ \ \rightarrow \ \ \ \lim_{x \rightarrow \infty} f'(x)=0$$ assuming $f$ is differentiable. Now $\lim_{x \rightarrow \infty} f(x)=0$ implies that for any real $h$ we have $$\lim_{x \rightarrow \infty} \left[f(x+h)-f(x)\right] =0$$ If we restrict $h$ to be non-zero, dividing by $h$ gives $$\lim_{x \rightarrow \infty} \frac{f(x+h)-f(x)}{h}=0$$ Now define $g(h)$ to be the limit above; then $$\lim_{h \rightarrow 0} \ g(h)=0$$ But $$\lim_{h \rightarrow 0} \ g(h)=\lim_{x \rightarrow \infty} f'(x)$$ and therefore $$\lim_{x \rightarrow \infty} f'(x)=0$$
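To make the step in question explicit (the shorthand $D(x,h)$ is introduced only here, it is not part of the argument above): writing $$D(x,h)=\frac{f(x+h)-f(x)}{h},$$ the last two displays amount to the interchange $$\lim_{h \rightarrow 0} \ \lim_{x \rightarrow \infty} D(x,h)=\lim_{x \rightarrow \infty} \ \lim_{h \rightarrow 0} D(x,h)=\lim_{x \rightarrow \infty} f'(x),$$ so the argument rests on being able to swap the two limits.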
-
You've switched the order of your limits, so you're saying $\lim_{x\to\infty}\lim_{h\to0}=\lim_{h\to0}\lim_{x\to\infty}$. I think you need to justify that step. You can't always switch the order. – Gregory Grant Jan 23 '16 at 11:11
-
Is that not allowed in general? – JacksonFitzsimmons Jan 23 '16 at 11:12
-
If it were allowed in general, it's certainly not obvious, so it needs to be justified. But actually it's not true in general. – Gregory Grant Jan 23 '16 at 11:13
-
Here's an example showing you cannot always switch orders: http://math.stackexchange.com/questions/15240/when-can-you-switch-the-order-of-limits – Gregory Grant Jan 23 '16 at 11:14
-
In general it would be the problem of determining if $\lim_{x \rightarrow a} \ \lim_{y \rightarrow b} f(x,y)=\lim_{y \rightarrow b} \ \lim_{x \rightarrow a} f(x,y)$ – JacksonFitzsimmons Jan 23 '16 at 11:15
-
Yes, but that's not true in absolute generality. – Gregory Grant Jan 23 '16 at 11:16
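A standard small illustration of the phenomenon (the function $F$ here is ad hoc, not from the linked thread): take $F(x,y)=\dfrac{x}{x+y}$ near $(0,0)$. Then $$\lim_{x \rightarrow 0} \ \lim_{y \rightarrow 0} \frac{x}{x+y}=\lim_{x \rightarrow 0} 1=1, \qquad \lim_{y \rightarrow 0} \ \lim_{x \rightarrow 0} \frac{x}{x+y}=\lim_{y \rightarrow 0} 0=0,$$ so the two iterated limits can differ.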
-
Yeah I just now saw your previous comment. – JacksonFitzsimmons Jan 23 '16 at 11:17
-
Consider a function that goes to zero but is wiggly, so that the derivative itself doesn't actually converge anywhere. – Gregory Grant Jan 23 '16 at 11:21
-
I actually did consider that but I couldn't think of a function that satisfies that property. Do you know of any? – JacksonFitzsimmons Jan 23 '16 at 11:26
2 Answers
Consider something like $f(x)=\min\{\sin x,1/x\}$ where $\sin x\geq0$ and $\max\{\sin x,-1/x\}$ where $\sin x<0$. That function won't be differentiable everywhere; in particular it won't be differentiable at the points where $\sin x$ meets the curves $\pm1/x$. But you can imagine smoothing out those corners so that it is differentiable everywhere, since they form just a countable discrete set of points. Let $g$ be $f$ smoothed at those points.
Then the function $g$ would have limit zero as $x\to\infty$, but its derivative would equal one at every multiple of $2\pi$, so the derivative will not converge to $0$ (in fact it has no limit at all) as $x\to\infty$.
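To spell this out (same $g$ as above; note the corners that get smoothed lie where $\sin x=\pm1/x$, which for large $x$ stay away from the multiples of $2\pi$): for large integers $k$ we have $|\sin x|<1/x$ in a small neighbourhood of $x=2\pi k$, so $g$ agrees with $\sin x$ there, and $$g'(2\pi k)=\cos(2\pi k)=1 \quad \text{for every large integer } k,$$ while $g(x)\rightarrow 0$ as $x\rightarrow\infty$. A derivative that keeps returning to the value $1$ arbitrarily far out cannot converge to $0$.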
The problem with your proof is that you reverse the order of the two limits; you cannot always do that.

-
No problem. It wasn't exactly right as I first posted it, but I've edited it and I think it's right now. – Gregory Grant Jan 23 '16 at 11:30
I liked the idea, but the derivative is the limit as $h$ goes to zero, not to infinity.
I'd try to prove it using the formal definition, as real analysis tells us to do. Your question is whether what you did is right, so I won't write out a proof right now, but if you'd prefer one, just tell me.
The approach I'd take: start from the definition of the derivative and the fact that $f(x)$ has limit $0$, then manipulate the expressions to rewrite the initial limit in terms of the derivative (using epsilons and deltas...).

-
It turns out the result is not true; we found a counter-example (see the other answer I posted), and the mistake is where he switches the order of the two limits. – Gregory Grant Jan 23 '16 at 12:07
-
Oh, that's true. I didn't pay full attention to your answer because I was out on the street. Sorry. – Enrique René Jan 23 '16 at 14:23