9

Problem:
If $f(x)$ is continuous at $x=0$ and $\lim\limits_{x\to 0} \dfrac{f(ax)-f(x)}{x}=b$, where $a$ and $b$ are constants with $|a|>1$, prove that $f'(0)$ exists and $f'(0)=\dfrac{b}{a-1}$.
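Before going further, a quick numerical sanity check of the claimed formula (an illustration only, not a proof; the concrete choices $f=\sin$, $a=3$ are mine):

```python
import math

# With f = sin and a = 3 we expect b = 3*cos(0) - cos(0) = 2,
# hence b/(a-1) = 1, which is indeed f'(0) = cos(0).
f, a = math.sin, 3.0

for x in [1e-1, 1e-3, 1e-5, 1e-7]:
    b_approx = (f(a * x) - f(x)) / x
    print(f"x = {x:.0e}:  (f(ax)-f(x))/x = {b_approx:.6f},  "
          f"b/(a-1) = {b_approx / (a - 1):.6f}")
```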

This approach is definitely wrong:

\begin{align} b&=\lim_{x\to 0} \frac{f(ax)-f(x)}{x}\\ &=\lim_{x\to 0} \frac{f(ax)-f(0)-(f(x)-f(0))}{x}\\ &=af'(0)-f'(0)\\ &=(a-1)f'(0) \end{align}

Here is an example showing why this approach is wrong:

\[f(x)= \begin{cases} 1,&x\neq0\\ 0,&x=0 \end{cases}\] Then $\lim_{x\to0}\dfrac{f(3x)-f(x)}{x}=\lim_{x\to0} \dfrac{1-1}{x}=0$,
but neither $\lim_{x\to0}\dfrac{f(3x)}{x}$ nor $\lim_{x\to0}\dfrac{f(x)}{x}$ exists (both are unbounded near $0$). (This $f$ is not continuous at $0$, so it does not contradict the problem; it only shows that splitting the limit of a difference into a difference of limits needs justification.)
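A tiny numerical illustration of this (the helper `f` below just mirrors the piecewise definition above):

```python
# The combined quotient is 0 for every x != 0, while f(x)/x blows up.
def f(x):
    return 1.0 if x != 0 else 0.0

for x in [1e-1, 1e-3, 1e-5]:
    combined = (f(3 * x) - f(x)) / x   # always (1 - 1)/x = 0
    single = f(x) / x                  # = 1/x, unbounded as x -> 0
    print(f"x = {x:.0e}:  combined = {combined:.1f},  f(x)/x = {single:.1e}")
```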

Does anyone know how to prove it? Thanks in advance!

3 Answers

11

This is a tricky question and the solution is somewhat non-obvious. We know that $$\lim_{x \to 0}\frac{f(ax) - f(x)}{x} = b$$ and hence $$f(ax) - f(x) = bx + xg(x)$$ where $g(x) \to 0$ as $x \to 0$.

Replacing $x$ by $x/a$ we get $$f(x) - f(x/a) = bx/a + (x/a)g(x/a)$$ and, replacing $x$ by $x/a^{k - 1}$ in this last equation, $$f(x/a^{k - 1}) - f(x/a^{k}) = bx/a^{k} + (x/a^{k})g(x/a^{k})$$ Adding such equations for $k = 1, 2, \ldots, n$ (the left side telescopes) we get $$f(x) - f(x/a^{n}) = bx\sum_{k = 1}^{n}\frac{1}{a^{k}} + x\sum_{k = 1}^{n}\frac{g(x/a^{k})}{a^{k}}$$

Letting $n \to \infty$, using the sum of an infinite geometric progression (it converges because $|a| > 1$), and noting that $f$ is continuous at $x = 0$ (so that $f(x/a^{n}) \to f(0)$), we get $$f(x) - f(0) = \frac{bx}{a - 1} + x\sum_{k = 1}^{\infty}\frac{g(x/a^{k})}{a^{k}}$$ Dividing by $x$ and letting $x \to 0$ we get $$f'(0) = \lim_{x \to 0}\frac{f(x) - f(0)}{x} = \frac{b}{a - 1} + \lim_{x \to 0}\sum_{k = 1}^{\infty}\frac{g(x/a^{k})}{a^{k}}$$
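As a numerical aside (my own addition, with the arbitrary choices $f = \exp$, $a = 3$, for which $b = a - 1 = 2$), the conclusion $f(x) - f(0) = bx/(a-1) + x\cdot(\text{something}\to 0)$ can be seen directly:

```python
import math

# With f = exp and a = 3 we have b = 2, so b/(a-1) = 1 = f'(0).
# The 'remainder' below equals the tail sum_k g(x/a^k)/a^k and should -> 0.
f, a, b = math.exp, 3.0, 2.0

for x in [1e-1, 1e-3, 1e-5]:
    remainder = (f(x) - f(0)) / x - b / (a - 1)
    print(f"x = {x:.0e}:  remainder = {remainder:.2e}")
```

The remainder shrinks roughly linearly in $x$ here, consistent with the $\epsilon$-estimate that follows.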

The sum $$\sum_{k = 1}^{\infty}\frac{g(x/a^{k})}{a^{k}}$$ tends to $0$ as $x \to 0$ because $g(x) \to 0$. The proof is not difficult but perhaps not too obvious. Here is one way to do it. Since $g(x)\to 0$ as $x \to 0$, it follows that for any $\epsilon > 0$ there is a $\delta > 0$ such that $|g(x)| < \epsilon$ for all $x$ with $0 <|x| < \delta$. Since $|a| > 1$ it follows that $|x/a^{k}| < \delta$ if $|x| < \delta$ and therefore $|g(x/a^{k})| < \epsilon$. Thus if $0 < |x| < \delta$ we have $$\left|\sum_{k = 1}^{\infty}\frac{g(x/a^{k})}{a^{k}}\right| < \sum_{k = 1}^{\infty}\frac{\epsilon}{|a|^{k}} = \frac{\epsilon}{|a| - 1}$$ and thus the sum tends to $0$ as $x \to 0$.

Hence $f'(0) = b/(a - 1)$.


BTW the result in question holds even if $0 < |a| < 1$. Let $c = 1/a$ so that $|c| > 1$. Now $$\lim_{x \to 0}\frac{f(ax) - f(x)}{x} = b$$ implies that $$\lim_{t \to 0}\frac{f(ct) - f(t)}{t} = -bc$$ (just put $ax = t$). Hence by what we have proved above it follows that $$f'(0) = \frac{-bc}{c - 1} = \frac{b}{a - 1}$$

Note that if $a = 1$ then $b = 0$ trivially and we can't say anything about $f'(0)$. If $a = -1$ then $f(x) = |x|$ provides a counter-example. If $a = 0$ then the result holds trivially by the definition of the derivative. Hence the result in question holds if and only if $|a| \neq 1$.
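To see the $a=-1$ failure concretely, here is a small numerical sketch (my addition):

```python
# f(x) = |x| with a = -1: the hypothesis holds with b = 0, since
# f(-x) - f(x) is identically 0, yet f'(0) does not exist (the
# difference quotient f(x)/x is +1 for x > 0 and -1 for x < 0).
f = abs

for x in [1e-2, 1e-4, -1e-2, -1e-4]:
    print(f"x = {x:+.0e}:  (f(-x)-f(x))/x = {(f(-x) - f(x)) / x:.1f},  "
          f"f(x)/x = {f(x) / x:+.1f}")
```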

  • How "adding such equation" do you get the left side you did? Do you mean when adding over $;k;$ from $;1;$ to $;n;$? Then, you left $;n\to\infty;$ , but why would $;x\to0\implies;$ the second summand in the right tends to zero? Even if the right sum converges for all $;x>R;$ , for some $;R>0;$, how it depends on $;x;$ could makea difference. – DonAntonio Jul 19 '16 at 09:24
  • Yes, adding over $k = 1$ to $n$. The infinite sum on the right tends to $0$ as $x \to 0$. I have kept it as an exercise for the reader. But it appears I need to prove it. Wait for my updated answer. – Paramanand Singh Jul 19 '16 at 09:27
  • @DonAntonio: see my updated answer. – Paramanand Singh Jul 19 '16 at 09:32
  • Thank you. Yet I think this is way too convoluted for an answer, and it may be this exercise comes way before infinite series are studied... – DonAntonio Jul 19 '16 at 09:32
  • @DonAntonio: Agree this question is hard and not suitable for beginners in calculus. I knew the solution from past experience; see this answer: http://math.stackexchange.com/a/568871/72031 – Paramanand Singh Jul 19 '16 at 09:33
  • @Pa I think the other answer is more elementary and uses key arguments from limits and continuity without resorting to series. Yet the OP already accepted this answer so it must satisfy him/her. I don't understand, though, why he/she downvoted that other answer... – DonAntonio Jul 19 '16 at 09:35
  • 1
    @DonAntonio: The other answer only shows that if $f'(0)$ exists then it must be $b/(a - 1)$. But it does not show why $f'(0)$ exists. One of the downvotes for that answer is mine. In mathematics, correctness is more important than anything else. – Paramanand Singh Jul 19 '16 at 09:38
  • Either you didn't follow what I wrote there or you just didn't understand it: the answer does not assume $f'(0)$ exists: it proves it does and shows (well, hints towards) what the value will be, even using what the OP did in her/his question. If you downvoted then you should at least rebut my comments there, imo; otherwise I feel you have downvoted a very nice and correct answer. – DonAntonio Jul 19 '16 at 09:41
1

In this quickly closed question the case $a=2$ is considered, which allows the following simpler solution:

Define $g(x):=f(x)- bx-f(0)$. Then $g$ is continuous at $0$, $g(0)=0$, and $$\lim_{x\to0}{g(2x)-g(x)\over x}=0\ .$$ We have to prove that $g'(0)=\lim_{x\to0}{g(x)\over x}=0$.

Let an $\epsilon>0$ be given. Then there is a $\delta>0$ such that $|g(2t)-g(t)|\leq\epsilon |t|$ for $0<|t|\leq\delta$. Assume $0<|x|\leq\delta$. Then for each $N\in{\mathbb N}$ one has $$g(x)=\sum_{k=1}^N\bigl(g(x/2^{k-1})-g(x/2^k)\bigr)+g(x/2^N)\ ,$$ and therefore $$\bigl|g(x)\bigr|\leq\sum_{k=1}^N\epsilon\,{|x|\over 2^k} \ +\bigl|g(x/2^N)\bigr|\leq\epsilon|x|+\bigl|g(x/2^N)\bigr|\ .$$ Since $N\in{\mathbb N}$ is arbitrary and $g(x/2^N)\to g(0)=0$ as $N\to\infty$ (by continuity of $g$ at $0$), we in fact have $\bigl|g(x)\bigr|\leq \epsilon|x|$, or $\left|{g(x)\over x}\right|\leq\epsilon$, and this for all $x$ with $0<|x|\leq\delta$.
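A quick numerical check of this reduction (with my own choice $f=\sin$, so $b=1$ and $g(x)=\sin x - x$):

```python
import math

# g(x) = sin(x) - 1*x - sin(0): both (g(2x)-g(x))/x and g(x)/x should -> 0.
g = lambda x: math.sin(x) - x

for x in [1e-1, 1e-2, 1e-3]:
    print(f"x = {x:.0e}:  (g(2x)-g(x))/x = {(g(2 * x) - g(x)) / x:.2e},  "
          f"g(x)/x = {g(x) / x:.2e}")
```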

-2

Hint: You're very close.

Write the expression as $$a\frac{f(ax)-f(0)}{ax}-\frac{f(x)-f(0)}{x}$$ Note that $x\to 0$ if and only if $ax\to 0$ (since $a\neq 0$).

Can you see it from this?

MPW
  • I don't know why $f'(0)$ exists. – Jonas Meyer Jul 19 '16 at 08:49
  • @DonAntonio: You are told that this limit exists. My expression shows that the known limit is equal to the limit of $(a-1)$ times the difference quotient needed to find the derivative. Since $a\neq 1$, the limit of the difference quotient must exist as well. There is a basic theorem that $\lim cf =c\lim f$ if $c\neq 0$. – MPW Jul 19 '16 at 08:55
  • 2
    $\lim_{x\to0}\frac{f(0)-f(x)}{x}$ is not the same as $\lim_{x\to0}\frac{f(0)-f(ax)}{ax}$ if we don't know the existence of $f'(0)$, so we can't put them together times $(a-1)$. – Spaceship222 Jul 19 '16 at 09:02
  • 1
    @MPW I'm rewriting this since I think I understand your comment above better now, though it still is pretty messy (the limits are minus the usual ones). – DonAntonio Jul 19 '16 at 09:03
  • 1
    I think, after making some order, both in the above answer and, in particular, in my mind, that this answer is correct: we can write $$b=\lim_{x\to0}\frac{f(ax)-f(x)}x=\lim_{x\to0}\left[a\frac{f(ax)-f(0)}{ax}-\frac{f(x)-f(0)}x\right]=\lim_{t\to0}(a-1)\frac{f(t)-f(0)}t$$ because when $x\to0$ both limits (without the constant $a$) within the brackets are the same, whether they exist or not, because $f$ is given continuous at zero and thus it is the same to take $\lim f(x)$ or $\lim f(ax)$ when $x\to0$. The rightmost expression, compared to the left side, answers all. +1 – DonAntonio Jul 19 '16 at 09:10
  • Cont. Meaning: when you observe both extremes of the expression in my above comment, it is clear that this means the limit defining $f'(0)$ exists and equals the correct $\frac b{a-1}$. Thus the answer deserves not only to have that downvote removed (even if the answer was wrong it is not nice, imo, to rush to downvote. One can argue and then at least wait for some time... and then, after a nice amount of time has passed, if no change is made in the answer then downvote... and even then, what for??) but to be upvoted and, imo, even accepted. – DonAntonio Jul 19 '16 at 09:14
  • 1
    I agree here with @Spaceship222: you must prove the existence of $f'(0)$ by other means. See my answer. Your answer as it stands is incorrect. – Paramanand Singh Jul 19 '16 at 09:20
  • If we have $K\lim_{x\to x_0} T(x)=b$, with $K,b$ different from zero, this means the limit exists and equals $b/K$. This is a proof of the existence of the limit. Why would the poster here, or anyone else, "have to prove the existence by other means" if this one is correct? – DonAntonio Jul 19 '16 at 09:26
  • @Spaceship222 Both limits are the same, as explained in one of my comments above: whether the limit exists or not, it is the same for a function continuous at zero to take $\lim_{x\to0} \frac{f(x)-f(0)}x$ or $\lim_{x\to0}\frac{f(ax)-f(0)}{ax}$, $a\neq0$: one of the limits exists finitely iff the other one does. – DonAntonio Jul 19 '16 at 09:30
  • I erased my first comment, which was upvoted, so that no more readers can be led to believe this answer is wrong: it is correct (unless otherwise eventually proved... :) ) and, imo, very nice. – DonAntonio Jul 19 '16 at 09:38
  • 1
    @DonAntonio: The following is correct: $\lim_{x \to 0}\dfrac{f(x) - f(0)}{x}$ exists if and only if $\lim_{x \to 0}\dfrac{f(ax) - f(0)}{ax}$ exists and in case both exist they are equal. But unless one of them exists we can not apply laws of algebra of limits to combine them like the way done in the answer. – Paramanand Singh Jul 19 '16 at 09:41
  • @Param: all the difference is in the word "exist" you wrote there. What I claim is that $$\lim_{x\to0}\frac{f(x)-f(0)}x\,,\qquad \lim_{x\to0}\frac{f(ax)-f(0)}{ax}$$ are both "the same limit", whether it exists or not. Why? Because we're given that $f$ is continuous at zero (did you notice you didn't use this explicitly in your answer...?!), so taking the limit when the variable tends to zero can be done any way we want! The only algebra used here is to "add" two equal expressions, a sum which already equals something finite ($b$) on the left-hand side. – DonAntonio Jul 19 '16 at 09:47
  • 1
    @DonAntonio: it would be best if you could join the chat http://chat.stackexchange.com/rooms/42696/discussion-between-paramanand-singh-and-donantonio Moreover I used the continuity of $f$ at $0$. This has nothing to do with the limit we are talking about. – Paramanand Singh Jul 19 '16 at 09:48
  • 2
    @DonAntonio: Let $F(x) = 1/x$ and $a > 0$. Then both the limits $\lim_{x \to 0^{+}}F(x)$ and $\lim_{x \to 0^{+}}F(ax)$ don't exist and yet $$\lim_{x \to 0^{+}}\bigl[aF(ax) - F(x)\bigr] = 0$$ so one should be very careful about the conditions under which the laws of algebra of limits work (see the numerical sketch after these comments). – Paramanand Singh Jul 19 '16 at 10:02
  • @DonAntonio I didn't downvote this answer. And I am not fully convinced by your approach, but I don't know why. (My guess: before you carry out the limit process, you can't put both limits together, although they are the same limit once the $x\to 0$ limit process is carried out. Or your approach simply disobeys the laws of algebra of limits.) My counterexample in the question, which is discontinuous at $x=0$, of course can't disprove your approach. Anyway, thanks for your comments on this question! – Spaceship222 Jul 19 '16 at 12:39
  • @DonAntonio: I rewrote the expression in the usual order. Not sure why I chose the original order. Answer is effectively unchanged, just put a little better. – MPW Jul 19 '16 at 12:39
  • 1
    It appears (from your last edit) that you are still not convinced that your approach has a basic flaw. My last comment presents a counter-example to your approach. You can also have a look at the chat http://chat.stackexchange.com/rooms/42696/discussion-between-paramanand-singh-and-donantonio – Paramanand Singh Jul 19 '16 at 15:11
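For the record, here is a tiny numerical sketch (my own, with the arbitrary choice $a=2$) of the $F(x)=1/x$ counter-example from the comments above: the combination $aF(ax)-F(x)$ vanishes identically even though each term blows up, which is exactly why the rearrangement in this answer needs a separate existence argument.

```python
# F(x) = 1/x, a = 2: a*F(ax) - F(x) = 1/x - 1/x = 0 for every x > 0,
# while F(x) itself is unbounded as x -> 0+.
F = lambda x: 1.0 / x
a = 2.0

for x in [1e-1, 1e-3, 1e-5]:
    print(f"x = {x:.0e}:  a*F(ax) - F(x) = {a * F(a * x) - F(x):.1f},  "
          f"F(x) = {F(x):.1e}")
```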