
Theorem

For any real functions $f,g \in C^1$ such that $f(0) = g(0) = 0$ and $f'(0) = g'(0) = 1$ and $x$ is strictly between $f(x)$ and $g(x)$ for any $x \ne 0$:

  $f,g$ are invertible on some open neighbourhood of $0$

  $\dfrac{f(x)-g(x)}{g^{-1}(x)-f^{-1}(x)} \to 1$ as $x \to 0$
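
For a concrete illustration (a pair chosen here as an example, not part of the original question), take $f(x) = \sinh x$ and $g(x) = \arctan x$. Then $f(0) = g(0) = 0$, $f'(0) = g'(0) = 1$, and $\sinh x > x > \arctan x$ for $x > 0$ while $\sinh x < x < \arctan x$ for $x < 0$, so the hypotheses hold. The theorem then gives

  $\dfrac{\sinh x - \arctan x}{\tan x - \operatorname{arcsinh} x} \to 1$ as $x \to 0$

which agrees with the Taylor expansions: both the numerator and the denominator are $\dfrac{x^3}{2} + O(x^5)$.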

Questions

What is the simplest proof you can think of? I've given mine below.

Motivation

This theorem was inspired by user8286's solution to a special case, and I wanted to find weaker conditions on the functions under which the limit in his proof would hold.

user21820
  • see http://mathoverflow.net/questions/20696/a-question-regarding-a-claim-of-v-i-arnold – Paramanand Singh May 29 '14 at 11:24
  • @ParamanandSingh: Did you read user8286's solution? It is essentially the same as the one given in your link, which does not apply to this problem as here I don't assume the existence of power series. – user21820 May 29 '14 at 11:57

1 Answer


[Edit: The first half was simplified thanks to Paramanand Singh.]

Proof

For any real functions $f,g \in C^1$ such that $f(0) = g(0) = 0$ and $f'(0) = g'(0) = 1$ and $x$ is strictly between $f(x)$ and $g(x)$ for any $x \ne 0$:

  Let $d > 0$ be such that $f'(x) > 0$ for any $x \in (-d,d)$, which exists because $f'$ is continuous and $f'(0) = 1 > 0$ (see the aside below for why continuity matters)

  Then $f$ is strictly increasing on $(-d,d)$ by the Mean value theorem

  Similarly $g$ is strictly increasing on some open neighbourhood of $0$

  Therefore $f,g$ are invertible on some open neighbourhood of $0$, since they are continuous and strictly increasing there (each maps it onto an open interval containing $0$)
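
  (Aside, added for illustration and not part of the original argument: the continuity of $f'$ is what guarantees such a $d$ exists; $f'(0) = 1$ alone would not suffice. For example, $h(x) = x + 2x^2\sin(1/x)$ for $x \ne 0$ with $h(0) = 0$ has $h'(0) = 1$, but $h'(x) = 1 + 4x\sin(1/x) - 2\cos(1/x)$ equals $-1$ at each $x = \frac{1}{2\pi n}$, so $h$ is not monotonic on any open neighbourhood of $0$.)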

  WLOG $f(x) > x > g(x)$ for any $x \ne 0$: by continuity, $f(x)-x$ and $g(x)-x$ each keep a fixed sign on each side of $0$, they have opposite signs by the betweenness hypothesis, and swapping $f,g$ on either side of $0$ changes neither the hypotheses nor the ratio in the conclusion

  As $x \to 0$:

    $\dfrac{f(x)-x}{x-f^{-1}(x)} = \dfrac{f(x)-f(f^{-1}(x))}{x-f^{-1}(x)} = f'(a)$ for some $a \in (f^{-1}(x),x)$ by the Mean value theorem (note $f^{-1}(x) < x$ since $f(x) > x$)

    $\dfrac{x-g(x)}{g^{-1}(x)-x} = \dfrac{g(g^{-1}(x))-g(x)}{g^{-1}(x)-x} = g'(b)$ for some $b \in (x,g^{-1}(x))$ by the Mean value theorem (note $g^{-1}(x) > x$ since $g(x) < x$)

    $f'(a) \to f'(0) = 1$ because $x \to 0$ and $f^{-1}(x) \to 0$ force $a \to 0$, and $f'$ is continuous

    $g'(b) \to g'(0) = 1$ because $x \to 0$ and $g^{-1}(x) \to 0$ force $b \to 0$, and $g'$ is continuous

    Therefore $\dfrac{f(x)-g(x)}{g^{-1}(x)-f^{-1}(x)} = \dfrac{(f(x)-x)+(x-g(x))}{(x-f^{-1}(x))+(g^{-1}(x)-x)} \to 1$ because $\dfrac{p+q}{r+s}$ is between $\dfrac{p}{r}$ and $\dfrac{q}{s}$ for any $p,q,r,s > 0$ (a short check of this is given below the proof)
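
A short check of the mediant inequality invoked in the last step (spelled out here for completeness): if $p,q,r,s > 0$ and, say, $\dfrac{p}{r} \le \dfrac{q}{s}$, then $ps \le qr$, hence

  $\dfrac{p+q}{r+s} - \dfrac{p}{r} = \dfrac{qr-ps}{r(r+s)} \ge 0$ and $\dfrac{q}{s} - \dfrac{p+q}{r+s} = \dfrac{qr-ps}{s(r+s)} \ge 0$

so $\dfrac{p+q}{r+s}$ lies between $\dfrac{p}{r}$ and $\dfrac{q}{s}$. Here $p = f(x)-x$, $q = x-g(x)$, $r = x-f^{-1}(x)$, $s = g^{-1}(x)-x$ are all positive for $x \ne 0$ after the WLOG step, and $\dfrac{p}{r} = f'(a) \to 1$ and $\dfrac{q}{s} = g'(b) \to 1$, so the squeezed ratio tends to $1$.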

user21820
  • Consider $f(x)=x+x^3\sin(1/x)$ and verify your result on this function. – Omran Kouba May 26 '14 at 18:00
  • @OmranKouba: I just verified it. I don't see any mistake in my proof, so if there is one please point it out! – user21820 May 27 '14 at 01:40
  • At least the tangent function doesn't satisfy your inequality that comes after "WLOG". – Omran Kouba May 27 '14 at 06:10
  • @OmranKouba: I'm not sure what you mean. The outermost scope is "If ... and $x$ is strictly between $f(x)$ and $g(x)$ for any $x \ne 0$", which implies that either $f(x) > x$ for any $x > 0$ or $f(x) < x$ for any $x > 0$, and likewise for $x < 0$. And we can swap $f,g$ on one side of $0$ if necessary. – user21820 May 27 '14 at 07:09
  • @OmranKouba: So in fact this theorem says nothing about $x \mapsto x+x^3 \sin(1/x)$, but it just happens to have the correct limit. If $f$ is allowed to cross the symmetry line in every open neighbourhood of $0$, then the limit may not hold. – user21820 May 27 '14 at 07:14
  • @OmranKouba: I meant that if we ignore the singularities the limit still happens to be correct in some cases, but not necessarily, which is why I had the extra condition. It also makes this theorem inapplicable to the original problem that inspired it. – user21820 May 27 '14 at 07:19
  • So, I think, a hypothesis of high order smoothness, like analyticity, is justifiable, even if there are some examples where the conclusion does hold without that being a consequence of a general theorem. – Omran Kouba May 27 '14 at 07:25
  • @OmranKouba: I agree that analyticity is nice, which is why I chose user8286's answer. But I think I know how to eliminate that condition and replace it with simply $h(x)$ is between $f(x)$ and $g(x)$ for some twice differentiable function $h$. – user21820 May 27 '14 at 07:53
  • Very nice proof, although as you have already mentioned it does not apply to odd functions like $\sin(\tan x)$ or the function from Omran Kouba, $f(x) = x + x^{3}\sin(1/x)$. However, I have an issue with where you prove that $f$ is increasing in a neighborhood of $0$ using $f'(0) = 1$. Note that a positive derivative at a point does not imply that the function is increasing in a neighborhood. It does imply this if the derivative is positive as well as continuous at that point. – Paramanand Singh May 30 '14 at 08:31
  • @ParamanandSingh: Yes, which is why the outermost scope is "For any real $f,g \in C^1$...". – user21820 May 30 '14 at 09:43
  • @user21820: If you assume that $f'$ is continuous then you see that $f'$ maintains its sign in a neighborhood of $0$, hence is positive there, and thus $f$ is increasing in that neighborhood. Your approach with the sequences $(x_{n}, y_{n})$ seems more involved and tricky. – Paramanand Singh May 30 '14 at 11:51
  • @ParamanandSingh: You're right! I'll put that in. – user21820 May 30 '14 at 12:05