22

Find all $f\in C^1(\mathbb R,\mathbb R)$ such that $f^2+(1+f')^2\leq 1$

It's quite likely the answer is $f=0$.

Note that $|f|\leq 1$ and $-2\leq f'\leq 0$.

Therefore $f$ is decreasing and bounded.

What then? I tried contradiction, without success.

Gabriel Romon
  • Have you thought about the limits as $x$ approaches $\pm \infty$? Remember that its values are decreasing and lie in a bounded region. – Kyle Jun 12 '14 at 18:53
  • 1
    Well, $-k\arctan(x)$ with $k$ small enough seems to me to satisfy your condition, say $k=1/\pi$. Then the function takes values in $[-1/2,1/2]$ with negative derivative $>-2$. Something like that. – Alexander Vigodner Jun 12 '14 at 18:55
  • 2
    @AlexanderVigodner with $k=0.01$ it fails at $x=13$. – Gabriel Romon Jun 12 '14 at 19:17
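
As a quick sanity check, here is a minimal numerical sketch (the helper name `constraint` is just for illustration) evaluating $f^2+(1+f')^2$ for the proposed $f(x)=-k\arctan x$:

```python
import math

def constraint(k, x):
    """f^2 + (1 + f')^2 for the candidate f(x) = -k*arctan(x)."""
    f = -k * math.atan(x)
    fp = -k / (1 + x * x)        # f'(x) = -k/(1+x^2)
    return f * f + (1 + fp) ** 2

print(constraint(0.01, 13))      # ~1.0001 > 1, so the condition fails here
```

In fact, for any $k\neq 0$ the value tends to $1+k^2\pi^2/4>1$ as $x\to\infty$, so no scaling of $\arctan$ can work.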

5 Answers

15

The inequality is equivalent to $$ f^2+2f'+f'^2\le0\tag{1} $$ Since $f^2+2f'\le0$, wherever $f\ne0$ we have $$ (1/f)'\ge\color{#C00000}{1/2}\tag{2} $$ If $f(x_0)=a\gt0$, then $\dfrac1f(x_0)=\dfrac1a\gt0$ and $(2)$ says that $$ \frac1f\left(x_0-\frac3a\right)\le\frac1f(x_0)-\color{#C00000}{\frac12}\frac3a\lt0\tag{3} $$ as long as $\dfrac1f$ doesn't pass to $-\infty$ in $\left[x_0-\frac3a,x_0\right]$.

In any case, on $\left[x_0-\frac3a,x_0\right]$, $\dfrac1f$ must pass through $0$, which is impossible because $f\in C^1(\mathbb{R})$.

If $f(x_0)=a\lt0$, then $\dfrac1f(x_0)=\dfrac1a\lt0$ and $(2)$ says that $$ \frac1f\left(x_0-\frac3a\right)\ge\frac1f(x_0)-\color{#C00000}{\frac12}\frac3a\gt0\tag{4} $$ as long as $\dfrac1f$ doesn't pass to $\infty$ in $\left[x_0,x_0-\frac3a\right]$.

In any case, on $\left[x_0,x_0-\frac3a\right]$, $\dfrac1f$ must pass through $0$, which is impossible because $f\in C^1(\mathbb{R})$.

Therefore, $f(x)=0$ for all $x\in\mathbb{R}$.
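
As a numerical sanity check of the two steps above, here is a minimal sketch: random sampling of the region $f^2+(1+f')^2\le1$ confirms inequality $(2)$, and an Euler walk in the borderline case $(1/f)'=\tfrac12$, i.e. $f'=-f^2/2$, illustrates the finite-distance blow-up behind $(3)$.

```python
import random

# Step 1: inequality (2).  Sample pairs (f, fp) with f^2 + (1 + fp)^2 <= 1
# and f != 0, and record the smallest observed value of (1/f)' = -fp/f^2.
random.seed(0)
worst = float("inf")
for _ in range(100_000):
    f, fp = random.uniform(-1, 1), random.uniform(-2, 0)
    if f != 0 and f * f + (1 + fp) ** 2 <= 1:
        worst = min(worst, -fp / (f * f))
print(worst)            # stays >= 1/2 (the bound is approached only as f, fp -> 0)

# Step 2: the blow-up behind (3).  In the borderline case f' = -f^2/2
# (so (1/f)' = 1/2 exactly) with f(x0) = a > 0, stepping to the left
# drives f above any bound before x0 - 2/a, inside the 3/a window above.
x0, a, h = 0.0, 1.0, 1e-5
x, f = x0, a
while f < 1e6:
    f += h * f * f / 2  # f(x - h) is approximately f(x) + h*f(x)^2/2
    x -= h
print(x, x0 - 2 / a)    # f exceeds 1e6 at x close to -2 = x0 - 2/a
```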

robjohn
  • Can you elaborate a bit on your argument following $(2)$? – Alex Schiff Jun 13 '14 at 01:30
  • 1
    @AlexSchiff: Any function with a slope at least $\color{#C00000}{1/2}$ (e.g. $1/f$) must pass through the $x$-axis at a finite point. – robjohn Jun 13 '14 at 01:40
  • What if $f=0$? How can you use $(2)$ then? – Alexander Vigodner Jun 13 '14 at 05:18
  • @AlexanderVigodner: $f=0$ satisfies $(1)$. However, $(2)$ obviously holds wherever $f(x)\ne0$. – robjohn Jun 13 '14 at 06:12
  • Something must be adjusted, since $\frac{1}{f}$ is continuous only away from the zeroes of $f$. What happens if a zero of $f$ belongs to the $(x_0-3/a,x_0)$-interval? – Jack D'Aurizio Jun 13 '14 at 10:04
  • Ok, it cannot happen. Since $f$ is decreasing and $f(x_0)>0$, any zero of $f$ is bigger than $x_0$. – Jack D'Aurizio Jun 13 '14 at 10:10
  • @JackD'Aurizio: All that matters is that there is a zero of $1/f$ in the interval. If $f$ has a zero, $1/f$ simply goes to $\infty$. The argument is simply that $1/f$ must pass through $0$ and therefore $f$ must go to $\infty$ in a finite interval, which cannot happen since $f\in C^1(\mathbb{R})$. – robjohn Jun 13 '14 at 12:14
  • @robjohn: to prove that $1/f$ must pass through zero, you need the continuity of $1/f$ over the whole interval; the argument does not work for functions that are continuous (over their domain) like $\tan x$ in a neighbourhood of $\pi/2$, for example. – Jack D'Aurizio Jun 13 '14 at 12:33
  • @JackD'Aurizio: $1/f$ is $C^1$ except where it goes to $\infty$ ($f=0$). From any non-zero value of $1/f$, the differential inequality forces $1/f$ to zero in a finite time. – robjohn Jun 13 '14 at 13:31
  • @robjohn: I completely agree with you, I am simply pointing out that "$\frac{1}{f}$ changes its sign in the endpoints of this interval" does not automatically imply that $\frac{1}{f}$ is zero in an inner point, since $\frac{1}{f}$ can change its sign by passing through a discontinuity, i.e. a zero of $f$. – Jack D'Aurizio Jun 13 '14 at 14:11
  • 2
    @JackD'Aurizio: I have edited my answer to account for the case that $1/f$ may blow up inside an interval. – robjohn Jun 13 '14 at 14:28
10

Since $f(x)$ is bounded and decreasing, both $\lim_{x \rightarrow \infty} f(x)$ and $\lim_{x \rightarrow -\infty} f(x)$ exist. If $f(x)$ were not identically zero, then at least one of these limits would be nonzero. Say it is the first one, and call the limit $L$.

By the mean value theorem, $f(n+1) - f(n) = f'(x_n)$ for some $x_n$ between $n$ and $n + 1$. The left-hand side of this equation converges to $L - L = 0$ as $n$ goes to infinity, so we have $$\lim_{n \rightarrow \infty} f'(x_n) = 0$$ But we also have $$\lim_{n \rightarrow \infty} f(x_n) = L$$ Plugging $x_n$ into $f(x)^2 + (1 + f'(x))^2 \leq 1$ and taking limits as $n$ goes to infinity gives $L^2 \leq 0$, a contradiction.

A similar argument works if $\lim_{x \rightarrow -\infty} f(x) \neq 0$.
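
As a small numerical illustration (a sketch, not a proof), take the hypothetical decreasing function $f(x)=L+e^{-x}$ with nonzero limit $L$: the mean-value increments vanish while $f^2+(1+f')^2$ tends to $L^2+1>1$.

```python
import math

L = 0.3                           # a hypothetical nonzero limit
f  = lambda x: L + math.exp(-x)   # decreasing, f(x) -> L
fp = lambda x: -math.exp(-x)      # its derivative, -> 0

for n in (1, 5, 10, 20):
    increment = f(n + 1) - f(n)              # the mean-value increments
    value = f(n) ** 2 + (1 + fp(n)) ** 2     # the constrained quantity
    print(n, increment, value)
# the increments vanish while the constraint value tends to L^2 + 1 = 1.09 > 1,
# so the constraint can only hold in the limit if L^2 <= 0, i.e. L = 0.
```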

Zarrax
9

Hints: As you mentioned, $f$ is decreasing and bounded. Think about $\lim_{n \to \infty} f(n)$. Must this limit exist? What does this imply for the limit of the derivative $f'$?

Full Solution. The function $f(x)$ is decreasing and bounded, so $\lim_{x \to \infty} f(x)=L$ for some $L \in [-1,1]$. For the sake of contradiction, we suppose $|L|>0$. To set up the contradiction, we relate $|f(x)|$ and $f'(x)$: Let $\epsilon\in (0,1]$, and suppose that we have $0 \geq f'(x) \geq -\epsilon$ for some $x \in \mathbb{R}$. Then \begin{align*} f^2(x) & \leq 1-(1+f'(x))^2\\ &= -2f'(x) - (f'(x))^2 \\ &\leq -2f'(x) \\ & \leq 2\epsilon. \end{align*} Thus $|f(x)| \leq \sqrt{2\epsilon}$. Therefore we know that if $|f(x)| > \sqrt{2\epsilon}$, then $f'(x) <-\epsilon$. For sufficiently large $x$, we must have $|f(x)| > |L|/2=\sqrt{2(|L|^2/8)}$, hence $f'(x) <-|L|^2/8$. This contradicts the fact that $f(x)$ is bounded below. An entirely analogous argument shows that $\lim_{x \to -\infty} f(x)=0$. Monotonicity implies $f=0$. QED
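
As a quick numerical spot check of the key implication, that $f^2+(1+f')^2\le1$ together with $0\ge f'\ge-\epsilon$ forces $|f|\le\sqrt{2\epsilon}$, here is a minimal sketch:

```python
import math, random

# Sample admissible pairs (f, fp) and verify |f| <= sqrt(2*eps) with eps = -fp,
# the tight version of "0 >= f' >= -eps  implies  |f| <= sqrt(2*eps)".
random.seed(1)
ok = True
for _ in range(100_000):
    f, fp = random.uniform(-1, 1), random.uniform(-2, 0)
    if f * f + (1 + fp) ** 2 <= 1:
        ok = ok and abs(f) <= math.sqrt(2 * (-fp)) + 1e-12
print(ok)                         # True
```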

Kyle
  • Although it should be a cakewalk, I fail to prove that for a $C^1$ function with a limit at infinity, its derivative must have $0$ as limit at $\infty$... – Gabriel Romon Jun 12 '14 at 19:05
  • Have you first shown that the limit of the derivative exists? – Kyle Jun 12 '14 at 19:11
  • I don't think you can prove the existence of the limit for the derivative with usual theorems. The only information about it is boundedness. – Gabriel Romon Jun 12 '14 at 19:14
  • 2
    And now that I remember this, http://math.stackexchange.com/questions/788813/the-limit-of-the-derivative-of-an-increasing-and-bounded-function-is-always-0/788818#788818 this is definitely wrong. – Gabriel Romon Jun 12 '14 at 19:19
  • I was thinking that the constraint $f^2 +(1+f')^2 \leq 1$ would imply that the derivative's limit existed. – Kyle Jun 12 '14 at 19:23
  • 1
    Another thought would be to compare the limits as $x$ approaches positive and negative infinity. There are some easy restrictions there. – Kyle Jun 12 '14 at 19:25
  • I found a counterexample function, see my new answer. – Alexander Vigodner Jun 13 '14 at 05:09
  • It seems to me you proved only that $f(x)\to 0$ at infinity. But that does not prove that $f(x)=0$ on $\mathbb R$. Am I wrong? – Alexander Vigodner Jun 13 '14 at 05:41
  • The function is decreasing, so that's all that needs to be shown. – Kyle Jun 13 '14 at 13:34
0

I think I have to completely rewrite the solution, keeping the wrong one above untouched. As I said before, I am sure we can build such a function, and I think in the end I did it, using the ODE from the above solution. I am building a counterexample function. Firstly, $f(x)= 0$ for $x\le 0$;

Now I'd like to build a simple function satisfying the condition $$ f^2+(1+f^\prime)^2\le 1 $$ on some interval $[0,x^*]$.
I define $$ g(x)=-x^3/3-x^2/2,\qquad g^\prime(x)=-x^2-x. $$ It is obvious that in some positive neighborhood $(0,\epsilon)$ $$ g^2+(1+g^\prime)^2 =O(\epsilon^4)+1-O(\epsilon) \le 1. $$ It is also obvious that starting from some $x^*$ $$ g^2+(1+g^\prime)^2\ge 1, $$ and this point $x^*$ is such that $$ g^2(x^*)+(1+g^\prime(x^*))^2= 1. $$ Let's check the condition $g^\prime(x^*)>-1$. It is easy to estimate that $x^*$ is about $0.9$, and that then $g^\prime(x^*)< -1$.

Now again consider the differential equations $$ f^\prime=-1+\sqrt{1-f^2},\\ f^\prime=-1-\sqrt{1-f^2}, $$ and choose the second one, in accordance with the sign of $g^\prime(x^*)+1$. The solution of this equation with initial condition $f(x^*)=g(x^*)$ will extend our function to $\mathbb R$. So the final function $f(x)$ is $$ f(x)=\begin{cases} 0, & x\le 0,\\ g(x), & 0<x\le x^*,\\ \text{the solution of } f^\prime=-1-\sqrt{1-f^2},\ f(x^*)=g(x^*), & x\ge x^*. \end{cases} $$

Since I did not find any mistake in the proofs that such a function cannot exist, I probably made a mistake somewhere again, but I cannot find it. Any comments, please. Maybe this ODE cannot have a solution on $\mathbb R$? The Lipschitz condition is not satisfied.
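
Regarding that closing question, a rough numerical sketch (using the $g$ and $x^*$ above) suggests where it breaks: an Euler integration of $f'=-1-\sqrt{1-f^2}$ forward from $f(x^*)=g(x^*)$ reaches $f=-1$ at a finite point, and at $f=-1$ the constraint forces $f'=-1$, so the construction cannot continue on all of $\mathbb R$.

```python
import math

g   = lambda x: -x**3 / 3 - x**2 / 2
gp  = lambda x: -x**2 - x
val = lambda f, fp: f * f + (1 + fp) ** 2      # the constraint value

# locate x*, where the constraint value of g reaches 1, by bisection on [0.5, 1]
lo, hi = 0.5, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if val(g(mid), gp(mid)) < 1:
        lo = mid
    else:
        hi = mid
xstar = lo
print(xstar, g(xstar))                         # x* ~ 0.91, g(x*) ~ -0.67

# Euler steps for f' = -1 - sqrt(1 - f^2) starting at (x*, g(x*))
x, f, h = xstar, g(xstar), 1e-5
while f > -1:
    f += h * (-1 - math.sqrt(1 - f * f))
    x += h
print(x)   # f hits -1 already around x ~ 1.1; there the constraint forces
           # f' = -1, which pushes f below -1, so no continuation to all of R
           # can keep f^2 + (1 + f')^2 <= 1.
```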

-1

Fix: Well, the solution of the differential equation $$ f'=-1-\sqrt{1-f^2} $$ with a nonzero initial condition will satisfy your property. Edit: Notice that the equation $$ f'=-1+\sqrt{1-f^2} $$ is also OK. Now let's change $\tau = -t$ and rewrite the second equation as a function of $\tau$: $$ f_\tau^\prime =1-\sqrt{1-f^2}. $$ Let's now build the function on $\mathbb R^+$ as a solution of the first equation and on $\mathbb R^-$ as a solution of the third equation. To guarantee differentiability at $t=0$, let's make the derivatives equal at time $t=\tau=0$: $$ f^\prime_t=-1-\sqrt{1-f^2}=-f^\prime_\tau=-1+\sqrt{1-f^2}. $$ So with the initial condition $f=1$ we can propagate this function on $\mathbb R$. I changed signs here. Now the initial derivative is $-1$.