13

Let $f(x)$ be a continuous function on $\mathbb{R}$, such that for any real number $x$ we have: $$\lim_{h\to 0}\dfrac{1}{h^3}\int_{-h}^{h}f(x+t)\cdot t\,dt=0.$$ Show that $f(x)$ is a constant function.

Maybe we can use the following lemma?

Lemma. If $g$ is a continuous function, then $$\lim_{h\to0}\frac{1}{2\,h}\int_{x-h}^{x+h}g(s)\,ds=g(x).$$

Proof. We may assume $h>0$. $$ \left|g(x)-\frac{1}{2\,h}\int_{x-h}^{x+h}g(s)\,ds\right|=\frac{1}{2\,h}\left|\int_{x-h}^{x+h}(g(x)-g(s))\,ds\right|\le\frac{1}{2\,h}\int_{x-h}^{x+h}\left|g(x)-g(s)\right|\,ds.$$ Use that $g$ is continuous at $x$ to show that the last expression converges to $0$ as $h\to0$.
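
For completeness, the last step can be spelled out as follows (a routine $\varepsilon$-$\delta$ argument): given $\varepsilon>0$, continuity of $g$ at $x$ yields $\delta>0$ such that $|g(x)-g(s)|<\varepsilon$ whenever $|s-x|\le\delta$; hence for $0<h\le\delta$, $$\frac{1}{2h}\int_{x-h}^{x+h}\left|g(x)-g(s)\right|\,ds\le\frac{1}{2h}\cdot 2h\,\varepsilon=\varepsilon,$$ which proves the lemma.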

math110
  • 93,304
  • 6
    Please don't change the statement of the question after users have already thought about your problem. It's okay to make a separate question for the case $h^3$. The question with $h^2$ is very nice, and you got a good answer showing why it is false. – Alex Ortiz May 18 '21 at 16:26
  • 4
    @AlexOrtiz: If there is a simple counter example such as $f(x) = x$ then I would suspect a typo, and ask for clarification in a comment. But there are surely different opinions on that. – Martin R May 18 '21 at 19:26
  • 2
    Why are you looking for "an answer from a reputable source"? You already have two answers to your revised statement. – ə̷̶̸͇̘̜́̍͗̂̄︣͟ May 20 '21 at 16:35
  • 2
    It's from Nanjing University in China; a problem about constant functions. – math110 May 21 '21 at 23:01
  • @inequality: Is there any other assumption on $f$? For example, is $f$ Lipschitz? – Paresseux Nguyen May 23 '21 at 03:24
  • Taking the Fourier transform of $f$ seems fruitful, but the integrability conditions may be just as difficult to justify as the proposition that $f$ is differentiable. I didn't give it much thought. – Mark May 23 '21 at 20:28
  • 5
    I think that Paresseux Nguyen's answer, which is a wonderful piece of analysis, should be accepted. – Helmut May 28 '21 at 20:12
  • 6
    It is also a pity that the question author did not care to award the bounty. – Martin R May 28 '21 at 20:20
  • Lol, It's the first time I see a bounty of +25. – NN2 Jun 06 '21 at 15:23
  • 6
    @NN2: No, the bounty was for 50 points. But inequality did not award the bounty (or accept an answer). In that case, half of the amount is automatically assigned to the highest voted answer that was posted during the bounty period. – Martin R Jun 06 '21 at 16:22

4 Answers

18

Define $F(x):= \int_{0}^x f(t)\,dt$.

By integration by parts, we observe that: $$\begin{align}\frac{1}{h}\int_{-h}^h f(x+t)t\,dt &= \frac{1}{h}\int_{0}^h \big[ f(x+t)-f(x-t)\big]t\,dt\\& = F(x+h)+F(x-h) - \frac{1}{h}\int_{0}^h \big[ F(x+t)+F(x-t) \big]dt \end{align}$$ So the provided limit for $f$ implies that for all $x \in \mathbb{R}$, $$T(F)(x)=0,$$ where $T$ is the operator defined by $$T(g)(x)=\lim_{h \rightarrow 0^+} \frac{1}{h^2} \left\{ g(x+h)+g(x-h)-\frac{1}{h}\int_{0}^h \big[ g(x+t)+g(x-t)\big]dt \right\} $$ for any continuous function $g$. Note that $T(g)$ is not necessarily defined at every point of $\mathbb{R}$.
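
In more detail, the second equality above is just integration by parts on $[0,h]$; spelling it out (a routine step, included for convenience): $$\int_{0}^h f(x+t)\,t\,dt=\Big[t\,F(x+t)\Big]_0^h-\int_0^h F(x+t)\,dt = h\,F(x+h)-\int_0^h F(x+t)\,dt,$$ $$\int_{0}^h f(x-t)\,t\,dt=\Big[-t\,F(x-t)\Big]_0^h+\int_0^h F(x-t)\,dt = -h\,F(x-h)+\int_0^h F(x-t)\,dt,$$ and subtracting the second identity from the first and dividing by $h$ gives the displayed formula.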

Let $\epsilon$ be any positive number and let $(a,b)$ be any bounded interval in $\mathbb{R}$. Define the functions $$G(x)= F(x)-F(a)-\frac{F(b)-F(a)}{b-a}(x-a)+\epsilon(x-a)(x-b)$$ and $$H(x)= F(x)-F(a)-\frac{F(b)-F(a)}{b-a}(x-a)-\epsilon(x-a)(x-b).$$

We prove that $G(x) \le 0$ everywhere on $[a,b]$. If not, then, because $G(a)=G(b)=0$, there is a point $c \in (a,b)$ at which $G$ attains its positive maximum. Observe that: $$T(G)(c)= \lim_{h \rightarrow 0^+} \frac{1}{h^2} \left\{ G(c+h)+G(c-h)-\frac{1}{h}\int_{0}^h \big[ G(c+t)+G(c-t)\big]dt \right\} $$ Because $c$ is a maximum of $G$, the function $u(t):=G(c+t)+G(c-t)$ attains a local maximum at $t=0$. So we can choose a strictly decreasing positive sequence $(h_n)_{n \in \mathbb{N}}$ converging to $0$ such that $u(h_n)=\min_{t \in [0,h_n]} u(t)$ for all $n$ (see the comments below for an explicit construction). Thus $$ G(c+h_n)+G(c-h_n)-\frac{1}{h_n}\int_{0}^{h_n} \big[ G(c+t)+G(c-t)\big]dt \le 0,$$ and hence $T(G)(c) \le 0$. Note that the limit $T(G)(c)$ exists because $T(F)(c)$ exists; in fact, directly from the definition we see that $$T(G)(c)=0+0+\frac{4}{3}\epsilon>0, $$ which contradicts what we have just proven.
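
To see where the value $\frac{4}{3}\epsilon$ comes from, here is the short computation behind it (spelled out for convenience). $T$ annihilates affine functions: for $\ell(x)=\alpha x+\beta$ we have $\ell(c+h)+\ell(c-h)=2\ell(c)=\frac{1}{h}\int_0^h\big[\ell(c+t)+\ell(c-t)\big]dt$. For $q(x)=x^2$, $$q(c+h)+q(c-h)-\frac{1}{h}\int_0^h\big[q(c+t)+q(c-t)\big]dt=2h^2-\frac{2h^2}{3}=\frac{4h^2}{3},$$ so $T(q)\equiv\frac{4}{3}$. Since $G$ equals $F$ plus an affine function plus $\epsilon(x-a)(x-b)$, and $(x-a)(x-b)$ differs from $x^2$ by an affine function, we get $T(G)(c)=T(F)(c)+0+\frac{4}{3}\epsilon=\frac{4}{3}\epsilon$.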

So $G(x) \le 0$ on $[a,b]$. Similarly, we can prove that $H(x) \ge 0$ everywhere on $[a,b]$. These give: $$ \left| F(x)-F(a)-\frac{F(b)-F(a)}{b-a}(x-a) \right| \le \epsilon (b-a)^2$$ for all $\epsilon>0$ and $x \in [a,b]$. Letting $\epsilon \to 0$ shows that $F$ is affine on every bounded interval $[a,b]$, hence affine on all of $\mathbb{R}$, and therefore $f=F'$ is constant.

Side note: The hypothesis is fairly tight, as one can show that the Cantor function satisfies the condition at almost every point of $\mathbb{R}$ even though it is not constant.

  • Your argument is not entirely clear to me. $u$ has a maximum at $t=0$, but why does that imply that $u(h_n) - \frac 1 {h_n} \int_0^{h_n} u(t) dt \le 0$ on a sequence $h_n \searrow 0$? – And $h_n \in \text{argmin}_{ t \in [0,h_n]} u(t)$ looks like a cyclic definition to me. – Martin R May 24 '21 at 13:32
  • @MartinR The latter part is where I started. The existence of the sequence $(h_n)$ can be shown by the following construction: $$\begin{align} h_1 &\in \operatorname{argmin}_{t \in [0,b-a]\setminus\{0\}} u(t) \\ h_{n+1} &\in \operatorname{argmin}_{ t \in [0, \min(h_n,1/n)] } u(t) \end{align}$$ – Paresseux Nguyen May 24 '21 at 13:39
  • That is not exactly what you wrote in the answer. – And how do you get from that definition to $u(h_n) - \frac 1 {h_n} \int_0^{h_n} u(t) dt \le 0$? Perhaps I am overlooking something simple ... – Martin R May 24 '21 at 13:44
  • That is because $u(h_n)=\min_{ t \in [0,h_n]} u(t)$; in other words, $u(h_n) \le u(t)$ for all $0 \le t \le h_n$. So $u(h_n) \le \frac{1}{h_n} \int_{0}^{h_n} u(t)\,dt$. – Paresseux Nguyen May 24 '21 at 13:46
  • (I repost this comment because I was not allowed to edit the previous one.) The latter part is where I started. The existence of the sequence $(h_n)$ can be shown by the following construction: $$\begin{align} h_1 &\in \big[\operatorname{argmin}_{t \in [0,b-a]} u(t)\big]\setminus\{0\} \\ h_{n+1} &\in \big[\operatorname{argmin}_{ t \in [0, \min(h_n,1/n)] } u(t)\big]\setminus\{0\} \end{align}$$ – Paresseux Nguyen May 24 '21 at 13:48
  • With that definition, $u(h_{n+1})=\min_{ t \in [0,h_n]} u(t)$, which gives you “only” $u(h_{n+1}) \le \frac{1}{h_n} \int_{0}^{h_n} u(t)\,dt$. – Martin R May 24 '21 at 13:50
  • If $u(h_{n+1}) =\min_{t \in [0,h_n] } u(t)$, then $u(h_{n+1}) =\min_{t \in [0,h_{n+1}] } u(t)$ because $0 \le h_{n+1} \le h_n$. – Paresseux Nguyen May 24 '21 at 13:52
  • 1
    @ParesseuxNguyen: Now I see it, excellent! – Martin R May 24 '21 at 14:42
10

Prove me wrong, but I think the statement is false. Consider the example $f(x)=x$, which is clearly continuous and satisfies the condition as originally posed (with $1/h^2$ in place of $1/h^3$): for any $x\in\mathbb{R}$, \begin{gather*} \lim_{h\to0}\frac1{h^2}\int_{-h}^hf(x+t)t\,\mathrm dt=\lim_{h\to0}\frac1{h^2}\int_{-h}^h(x+t)t\,\mathrm dt=x\lim_{h\to0}\frac1{h^2}\int_{-h}^ht\,\mathrm dt+\lim_{h\to0}\frac1{h^2}\int_{-h}^ht^2\,\mathrm dt\\ = x \lim_{h\to0} \frac{1}{h^2}\left(\frac{h^2}{2}-\frac{h^2}{2}\right) + \lim_{h\to 0}\frac{1}{h^2}\left(\frac{h^3}{3} - \frac{-h^3}{3}\right) = \lim_{h\to 0 } \frac{2}{3} h = 0. \end{gather*}
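
As a side remark, here is a quick numerical sketch of the difference between the two normalizations for $f(x)=x$ (just an illustration; it assumes scipy is available): the $1/h^2$ quotient tends to $0$, while the $1/h^3$ quotient tends to $\frac23\neq0$, so $f(x)=x$ is not a counterexample to the revised ($h^3$) statement.

```python
from scipy.integrate import quad

def scaled_integral(f, x, h, power):
    """Approximate (1/h**power) * integral of f(x+t)*t over t in [-h, h]."""
    val, _ = quad(lambda t: f(x + t) * t, -h, h)
    return val / h**power

f = lambda x: x  # the candidate counterexample
for h in [1e-1, 1e-2, 1e-3]:
    print(h, scaled_integral(f, 1.0, h, 2), scaled_integral(f, 1.0, h, 3))
# the 1/h^2 column tends to 0, the 1/h^3 column stays near 2/3
```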

Ѕᴀᴀᴅ
  • 34,263
crankk
  • 1,439
4

I can provide a straightforward answer if $f$ is additionally assumed to be differentiable. In that case, we have

$$\int_{-h}^hf(x+t)tdt = \int_{-h}^h (f(x)+f'(x)t+o(t))t dt = f(x)\left(\frac{h^2}{2}-\frac{h^2}{2}\right) + f'(x)\left(\frac{h^3}{3}-\frac{-h^3}{3}\right) + o(h^3) = f'(x) \frac{2}{3} h^3 + o(h^3).$$

Consequently,

$$0=\lim_{h\to0}\frac{1}{h^3}\int_{-h}^hf(x+t)tdt = \frac{2}{3} f'(x),$$

thus $f'(x)=0$ for all $x$, and $f$ is constant. The general case can maybe be treated with an argument similar to that of Marco Cantarini.
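
For what it's worth, here is a quick symbolic sanity check of the computation above (just a sketch: it uses sympy and replaces $f(x+t)$ by the first-order model $f_0 + f_1 t$, i.e. it drops the $o(t)$ remainder; $f_0$ and $f_1$ stand for $f(x)$ and $f'(x)$):

```python
import sympy as sp

t, h = sp.symbols('t h', positive=True)
f0, f1 = sp.symbols('f0 f1')  # stand-ins for f(x) and f'(x)

# First-order Taylor model of f(x+t) around x, with the o(t) remainder dropped
integrand = (f0 + f1 * t) * t

I = sp.integrate(integrand, (t, -h, h))
print(sp.simplify(I))         # 2*f1*h**3/3
print(sp.simplify(I / h**3))  # 2*f1/3, matching the factor 2/3 above
```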

crankk
  • 1,439
  • 4
    I have the feeling that it is considerably more difficult if $f$ is not assumed to be differentiable. Marco Cantarini's argument had a flaw; he has deleted his answer in the meantime. – Martin R May 23 '21 at 10:04
  • Polynomials are dense in the space of continuous functions $C([-1,1])$, and they are smooth, so using this you can probably prove the original result. – Masacroso Oct 23 '22 at 15:33
-2

If $f(x)$ is continuous on $\textbf{R}$, then we can write $$ \int^{h}_{0}f(x+t)tdt-\int^{-h}_{0}f(x+t)tdt=\int^{h}_{0}(f(x+t)-f(x-t))tdt. $$ Hence, from the remark below (and for all real $x$), we have $$ 0=\lim_{h\rightarrow 0}\frac{1}{h^3}\int^{h}_{0}(f(x+t)-f(x-t))tdt=\lim_{h\rightarrow 0}\frac{f(x+h)-f(x-h)}{3h}.\tag 1 $$ Clearly, concerning (1), we have: $$ \lim_{h\rightarrow 0}h^{-1}\int^{h}_{0}(f(x+t)-f(x-t))tdt=0. $$ Set $$ F(y):=y^{-1}\int^{y}_{0}(f(x+t)-f(x-t))tdt; $$ then $F(y)$ is continuous and differentiable on $\textbf{R}$ with $F(0)=0$ and $F'(0)=0$. The latter follows from the fact that $F(y)$ is differentiable on all of $\textbf{R}-\{0\}$ and from the problem condition: $$ \lim_{y\rightarrow 0}\frac{F(y)}{y}=0. $$ Also: $$ \lim_{y\rightarrow 0}F'(y)=\lim_{y\rightarrow 0}\left(f(x+y)-f(x-y)-\frac{F(y)}{y}\right)=0. $$ All of this holds since $$ \lim_{h\rightarrow 0}\frac{1}{h^3}\int^{h}_{0}\left(f(x+t)-f(x-t)\right)tdt=0.\tag 2 $$ Using the condition of the problem and Cauchy's mean value theorem (see below) we have $$ 0=\lim_{y\rightarrow 0^{+}}\frac{F(y)}{y^2}=\lim_{y\rightarrow 0^{+}}\frac{F'(y)}{2y}= $$ $$ =\lim_{y\rightarrow 0^{+}}\left(\frac{f(x+y)-f(x-y)}{2y}-\frac{1}{2y^3}\int^{y}_{0}(f(x+t)-f(x-t))tdt\right)= $$ $$ =\lim_{y\rightarrow 0^{+}}\frac{f(x+y)-f(x-y)}{2y}=(f_{s})'_{+}(x). $$ In the same way, $$ \lim_{y\rightarrow 0^{-}}\frac{f(x+y)-f(x-y)}{2y}=(f_{s})'_{-}(x), $$ and $(f_s)'_{-}(x)=(f_s)'_{+}(x)=f_s'(x)=\lim_{y\rightarrow 0}\frac{f(x+y)-f(x-y)}{2y}$ exists. However, there is the following

Theorem.

I) If for a function $f(x)$ the left derivative $f'_{-}(x)$ and the right derivative $f'_{+}(x)$ both exist, then $f'_s(x)$ exists and $f'_s(x)=\frac{1}{2}(f'_{-}(x)+f'_{+}(x))$.

II) If $f_s'(x)$ exists, it is not necessary that $f'(x)$ exists.

Hence we arrive at the result that $$ f'_s(x)=0\textrm{, }\forall x\in\textbf{R}.\tag 3 $$ However, (3) is not so well established, since (5) does not apply to functions that change sign infinitely many times in a neighborhood of $0$, for example $$ F(y)=y^3\sin\left(\frac{1}{y}\right). $$ What the L'Hospital-type argument actually gives is $$ \liminf_{h\rightarrow 0}\frac{f(x+h)-f(x-h)}{2h}\leq 0\textrm{ and }\limsup_{h\rightarrow 0}\frac{f(x+h)-f(x-h)}{2h}\geq 0.\tag 4 $$
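
To illustrate this caveat numerically (a small sketch, assuming numpy is available): for $F(y)=y^3\sin(1/y)$ one has $F(y)/y^2=y\sin(1/y)\to0$, while $F'(y)/(2y)=\frac{3y\sin(1/y)-\cos(1/y)}{2}$ keeps oscillating, so the last equality in (5) can fail for such functions.

```python
import numpy as np

def F(y):
    return y**3 * np.sin(1.0 / y)

def Fprime(y):
    # F'(y) = 3 y^2 sin(1/y) - y cos(1/y)
    return 3 * y**2 * np.sin(1.0 / y) - y * np.cos(1.0 / y)

for y in [1e-2, 1e-3, 1e-4, 1e-5]:
    print(y, F(y) / y**2, Fprime(y) / (2 * y))
# F(y)/y^2 tends to 0, but F'(y)/(2y) keeps oscillating between about -1/2 and 1/2
```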

Notes.

From the Cauchy mean value theorem on $[0,y]$, there always exists $\xi=\xi(y)$ with $0<\xi(y)<y$ such that $$ \frac{F(y)-F(0)}{y^2-0}=\frac{F'(\xi)}{2\xi}. $$ Hence, as $y\rightarrow 0^{+}$, from the continuity of $$ \frac{F'(t)}{2t}=\frac{f(x+t)-f(x-t)}{2t}-\frac{1}{2t^3}\int^{t}_{0}(f(x+t_1)-f(x-t_1))t_1dt_1 $$ on $(0,y)$, and knowing that $\lim_{y\rightarrow 0^{+}}\frac{F(y)}{y^2}=0$ and $\lim_{y\rightarrow 0^{+}}\xi(y)=0$, we have $$ 0=\lim_{y\rightarrow 0^{+}}\frac{F(y)}{y^2}=\lim_{y\rightarrow 0^{+}}\frac{F'(\xi(y))}{2\xi(y)}=\lim_{t\rightarrow 0^{+}}\frac{F'(t)}{2t}.\tag 5 $$ The same argument is used when $y\rightarrow0^{-}$.

  • 5
    It seems to me that you are applying L'Hospital's rule in the wrong direction: You argue that because the limit $\lim_{h \to 0} \frac{F(h)}{G(h)} $ exists, the limit $ \lim_{h \to 0} \frac{F'(h)}{G'(h)}$ exists as well. But L'Hospital's rule is the other way around. – Martin R May 23 '21 at 16:54
  • I think this is OK since it is given by the question. – Nikos Bagis May 23 '21 at 16:59
  • 1
    Please explain how you get $\lim_{h\to 0}\frac{f(x+h)-f(x-h)}{2h} = 0$. – Martin R May 23 '21 at 17:23
  • 3
    ... and even if that were true, the next step is wrong because suddenly you assume that the limit is zero uniformly in $x$. – Martin R May 23 '21 at 17:29
  • 3
    @NikosBagis: The use of L'Hospital is not correct here, as Martin R pointed out, unless one assumes that $f$ is symmetrically differentiable everywhere. In that case, it is a deep result that $f$ must be constant (Theorem 5.2). In the generality of the OP ($f$ is simply continuous) some other argument may need to be used. – Mittens May 23 '21 at 21:37
  • @OliverDiaz As by the extended mean value theorem, for every positive $h$, there exists $y\in]0,h[$ such that $h^{-3}\int_{-h}^h f(x+t)tdt=\frac1y(f(x+y)-f(x-y))$, Paresseux Nguyen's answer actually also provides a proof that any $f$ symmetrically differentiable everywhere with derivative 0 must be constant. – Helmut May 30 '21 at 20:32
  • @Helmut: I've no problem with Paresseux Nguyen's solution. My critique was the erroneous use of L'Hospital's rule by other responders. – Mittens May 30 '21 at 20:35
  • @OliverDiaz Sorry, I was not clear enough. I meant to say that the deep result you mention is actually proved by Paresseux Nguyen's solution and I find it elementary (but ingenious). – Helmut May 30 '21 at 23:56
  • @Helmut, I think the result I am referring to is more complicated. But it is irrelevant for the setting of the OP. – Mittens May 31 '21 at 00:01