
Assume that $\|f\|_p< \infty$ for $1\le p<\infty$. In this question we showed that $$ g(p)=\|f\|_p $$ is continuous in $p \ge 1$. The technique was to use the Dominated Convergence Theorem.

In $\varepsilon$-$\delta$ language, this means that for any $\varepsilon>0$ there is a $\delta(\varepsilon)>0$ such that $|q-p| < \delta(\varepsilon)$ implies $$ \left | \|f\|_p-\|f\|_q \right| \le \varepsilon. $$

My question is the following: can we characterize $\delta(\varepsilon)$ more explicitly in terms of $\varepsilon$, i.e., find an expression for $\delta$?

Observe that $\delta$ should probably be a function of $p$ as well; otherwise I don't think this is possible.

Boby

1 Answer


Here's a super-soft answer. Fix a measurable function $f$ such that $f\in L^p$ for all $p\in (p_-, p_+)$ ($p_+$ possibly being $\infty$). Let $$\Phi\left(\frac 1 p\right)=\left[ \int \lvert f\rvert^p\right]^\frac{1}{p}.$$ This function $\Phi$ is log-convex on the interval $\left(\frac1{p_+}, \frac1{p_-}\right)$, meaning that it satisfies the following inequality: $$ \Phi\left( (1-\alpha)\frac1{p_1} + \alpha \frac1{p_2}\right)\le \Phi\left(\frac1{p_1}\right)^{1-\alpha}\Phi\left(\frac{1}{p_2}\right)^{\alpha}$$ where $p_1, p_2\in (p_-, p_+)$ and $\alpha\in [0, 1]$. (This inequality is a consequence of Hölder's inequality and it gives an alternative proof of the continuity of $\|f\|_p$ with respect to $p$).
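For completeness, here is a sketch of the Hölder argument behind that inequality (my addition, not part of the original answer). Write $\frac 1p = \frac{1-\alpha}{p_1} + \frac{\alpha}{p_2}$ and split $\lvert f\rvert^p$ into two factors:

```latex
% Sketch: log-convexity via Hölder's inequality.
% With 1/p = (1-\alpha)/p_1 + \alpha/p_2, the exponents
% p_1/((1-\alpha)p) and p_2/(\alpha p) are conjugate, since
% (1-\alpha)p/p_1 + \alpha p/p_2 = p \cdot (1/p) = 1.
\begin{align*}
\int \lvert f\rvert^{p}
  &= \int \lvert f\rvert^{(1-\alpha)p}\,\lvert f\rvert^{\alpha p} \\
  &\le \left(\int \lvert f\rvert^{p_1}\right)^{\frac{(1-\alpha)p}{p_1}}
       \left(\int \lvert f\rvert^{p_2}\right)^{\frac{\alpha p}{p_2}}.
\end{align*}
% Taking p-th roots yields
% \|f\|_p \le \|f\|_{p_1}^{1-\alpha} \|f\|_{p_2}^{\alpha},
% which is exactly the log-convexity of \Phi in the variable 1/p.
```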

Now any log-convex function is convex and any convex function is Lipschitz on compact subintervals of its interval of definition (one says that it is locally Lipschitz). So $\Phi$ is locally Lipschitz on $\left(\frac1{p_+}, \frac1{p_-}\right)$, which means that

$$ \left\lvert \|f\|_{L^{p_1}} - \|f\|_{L^{p_2}}\right\rvert \le C_{f, I}\left\lvert \frac1{p_1} -\frac1{p_2}\right\rvert,\qquad \forall p_1, p_2\in I$$ where $I\subset (p_-, p_+)$ is a compact interval.
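As a sanity check (my addition, not part of the original answer), one can verify this bound numerically for a concrete $f$ where the norm has a closed form: for $f(x)=e^{-x}$ on $(0,\infty)$ we have $\|f\|_p = \left(\int_0^\infty e^{-px}\,dx\right)^{1/p} = (1/p)^{1/p}$, so $\Phi(s) = s^s$ with $s = 1/p$. The script below estimates the Lipschitz constant $C_{f,I}$ on a compact interval $I$ as the largest difference ratio over a grid; the interval $[1.5, 4]$ and the grid spacing are arbitrary choices for illustration.

```python
import math

def lp_norm(p):
    """||f||_p for f(x) = exp(-x) on (0, inf), in closed form: (1/p)^(1/p)."""
    return (1.0 / p) ** (1.0 / p)

# Grid of exponents p in the compact interval I = [1.5, 4].
ps = [1.5 + 0.01 * k for k in range(251)]

# Estimate C_{f,I} as the maximum difference ratio of Phi(1/p) = ||f||_p
# with respect to the variable 1/p, taken over all grid pairs.
C = max(
    abs(lp_norm(p1) - lp_norm(p2)) / abs(1.0 / p1 - 1.0 / p2)
    for p1 in ps
    for p2 in ps
    if p1 != p2
)

# By construction, | ||f||_{p1} - ||f||_{p2} | <= C |1/p1 - 1/p2| on the
# grid; C stays modest on this interval, consistent with the answer.
print(C)
```

Shrinking the left endpoint of $I$ toward $1$ (or letting it grow without bound for an $f$ that is only in $L^p$ for small $p$) can make the estimated constant blow up, which matches the caveat in the comments below.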

  • Thanks. One question: can the constant $C$ be made explicit? Where would I find it other than in the post you gave me? – Boby Oct 05 '15 at 21:36
  • @Boby: Actually, this post is slightly wrong. A convex function $\Phi$ on $I=[a, b]$ is Lipschitz on $I_\epsilon=[a+\epsilon, b-\epsilon]$ with constant given by the maximum of its difference ratio over $I_\epsilon$. This constant might tend to $\infty$ as $\epsilon \to 0$. This basic concept about convex functions is explained very well (IMHO) in the book "Convexity" by Webster (look for the "Three Chords Lemma"). – Giuseppe Negro Oct 05 '15 at 21:40
  • Thanks for the reference, I will definitely check it out. Can you explain what you mean by "maximum of its difference ratio over $I_\epsilon$" ? – Boby Oct 05 '15 at 21:44
  • I mean $$\max \left\{\left\lvert \frac{\Phi(p+h)-\Phi(p)}{h}\right\rvert\ :\ p\in I_\epsilon, \ p+h\in I_\epsilon\right\}$$ – Giuseppe Negro Oct 05 '15 at 21:48
  • Ok. Easy. Thanks. I will definitely check that book out. – Boby Oct 05 '15 at 21:50
  • @Boby: If $p\mapsto \left[\int \lvert f \rvert^p \right]^{\frac1p}$ is differentiable (which should happen if $f$ is nice enough, but I have not checked very carefully), you can characterize that constant as the maximum of the absolute value of the derivative. HTH – Giuseppe Negro Oct 06 '15 at 00:41
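Following up on that last comment, here is what the derivative works out to if one may differentiate under the integral sign (a sketch under that assumption, not something verified in the thread):

```latex
% Hedged sketch: derivative of g(p) = ||f||_p, assuming differentiation
% under the integral sign is justified. Write N(p) = \int |f|^p.
\begin{align*}
\log g(p) &= \frac{1}{p}\,\log N(p),
\qquad
N'(p) = \int \lvert f\rvert^{p}\,\log\lvert f\rvert, \\
\frac{g'(p)}{g(p)}
  &= -\frac{1}{p^{2}}\,\log N(p) + \frac{1}{p}\,\frac{N'(p)}{N(p)}.
\end{align*}
% Since the answer's Lipschitz bound is in the variable 1/p, the relevant
% slope is d\Phi/d(1/p) = -p^2 \, g'(p) by the chain rule.
```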