
Let $f,g:X\to \mathbb{R}$ be integrable functions on a measure space $(X,\mathcal{F}, \mu)$. Define $v=(f,g)$ and $\int v\,d\mu=(\int f\,d\mu, \int g\,d\mu)$, and define the (pointwise) $2$-norm $\|(f,g)\|_2=\sqrt{f^2+g^2}$. Show that $$ \left\|\int v\,d\mu\right\|_2\le\int\|v\|_2\,d\mu $$
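As a quick numerical sanity check (not part of any proof), the inequality can be verified on a discrete measure space, where both integrals reduce to weighted sums; the atom masses and function values below are arbitrary illustrative choices:

```python
import numpy as np

# Discrete measure space with 5 atoms; masses mu_i and values f_i, g_i
# are arbitrary (chosen via a fixed RNG seed for reproducibility).
rng = np.random.default_rng(0)
mu = rng.uniform(0.1, 2.0, size=5)   # measure of each atom
f = rng.normal(size=5)               # values of f on each atom
g = rng.normal(size=5)               # values of g on each atom

# Left side: || integral of v ||_2 = ||(sum f_i mu_i, sum g_i mu_i)||_2
lhs = np.hypot(np.sum(f * mu), np.sum(g * mu))

# Right side: integral of ||v||_2 = sum sqrt(f_i^2 + g_i^2) mu_i
rhs = np.sum(np.hypot(f, g) * mu)

assert lhs <= rhs
```

Rerunning with any other seed or atom count gives the same conclusion, since for finite sums the inequality is just the triangle inequality.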


My idea is to define an orthogonal transformation $F$ such that $$F(v)=F(f,g):=(\|v\|_2,0)$$

Then $\|v\|_2=\|Fv\|_2$.

Also, $$ \|F(\int v)\|_2=\|\int v\|_2 $$ because $F\int v=(\|\int v\|_2, 0)$.

Since the $2$-norm is convex (see "Why is every $p$-norm convex?"), applying Jensen's inequality gives $$ \|F\int v\|_2=\|\int v\|_2\le \int \|v\|_2 $$

My question is: I used Jensen's inequality here, but if I applied Jensen's inequality directly, wouldn't that prove the result immediately? I feel like I might be engaging in circular reasoning.

Hermi
  • Every norm is convex in the relevant sense, and you can certainly apply Jensen’s inequality. You need to properly normalize your measure to a probability measure first, but other than that I don’t see any problem with it. – David Gao Mar 12 '24 at 18:30
  • See also my comment under this question: https://math.stackexchange.com/questions/4877969/how-can-i-prove-that-vert-int-xfd-mu-vert-leq-int-x-vert-f-vert-d-mu#comment10402578_4877969 – David Gao Mar 12 '24 at 18:31
  • @DavidGao We only learned the one-dimensional Jensen's inequality, so I assume we are only allowed to use that one. Can we prove this using the classical Jensen's inequality? – Hermi Mar 12 '24 at 19:06
  • Using Jensen’s inequality in this specific question seems to require the 2-d version of it. Is that not allowed? – David Gao Mar 12 '24 at 19:34
  • If that is not allowed, my suggestion is just to prove the result for simple functions (which is just triangle inequality) and then approximate arbitrary functions using simple functions, the same strategy I suggested in comments of the question I linked to. I don’t know how to apply the 1-d Jensen’s inequality to this. – David Gao Mar 12 '24 at 19:39
  • @DavidGao So I mean, if we need to apply the 2-d version, we would need to prove it first. – Hermi Mar 12 '24 at 21:10
  • @DavidGao Do you think the fact that the $L_2$ norm is rotation invariant would be useful? – Hermi Mar 12 '24 at 21:11
  • @DavidGao Also, how to "normalize your measure to a probability measure first"? – Hermi Mar 12 '24 at 21:28
  • If you want to prove the 2-d Jensen’s inequality, it’s the same proof, just use the existence of subdifferentials. I feel like it’s more trouble than just saying approximation by simple functions though, but that’s more personal preference. – David Gao Mar 12 '24 at 21:54
  • I’m not sure how would rotation invariance help you. I guess you can try rotating $v$ to a fixed 1-d subspace? But that wouldn’t be a constant rotation, so you can’t pull it out of the integral anyway. – David Gao Mar 12 '24 at 21:56
  • Jensen’s inequality only works if you have a probability space, and your question refers to a general measure space. So you would need to normalize your measure to a probability measure. If $\mu$ is finite, just divide by $\mu(X)$. Otherwise, if $\mu$ is only $\sigma$-finite, you need to perform an infinite sum over the finite parts. – David Gao Mar 12 '24 at 21:59
  • @DavidGao So there are only two cases for $\mu$? Also, if $\mu$ is $\sigma$-finite, how do we normalize it? For finite $\mu$, can we just define $\nu(A)=\mu(A)/\mu(X)$, so that $\nu$ is a probability measure? – Hermi Mar 12 '24 at 22:14
  • For $\sigma$-finite measures, the measure space is a countable disjoint union of finite measure spaces. The integrals in your question can then be turned into an infinite sum of integrals over finite measure spaces. Use the result for finite measure spaces and triangle inequality to proceed after that. This should be enough hint for you to fill in the details. – David Gao Mar 12 '24 at 22:39
  • @DavidGao That's ok. I just found the Jensen's inequality holds for measure space on $\mathbb{R}$. See Theorem 2.2 on page 44 in https://cas.nvsu.edu.ph/e-library/books/undergrad/bsmath/Linear%20Programming/Real%20Analysis/07.(Elliott-H.-Lieb,-Michael-Loss)-Analysis.pdf. – Hermi Mar 12 '24 at 22:46
  • @DavidGao Can you please write a proof using simple function approximation? – Hermi Mar 13 '24 at 05:46

1 Answer


Assume first that $v$ is a simple function, that is, $v = \sum_{i=1}^n (f_i, g_i) \chi_{A_i}$ for some disjoint $A_1, \cdots, A_n \in \mathcal{F}$ with finite measures and $f_1, \cdots, f_n, g_1, \cdots, g_n \in \mathbb{R}$. Then,

$$\begin{split} \|\int v \, d\mu\| &= \|\sum_{i=1}^n (f_i, g_i) \mu(A_i)\|\\ &\leq \sum_{i=1}^n \mu(A_i) \|(f_i, g_i)\|\\ &= \int [\sum_{i=1}^n \|(f_i, g_i)\| \chi_{A_i}] \, d\mu\\ &= \int \|v\| \, d\mu \end{split}$$

where the inequality on the second line is the triangle inequality. So the result is true for simple functions. In the general case, let $v_n = (f_n, g_n)$ be a sequence of simple functions converging to $v = (f, g)$ pointwise and with $\|v_n\| \leq \|v\|$ pointwise. By the dominated convergence theorem, $\int v_n \, d\mu \to \int v \, d\mu$. Similarly, $\|v_n\| \to \|v\|$ pointwise, so again by dominated convergence, $\int \|v_n\| \, d\mu \to \int \|v\| \, d\mu$. Since $\|\int v_n \, d\mu\| \leq \int \|v_n\| \, d\mu$ for all $n$ (as the $v_n$ are simple), we obtain the desired result by taking limits of both sides as $n \to \infty$.
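To make the simple-function step concrete, here is a small numerical sketch (the sets $A_i$ are represented only by their measures, and the values $(f_i, g_i)$ are arbitrary illustrative choices) showing that the inequality for a simple function is exactly the triangle inequality applied to a finite sum of vectors:

```python
import numpy as np

# Simple function v = sum_i (f_i, g_i) * chi_{A_i} on three disjoint sets.
# mu(A_i) and the values (f_i, g_i) below are illustrative assumptions.
mu_A = np.array([0.5, 1.5, 2.0])                          # mu(A_1), mu(A_2), mu(A_3)
vals = np.array([[1.0, -2.0], [3.0, 0.5], [-1.0, 1.0]])   # rows are (f_i, g_i)

# ||int v dmu|| = || sum_i (f_i, g_i) mu(A_i) ||
lhs = np.linalg.norm((vals * mu_A[:, None]).sum(axis=0))

# int ||v|| dmu = sum_i mu(A_i) ||(f_i, g_i)||  (the triangle-inequality bound)
rhs = np.sum(mu_A * np.linalg.norm(vals, axis=1))

assert lhs <= rhs
```

The dominated-convergence step then transfers this finite-sum inequality to general integrable $v$.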

Note: this proof does not use any property of the $2$-norm apart from the fact that it is a norm. So, the result is still valid if you change the $2$-norm to any other norm on $\mathbb{R}^2$. In the same vein, the result is true not just for dimension $2$, but will also hold for any Banach space.

David Gao
  • I am confused about your last line: why does $\sum \|(f_i, g_i)\|_2\chi_{A_i}=\|v\|_2$ hold? I believe that $\|v\|_2=\|\sum (f_i, g_i)\chi_{A_i}\|_2$. So it seems that $\|\sum (f_i, g_i)\chi_{A_i}\|_2\le \sum\|(f_i, g_i)\|_2\chi_{A_i}$, right? – Hermi Mar 13 '24 at 20:48
  • For the second part, here we assume that $v_n\to v$ pointwise. But how can we have $\|v_n\|_2\to \|v\|_2$? It seems this $L_2$ convergence is stronger? – Hermi Mar 13 '24 at 20:57
  • @Hermi Minkowski inequality is just the triangle inequality for $L^p$ spaces, so yeah. – David Gao Mar 13 '24 at 20:57
  • @Hermi $A_i$ are disjoint and you’re applying the norm pointwise. So on any $A_i$, $v$ takes the value $(f_i, g_i)$, so $\|v\|$ takes the value $\|(f_i, g_i)\|$. – David Gao Mar 13 '24 at 20:58
  • @Hermi Why should it be? Your $2$-norm is applied pointwise on $\mathbb{R}^2$. This is just saying that if a sequence $x_n \to x$ in $\mathbb{R}^2$, then $\|x_n\| \to \|x\|$. – David Gao Mar 13 '24 at 21:00
  • So we already have that $\|v\|_2$ is integrable? Otherwise, if $\int \|v\|=\infty$, we are done. – Hermi Mar 13 '24 at 21:04
  • @Hermi You are assuming $f, g$ are integrable. Since $\|v\| \leq |f| + |g|$, we do have that $\|v\|$ is integrable. – David Gao Mar 13 '24 at 21:08
  • It seems that $\|v\|_2=(|f|^2+|g|^2)^{1/2}\le 2^{1/2}\max\{|f|, |g|\}$? – Hermi Mar 13 '24 at 21:09
  • @Hermi You can also do that. It doesn’t really matter. This is just saying the $p$-norms on $\mathbb{R}^2$ are all equivalent. – David Gao Mar 13 '24 at 21:14