
In the book *Measures, Integrals and Martingales* by René L. Schilling, pages 125-126, Jensen's inequality is stated in the following way:

"Recall that a function $V:[a,b]\rightarrow\mathbb R$ on an interval $[a,b]\subset\overline{\mathbb R}$ is $convex$ if $$V(tx+(1-t)y)\leq tV(x)+(1-t)V(y),\qquad0<t<1,$$ holds for all $x,y\in[a,b]$.

$\dots$

If $V:[0,\infty)\rightarrow\mathbb R$ is convex, then the extension $V:[0,\infty]\rightarrow(-\infty,\infty]$, where $V(\infty):=+\infty$ is again convex.

Theorem 13.13 (Jensen's inequality) Let $(X,\mathcal{A},\mu)$ be a measure space and $\mu$ a probability measure.

(i) Let $V:[0,\infty)\rightarrow[0,\infty)$ be a convex function and extend it to $[0,\infty]$ as above: $$V\left(\int ud\mu\right)\leq\int V(u)d\mu\quad\forall u\in\mathcal{M}(\mathcal{A}),u\geq0.$$ In particular, if $V(u)\in\mathcal{L}^1(\mu)$, $u\geq0$, then $u\in\mathcal{L}^1(\mu)$.

$\dots$

"

Now the following case seems to violate this statement:

Consider the following measure space: $X:=(0,1]$, $\mathcal{A}:=\mathcal{B}((0,1])$, $\mu:=\lambda^1$. Also, define $u(x):=\frac{1}{x^2}$ for all $x\in(0,1]$. Now $u$ is measurable.

Let $V(x):=e^{-x}$ for $x\in[0,\infty)$, and let $V(\infty):=\infty$. Then $V$ is convex.

This seems to satisfy the requirements of Jensen's inequality. However, we get $$V\left(\int ud\mu\right)=V\left(\int_0^1\frac{1}{x^2}dx\right)=V(\infty)=\infty$$ while $$\int V(u)d\mu=\int_0^1 e^{-\frac{1}{x^2}}dx<\int_0^11dx=1.$$ Is there a mistake in my reasoning? Or is there a mistake in the book? Of course, the function $u$ is not in $\mathcal L^1(\mu)$ in this case. However, this is not a requirement in Theorem 13.13.
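For concreteness, here is a quick numerical sanity check of the two sides (a minimal sketch, assuming scipy is available; the printed values are approximate):

```python
# Sanity check for u(x) = 1/x^2, V(x) = exp(-x) on X = (0,1] with Lebesgue measure.
import numpy as np
from scipy.integrate import quad

# Right-hand side: int_0^1 exp(-1/x^2) dx; the integrand tends to 0 as x -> 0+.
rhs, err = quad(lambda x: np.exp(-1.0 / x**2), 0.0, 1.0)
print(f"int_0^1 exp(-1/x^2) dx ~ {rhs:.4f}")   # ~ 0.089, comfortably below 1

# Left-hand side: int_0^1 x^(-2) dx = +inf, so V(int u dmu) = V(+inf) := +inf,
# which indeed exceeds the finite right-hand side.
print("V(int u dmu) = V(+inf) = +inf")
```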

Riemann
  • Convex implies continuous, but the extension of $V$ is not continuous. Theorem does not apply. – AlvinL Nov 08 '21 at 14:32
  • The assertion "If $V:[0,\infty)\rightarrow\mathbb R$ is convex, then the extension $V:[0,\infty]\rightarrow(-\infty,\infty]$, where $V(\infty):=+\infty$ is again convex." is simply false, as the example shows. – David C. Ullrich Nov 08 '21 at 14:45
  • Also the claim $V(u) \in L^1$ implies $u\in L^1$ is wrong: Take $V\equiv0$. – daw Nov 08 '21 at 16:00
  • Not sure how this is proven in the book, but the standard proof using subgradients breaks down if $\int u\ d\mu=+\infty$. – daw Nov 08 '21 at 16:08

1 Answer


The result is wrong as stated: Just take $V\equiv 0$ and non-integrable $u$.

But it can be repaired in the following way.

Assume that $V:[0,+\infty) \to [0,+\infty]$ is convex. Set $V(+\infty) := \lim_{x\to\infty} V(x) \in [0,+\infty]$; the limit exists because a convex function is eventually monotone, and it is nonnegative since $V\ge0$. Then for all measurable $u\ge0$, $$ V\left(\int u \, d \mu\right) \le \int V(u) \, d \mu. $$ In addition, if $V(+\infty)=+\infty$, then $V(u) \in L^1(\mu)$ implies $u\in L^1(\mu)$.

We only need to consider the case $\int u \, d \mu=+\infty$; if the integral is finite, the classical Jensen inequality applies.

If $V(+\infty) < +\infty$, then $V(+\infty) = \inf_{x\in[0,\infty)} V(x)$ by convexity (a convex function with a finite limit at $+\infty$ is non-increasing), and Jensen's inequality is fulfilled trivially.
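Spelled out: since $\mu(X)=1$ and $V\ge\inf V$ pointwise, $$ \int V(u)\,d\mu \;\ge\; \inf_{x\in[0,\infty)} V(x)\,\mu(X) \;=\; V(+\infty) \;=\; V\left(\int u\,d\mu\right). $$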

Now assume $V(+\infty)=+\infty$. Then there are $x_1<x_2$ such that $V(x_1) < V(x_2) <+\infty$, and by convexity the line through $(x_1,V(x_1))$ and $(x_2,V(x_2))$ lies below the graph of $V$ for all $x>x_2$. Hence there are $a>0$ and $b\in\mathbb R$ such that $V(x)\ge \max(0, ax+b)$ for all $x$ (decreasing $b$ if necessary, we may assume $-b/a \ge x_2$, so that $ax+b>0$ only where the secant bound applies). Then $$ \int V(u) \, d \mu \ge \int_{\{u > -b/a\}} V(u) \, d\mu \ge \int_{\{u > -b/a\}} (au+b) \, d\mu. $$ Since $\int u \, d\mu=+\infty$ and $\mu(X)=1$, we have $$ +\infty = a\int u\, d\mu + b = \int_{\{u > -b/a\}} (au+b) \, d\mu + \underbrace{\int_{\{u \le -b/a\}} (au+b) \, d\mu}_{\in \mathbb R}, $$ where the last integral is real because its integrand takes values in $[b,0]$ and $\mu$ is finite. It follows that $\int_{\{u > -b/a\}} (au+b) \, d\mu=+\infty$, hence $\int V(u) \, d\mu=+\infty$ and the inequality holds.
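In particular, with this choice of extension the example from the question no longer contradicts the inequality: for $V(x)=e^{-x}$ the repaired extension is $V(+\infty)=\lim_{x\to\infty}e^{-x}=0$ (not $+\infty$), and indeed $$ V\left(\int_0^1\frac{1}{x^2}\,dx\right) = V(+\infty) = 0 \;\le\; \int_0^1 e^{-1/x^2}\,dx. $$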

daw