
Let $(\Omega, \mathscr{F}, P)$ be a probability space and $\{X_i, i \in I\}\subset \mathbb{L}^0(\mathscr{F})$, where $I$ can be uncountable.

I want to prove that there exists a unique (in the $P$-a.s. sense) essential supremum $X := \operatorname{ess\,sup}^P_{i\in I} X_i$ in the following sense:

  • $X \in \mathbb{L}^0(\mathscr{F})$
  • $X\geq X_i$ $P$-a.s. for all $i \in I$
  • If another $\tilde{X}$ satisfies the above two properties, then $\tilde{X}\geq X$ $P$-a.s.

I know the definition of the essential supremum of a function, which is in fact a number. But here we have a family of functions, and the essential supremum is itself a function. How are these definitions connected to each other? Any suggestions on how I can solve this problem?
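For concreteness, here is a small numerical sketch of the issue (a toy family of my own, $X_t=\mathbf{1}_{\{U=t\}}$ for $t\in[0,1]$ and $U$ uniform on $[0,1]$; it is only an illustration, not a proof): every single $X_t$ is $0$ almost surely, so the essential supremum in the sense above should be $X\equiv 0$, yet the pointwise supremum over the whole uncountable family is identically $1$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy family (my own example): Omega = [0,1] with Lebesgue measure,
# U(omega) = omega uniform on [0,1], and X_t = indicator{U = t} for t in [0,1].
n_samples = 100_000
U = rng.random(n_samples)                      # samples of U ~ Uniform[0,1]

# Any fixed index t: X_t = 1{U == t} is 0 almost surely.
for t in [0.1, 0.25, np.pi / 4]:
    X_t = (U == t).astype(float)
    print(f"empirical E[X_t] for t = {t:.4f}: {X_t.mean()}")   # 0.0

# But the pointwise supremum over ALL t in [0,1] is identically 1,
# since for each omega the index t = U(omega) already gives the value 1.
sup_over_all_t = np.ones_like(U)
print("pointwise sup over the whole family:", sup_over_all_t.mean())  # 1.0

# The essential supremum in the sense of the three bullets above is X = 0:
# it dominates each single X_t almost surely, and any other a.s. majorant
# satisfies X_tilde >= X_t >= 0 a.s., hence X_tilde >= 0 a.s.
```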

  • A slight variant of the argument here (http://math.stackexchange.com/questions/1361495/existence-of-minimal-measurable-majorant-of-an-arbitrary-function-fx-rightarro) should show what you want. – PhoemueX Sep 15 '15 at 21:27

4 Answers


Below, I originally claimed that Zorn's Lemma is necessary for this proof, but this is not the case. See "Do we need Zorn lemma in the proof that every subset of measurable functions has an essential supremum?" for a proof, or this blog post.


John Dawkins' proof is almost correct, but one can use Zorn's Lemma to make some of the details work. Just like in his proof, we can assume the $X_i$ are bounded by replacing them with $\arctan X_i$.

Let $\mathcal P$ be the collection of random variables satisfying your first two properties. Namely, $\mathcal P=\{X\in \mathbb L^0(\mathscr F):X\ge X_i\ \text{a.s.}\ \forall i\in I\}$. Since the (transformed) $X_i$ take values in $(-\pi/2,\pi/2)$, we may replace any $X\in\mathcal P$ by $\min(X,\pi/2)$, which is still a majorant, so we may assume every element of $\mathcal P$ is a.s. bounded and has finite expectation. This is a partially ordered set (after identifying random variables that are a.s. equal), where $Y\ge Z$ means $Y\ge Z$ almost surely.

We will use Zorn's Lemma to show that $\mathcal P$ has a minimal element. To do this, we must show that any chain $\mathcal C\subset\mathcal P$ has a lower bound in $\mathcal P$. Given such a chain, let $\alpha=\inf_{Y\in \mathcal C} EY$, and choose a sequence $Y_n\in\mathcal C$ with $EY_{n}\downarrow\alpha$; replacing $Y_n$ by $\min(Y_1,\dots,Y_n)$ (which is a.s. equal to one of $Y_1,\dots,Y_n$, since $\mathcal C$ is a chain), we may assume the $Y_n$ are non-increasing. Letting $Y=\inf_n Y_{n}$, I claim that $Y$ is a lower bound for $\mathcal C$ in $\mathcal P$. Note first that $Y\in\mathcal P$, because $Y_n\ge X_i$ a.s. for every $n$ and every $i\in I$, hence $Y\ge X_i$ a.s. Given any $Z\in \mathcal C$, there are two cases.

  • If $Z\ge Y_n$ for some $n$, then clearly $Z\ge \inf Y_n=Y.$

  • If $Z\le Y_n$ for all $n$, then $Z\le Y$ as well, so $\alpha\le EZ\le EY\le\lim_n EY_n=\alpha$, hence $EZ=EY=\alpha$. By the bounded convergence theorem, $E[Y-Z]=\lim_nE[Y_n-Z]=\lim_n EY_n-EZ=\alpha-\alpha=0$, which means that $Z=Y$ a.s., so $Z\ge Y$.

We have shown $Z\ge Y$ for all $Z\in \mathcal C$, so every chain has a lower bound in $\mathcal P$.

Finally, by Zorn's Lemma, $\mathcal P$ has a minimal element $X$, and $X$ satisfies all three of your properties. The first two hold simply because $X\in\mathcal P$. For the third, let $\tilde X\in\mathcal P$; then $\min(X,\tilde X)\in\mathcal P$ and $\min(X,\tilde X)\le X$, so minimality forces $\min(X,\tilde X)=X$ a.s., that is, $\tilde X\ge X$ a.s. A sanity check of this last step is sketched below.
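On a finite sample space (with all outcomes of positive probability) the essential supremum is just the pointwise maximum, and the lattice step above can be checked directly. A minimal Python sketch with an invented finite family, purely for illustration:

```python
import numpy as np

# Toy finite sample space with five outcomes and given probabilities (invented).
probs = np.array([0.1, 0.2, 0.3, 0.25, 0.15])

# A small family {X_i}: each row lists the values of one random variable.
X = np.array([
    [1.0, 0.0, 2.0, 1.0, 0.0],
    [0.0, 3.0, 1.0, 1.0, 2.0],
    [2.0, 1.0, 0.0, 4.0, 1.0],
])

# On this space the essential supremum is the pointwise maximum.
ess_sup = X.max(axis=0)

# Property 2: ess_sup dominates every member of the family.
assert np.all(X <= ess_sup)

# The lattice step behind property 3: for ANY other majorant X_tilde,
# min(ess_sup, X_tilde) is still a majorant and is <= ess_sup, so a
# minimal majorant must coincide with it.
X_tilde = ess_sup + np.array([0.0, 1.0, 0.0, 2.0, 0.5])   # another majorant
candidate = np.minimum(ess_sup, X_tilde)
assert np.all(X <= candidate) and np.all(candidate <= ess_sup)
assert np.all(candidate == ess_sup)

print("essential supremum (pointwise max):", ess_sup)
print("E[ess sup] =", float(probs @ ess_sup))
```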

Mike Earnest

Let $g(x):=\arctan(x)$ and consider the real number $\beta:=\sup_{i\in I}E[g(X_i)]$. Choose a sequence $\{i_n\}$ of elements of $I$ such that $E[g(X_{i_n})]$ is monotone non-decreasing and $\beta=\lim_n E[g(X_{i_n})]$. The random variable $X:=\sup_n X_{i_n}$ has the properties of the essential supremum that you seek. The first property should be clear. Also, $g(X)\ge g(X_{i_n})$ for each $n$, so $\beta\ge E[g(X)]\ge E[g(X_{i_n})]$ for each $n$, hence $E[g(X)]=\beta$.

Now let $i$ be any fixed element of $I$. Define $X':=\max(X,X_i)$. Then $$ \beta\ge E[g(X')]\ge E[g(X)]=\beta, $$ and so $E[g(X')]=E[g(X)]$. But $X'\ge X$, so $g(X')\ge g(X)$, so $g(X')=g(X)$ a.s., and finally $X'=X$ a.s. This final equality amounts to the statement that $X\ge X_i$ a.s. As $i\in I$ was arbitrary, this proves that $X$ has the second required property. The third property follows in a similar way.

John Dawkins
  • @John, thanks for your answer. To make sure that I completely understood your solution, the point of using $g(X)$ was just to have an increasing function that makes sure the expected value is bounded, right? So, if you do not use such a function, where does your solution fail? – Mehdi Jafarnia Jahromi Sep 16 '15 at 05:09
  • Yes, transforming by $g$ ensures that all the expectations are finite and that $\beta$ is a finite number. – John Dawkins Sep 16 '15 at 16:17
  • In your fourth line, you say $\beta\ge E[g(X)]\ge E[g(X_{i_n})]$. How do you know $\beta\ge E[g(X)]$? This seems equivalent to saying $\sup_i EY_i\ge E(\sup_i Y_i)$, which is not true in general. – Mike Earnest Sep 17 '15 at 15:31
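A concrete instance of the inequality questioned in the last comment above, using a made-up two-indicator family (not taken from the answer): for disjoint events $A,B$ with $P(A)=P(B)=\tfrac12$, take $X_1=\mathbf 1_A$ and $X_2=\mathbf 1_B$. Then $\beta=\sup_i E[g(X_i)]=\tfrac12\arctan 1$, while a maximizing sequence that uses both indices gives $X=\sup_n X_{i_n}=\mathbf 1_{A\cup B}\equiv 1$ and $E[g(X)]=\arctan 1>\beta$, so $\beta\ge E[g(X)]$ can indeed fail. A small Python check:

```python
import numpy as np

g = np.arctan

# Toy two-point probability space: outcome 0 is the event A, outcome 1 is B.
probs = np.array([0.5, 0.5])
X1 = np.array([1.0, 0.0])   # X_1 = indicator of A
X2 = np.array([0.0, 1.0])   # X_2 = indicator of B

def E(Y):
    """Expectation on the two-point space."""
    return float(np.dot(probs, Y))

beta = max(E(g(X1)), E(g(X2)))        # sup_i E[g(X_i)] = arctan(1)/2 ~ 0.3927

# A maximizing sequence may alternate between the two indices (each already
# attains beta), so X = sup_n X_{i_n} = max(X1, X2) = 1 everywhere.
X = np.maximum(X1, X2)

print("beta            =", beta)
print("E[g(X)]         =", E(g(X)))               # arctan(1) ~ 0.7854
print("beta >= E[g(X)]?", beta >= E(g(X)))        # False
```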

Let $E=[0,1]$, let $\Gamma\subset E$ be the Vitali set (which is uncountable), and for each $\gamma\in\Gamma$, let $$f_{\gamma}(t)=\mathbf{1}_{\{\gamma\}}(t)=\begin{cases}1 &\text{if }t=\gamma,\\ 0 &\text{if }t\neq \gamma. \end{cases}$$

Then $\sup_{\gamma}f_\gamma=\mathbf{1}_{\Gamma}$, which is not a measurable function because $\Gamma$ is not a measurable set. So the pointwise supremum of an uncountable family of measurable functions need not be measurable, which is why the essential supremum cannot in general be taken pointwise.
