
[image: excerpt of the proof under discussion]

I understand the first two sentences of the proof, but I cannot see how the third and final sentence holds. Why should $\mu(f \leq a) = 0$? Surely it should be non-zero, since $c$ is defined as the infimum of the values $a$ for which $\mu(f > a) = 0$? I also don't understand how $\mu(f \geq a)$ can equal $0$ if $\mu(f \leq a) = 0$.


1 Answer


There are two considerations:

Suppose $a < c$. Then $\mu(f > a) > 0$ by definition of $c$. Since $\{f \leq a\} = \{f > a\}^c$, it follows from the first sentence that $\mu(f \leq a) = 0$. Therefore $f > a$ almost everywhere. It follows that $f \geq c$ almost everywhere, since $a < c$ was arbitrary.
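To make the last step explicit: since $a < c$ was arbitrary, we may take $a = c - 1/n$ for each $n \geq 1$, and a countable union of null sets is null, so

$$\{f < c\} \;=\; \bigcup_{n \geq 1} \left\{ f \leq c - \tfrac{1}{n} \right\}, \qquad \mu(f < c) \;\leq\; \sum_{n \geq 1} \mu\!\left( f \leq c - \tfrac{1}{n} \right) = 0,$$

which is exactly the statement that $f \geq c$ almost everywhere.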

On the other hand, suppose $a > c$. Then $\mu(f > a) = 0$ by definition of $c$, and it follows that $f \leq a$ almost everywhere. Therefore $f \leq c$ almost everywhere, since $a > c$ was arbitrary.
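Similarly, taking $a = c + 1/n$ for each $n \geq 1$,

$$\{f > c\} \;=\; \bigcup_{n \geq 1} \left\{ f > c + \tfrac{1}{n} \right\}, \qquad \mu(f > c) \;\leq\; \sum_{n \geq 1} \mu\!\left( f > c + \tfrac{1}{n} \right) = 0,$$

so $f \leq c$ almost everywhere. Together the two considerations give $f = c$ almost everywhere.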