
My question is how to interpret a sigma algebra, especially in the context of probability theory (stochastic processes included). Is there some clear and general way to interpret a sigma algebra that can unify the various ways it is described: as history, as future, as a collection of information, as the class of size/likelihood-measurable sets, etc.?

Specifically, I hope to learn how to interpret the following in some consistent way:

  • being given/conditional on a sigma algebra
  • a subset being measurable or nonmeasurable w.r.t. a sigma algebra
  • a mapping being measurable or nonmeasurable w.r.t. a sigma algebra on its domain and another sigma algebra on its codomain
  • a collection of increasing sigma algebras, i.e. a filtration of sigma algebras
  • ...

Following is a list of examples that I have come across. They are nice examples, but the interpretations they suggest do not feel clear and consistent enough for me to apply in practice. Even if there is no unified way to interpret all the examples, I would like to know what the different ways of interpretation are.

  1. Stopping time

    Let $(I, \leq)$ be an ordered index set, and let $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \in I}, \mathbb{P})$ be a filtered probability space.

    Then a random variable $\tau : \Omega \to I$ is called a stopping time if $\{ \tau \leq t \} \in \mathcal{F}_{t}$ for all $t \in I$.

    Speaking concretely, for τ to be a stopping time, it should be possible to decide whether or not $\{ \tau \leq t \}$ has occurred on the basis of the knowledge of $\mathcal{F}_t$, i.e., event $\{ \tau \leq t \}$ is $\mathcal{F}_t$-measurable.

    I am still wondering how exactly one can "decide whether or not $\{ \tau \leq t \}$ has occurred on the basis of the knowledge of $\mathcal{F}_t$", i.e., what it means for the event $\{ \tau \leq t \}$ to be $\mathcal{F}_t$-measurable. (The toy random-walk sketch after this list is my attempt to make this concrete.)

  2. Martingale process

    If a stochastic process $Y : T \times \Omega \rightarrow S$ is a martingale with respect to a filtration $\{ \Sigma_t\}$ and probability measure $P$, then for all $s$ and $t$ with $s < t$, $$Y_s = \mathbf{E}_{P} ( Y_t \mid \Sigma_s ),$$

    where $\Sigma_s $ is interpreted as "history".

    I am also wondering how $\Sigma_s$ with $s < t$ can act as the history, $\Sigma_s$ with $s = t$ as the present, and $\Sigma_s$ with $s > t$ as the future. (The sketch after this list also checks this martingale identity numerically.)

  3. I originally interpreted a measurable subset w.r.t. a sigma algebra as a subset whose "size"/"likelihood" can be measured, with the class of such size-measurable subsets being closed under complement and countable union.
  4. In a post by Nate Eldredge, a measurable subset w.r.t. a sigma algebra is interpreted by analogy with questions that can be answered:

    If I know the answer to a question $A$, then I also know the answer to its negation, which corresponds to the set $A^c$ (e.g. "Is the dodo not-extinct?"). So any information that is enough to answer question $A$ is also enough to answer question $A^c$. Thus $\mathcal{F}$ should be closed under taking complements. Likewise, if I know the answer to questions $A,B$, I also know the answer to their disjunction $A \cup B$ ("Are either the dodo or the elephant extinct?"), so $\mathcal{F}$ must also be closed under (finite) unions. Countable unions require more of a stretch, but imagine asking an infinite sequence of questions "converging" on a final question. ("Can elephants live to be 90? Can they live to be 99? Can they live to be 99.9?" In the end, I know whether elephants can live to be 100.)
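
To make examples 1 and 2 concrete for myself, here is a toy sketch in Python (my own code with made-up names such as `hit_by`; nothing in it comes from the quoted sources). For a simple symmetric random walk I take "the information in $\mathcal{F}_t$" to be the path up to time $t$: the first time the walk reaches a level is a stopping time, because $\{\tau \le t\}$ can be decided from that prefix alone, while the time of the overall maximum cannot be decided this way. The same sketch checks the martingale identity $E[Y_t \mid \mathcal{F}_s] = Y_s$ by freezing the path up to time $s$ and averaging over fresh continuations.

```python
# Toy sketch: a simple symmetric random walk observed step by step.
# "The information in F_t" is modelled as the path up to time t.
import random

random.seed(0)
T = 20          # time horizon
LEVEL = 3       # level whose first hitting time we watch

def sample_path(T):
    """One path of a simple symmetric random walk: Y_0 = 0, Y_k = Y_{k-1} +/- 1."""
    path = [0]
    for _ in range(T):
        path.append(path[-1] + random.choice((-1, 1)))
    return path

path = sample_path(T)

# tau = first time the walk reaches LEVEL.  To decide whether {tau <= t}
# has occurred we only need path[:t+1], i.e. the information in F_t,
# so tau is a stopping time.
def hit_by(path_up_to_t, level):
    return any(y >= level for y in path_up_to_t)

for t in (5, 10, 20):
    print(t, hit_by(path[:t + 1], LEVEL))

# By contrast, "the time at which the walk attains its maximum over [0, T]"
# cannot be decided from path[:t+1] alone (the walk might still climb higher
# later), so it is not a stopping time.
argmax_time = max(range(T + 1), key=lambda k: path[k])
print("time of overall maximum:", argmax_time)

# Martingale sanity check of E[Y_t | F_s] = Y_s: freeze the path up to time s
# and average Y_t over many fresh continuations of the remaining t - s steps.
s, t = 5, 15
continuations = []
for _ in range(100_000):
    y = path[s]
    for _ in range(t - s):
        y += random.choice((-1, 1))
    continuations.append(y)
print(path[s], sum(continuations) / len(continuations))  # the two numbers should be close
```

Of course this only treats a finite, discrete-time example in which $\mathcal{F}_t$ is generated by the first $t$ steps; I am asking about the interpretation in general.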

Thanks in advance for sharing your views, and any reference that has related discussion is also appreciated!

Tim
  • Elements of the sigma-algebra are events, and for an element A, P(A) is the probability that event A occurs. What more is there to say? – Qiaochu Yuan Feb 26 '11 at 00:55
  • Such as how to interpret: being given/conditional on a sigma algebra, a subset being measurable w.r.t. a sigma algebra, a mapping being measurable w.r.t. a sigma algebra in domain and another sigma algebra in codomain, ... – Tim Feb 26 '11 at 01:01
  • Tim: 3 is clear (but it is better to interpret the elements in the $\sigma$-algebra as events, as suggested by Qiaochu), while what you are wondering about in 1 and 2 is a little vague... BTW, which book are you using to study probability? – Morning Feb 26 '11 at 15:07
  • @Morning: Thanks! I mostly use Wikipedia or whatever I can find via Google to get a big picture and details. I have access to quite a few books, and chances are if you name it then I can have it. But I feel intimidated to read them all, and to read which is a big question to me. – Tim Feb 26 '11 at 15:24
  • @Tim: I would recommend following a textbook instead of Wikipedia. In this case, you would have far fewer problems with consistency of definitions (most of the time, terminologies). A Probability Path by Resnick is a good one and not too mathy. If you are confident at math, then try Probability: Theory and Examples (http://www.math.duke.edu/~rtd/PTE/pte.html) by Rick Durrett. – Morning Feb 26 '11 at 15:34
  • Wikipedia math pages should all have a big banner at the top making it explicit that using them to actually learn some subject is a terrible, terrible idea! – Mariano Suárez-Álvarez Feb 26 '11 at 16:52

3 Answers


Gambling is a good starting point for probability. We can treat a $\sigma$-field as a structure on events, much as we need addition and multiplication as structure on numbers; just as the completeness of the real numbers is what makes our calculations work, the $\sigma$-field plays the analogous role for events.

I hope the following gambling example helps you to understand the filtration and conditional expectation.

Suppose that two people, say player A and player B, bet on the results of two coin tosses (H: head, T: tail).

At time $0$, A and B do not know anything about the result except that one of the outcomes in $\Omega=\{HH,HT,TH,TT\}$ will happen. Hence the information that they both have at time $0$ is $\mathcal{F}_0=\{\emptyset,\Omega\}$.

At time $1$, the coin has been tossed only once, and they know that the events in the $\sigma$-field $\mathcal{F}_1=\{\emptyset, \Omega, \{HH,HT\},\{TH,TT\}\}\supset \mathcal{F}_0$ could happen.

At time $2$, the coin has been tossed twice, and they know that the events in the $\sigma$-field $\mathcal{F}_2=\{\emptyset, \Omega,\{HH,HT\},\{TH,TT\},\{HH\},\{HT\},\{TH\},\{TT\}\}\supset \mathcal{F}_1$ could happen, which means they know everything about the gambling results.

Please notice the evolution of information captured by the filtration $\mathcal{F}_0\subset\mathcal{F}_1\subset\mathcal{F}_2$. As time passes, the unknown world $\Omega$ is divided more and more finely. It is something like water flowing through a branching system of pipes.
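
Here is a small sketch of this picture (my own toy Python, nothing standard): represent the information at each time by the partition of $\Omega$ into its finest distinguishable events, the "atoms". An event is then $\mathcal{F}_t$-measurable exactly when it is a union of time-$t$ atoms, i.e. when the question "did it occur?" can be answered from what is known at time $t$.

```python
# Toy sketch: information at each time = a partition of Omega into "atoms",
# the finest events that can be distinguished at that time.
OMEGA = frozenset({"HH", "HT", "TH", "TT"})

partitions = {
    0: [OMEGA],                                             # nothing observed yet
    1: [frozenset({"HH", "HT"}), frozenset({"TH", "TT"})],  # first toss observed
    2: [frozenset({w}) for w in OMEGA],                     # both tosses observed
}

def is_measurable(event, t):
    """True iff `event` is a union of atoms of the time-t partition,
    i.e. the question "did `event` occur?" can be answered at time t."""
    return all(atom <= event or atom.isdisjoint(event) for atom in partitions[t])

first_toss_H = frozenset({"HH", "HT"})
both_heads = frozenset({"HH"})

print(is_measurable(first_toss_H, 1))  # True:  decided by the first toss
print(is_measurable(both_heads, 1))    # False: needs the second toss as well
print(is_measurable(both_heads, 2))    # True
```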

Assume that they bet on the following payoff and that the coin is fair. $$X(\omega)=\begin{cases} 2, & \omega=HH \text{ (first toss H, second toss H)}\\ 1, & \omega=HT \text{ (first toss H, second toss T)}\\ 1, & \omega=TH \text{ (first toss T, second toss H)} \\ 0, & \omega=TT \text{ (first toss T, second toss T)} \end{cases}$$

Then, we have

$$E[X\mid\mathcal{F}_0](\omega)=1\qquad\text{for every}\ \omega$$ $$E[X\mid\mathcal{F}_2](\omega)=X(\omega)\qquad\text{for every}\ \omega$$ $$\begin{aligned} E[X\mid\{HH,HT\}] &= 2P(HH\mid\{HH,HT\})+1P(HT\mid\{HH,HT\})\\ &\quad+1P(TH\mid\{HH,HT\})+0P(TT\mid\{HH,HT\})=\tfrac{3}{2}\\ E[X\mid\{TH,TT\}] &= 2P(HH\mid\{TH,TT\})+1P(HT\mid\{TH,TT\})\\ &\quad+1P(TH\mid\{TH,TT\})+0P(TT\mid\{TH,TT\})=\tfrac{1}{2} \end{aligned}$$

$$E[X|\mathcal{F_1}](\omega)=\left\{ \begin{array}{l} \frac{3}{2}, \omega\in \{HH,HT\}\\ \frac{1}{2}, \omega \in \{TH,TT\} \end{array} \right. $$
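
If it helps, the same numbers can be reproduced with a few lines of toy code (a sketch assuming the fair-coin setup above; the name `cond_exp` is made up): the conditional expectation given $\mathcal{F}_t$ simply averages $X$ over the atom of the time-$t$ partition containing $\omega$.

```python
# Toy sketch: E[X | F_t] computed atom by atom, for the fair-coin example above.
OMEGA = ["HH", "HT", "TH", "TT"]
P = {w: 0.25 for w in OMEGA}                  # fair coin, independent tosses
X = {"HH": 2, "HT": 1, "TH": 1, "TT": 0}

partitions = {
    0: [set(OMEGA)],
    1: [{"HH", "HT"}, {"TH", "TT"}],
    2: [{w} for w in OMEGA],
}

def cond_exp(X, t):
    """E[X | F_t] as a function of omega: on each atom of the time-t partition,
    replace X by its probability-weighted average over that atom."""
    out = {}
    for atom in partitions[t]:
        mass = sum(P[w] for w in atom)
        avg = sum(X[w] * P[w] for w in atom) / mass
        for w in atom:
            out[w] = avg
    return out

print(cond_exp(X, 0))  # 1.0 everywhere
print(cond_exp(X, 1))  # 1.5 on {HH, HT}, 0.5 on {TH, TT}
print(cond_exp(X, 2))  # equal to X itself
```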

I hope this helps.

Jun Deng
  • Unfortunately, $\mathcal F_2$ is not a sigma-field. – Did Aug 14 '15 at 17:23
  • $\Omega=\{HH,HT,TH,HH\}$? The last $HH$ should probably be $TT$. – Cm7F7Bb Nov 05 '15 at 12:23
  • Is it correct? It seems that $\mathcal{F}_2$ is not a sigma-algebra, since $\{HH\}\cup\{TT\} = \{HH, TT\}$ is not in the sigma-algebra, and $\{TT\}^{c}$ is not in the sigma-algebra either. – user0347284 Jul 28 '17 at 15:47
  • How is knowing what could happen the same as knowing results and vice versa? This is the crux of what I don't understand about this interpretation. – lpnorm Sep 06 '21 at 10:57

As pointed out in the comments on the previous answer, the collection $\mathcal{F}_2$ given there is not a $\sigma$-algebra, because it is not closed under unions and complements.

The right argument is the following:

At time 0, A and B do not know anything about the result except that one of the outcomes in $\Omega:=\{HH,HT,TH,TT\}$ will happen. Hence the information they can both talk about at time 0 is the $\sigma$-algebra generated by the single set $\Omega$, call it $\mathcal{F}_0$.

At time 1, the coin has been tossed only once, and what they know are the events in the collection $\{\{HH,HT\},\{TH,TT\}\}$. Hence the information they can both talk about at time 1 is the $\sigma$-algebra generated by this collection of sets, call it $\mathcal{F}_1$.

At time 2, the coin has been tossed twice, and what they know are the events in the collection $\{\{HH\},\{HT\},\{TH\},\{TT\}\}$, which means they know everything about the gambling results. Thus the information they can both talk about at time 2 is the $\sigma$-algebra generated by this collection, call it $\mathcal{F}_2$.

Since at each time $t>0$ the generating collection is obtained by partitioning the sets of the previous collection, one clearly has $\mathcal{F}_0\subset\mathcal{F}_1\subset\mathcal{F}_2$.

Notice that at each time $t$ the generating collection is the finest $\mathcal{F}_t$-measurable partition of $\Omega$.

Explicitly, the $\sigma$-algebras generated by these collections are:

$\mathcal{F}_0=\{\emptyset,\Omega\}$

$\mathcal{F}_1=\sigma(\{HH,HT\},\{TH,TT\})=\{\emptyset, \Omega, \{HH,HT\},\{TH,TT\}\}\supset \mathcal{F}_0$

$\mathcal{F}_2=\sigma(\{HH\},\{HT\},\{TH\},\{TT\})=\{\emptyset, \Omega, \{HH\},\{HT\},\{TH\},\{TT\},\{HH,HT\},\{HH,TH\},\{HH,TT\},\{HT,TH\},\{TH,TT\},\{HT,TT\},\{HH,TH,HT\},\{HH,HT,TT\},\{HH,TH,TT\},\{HT,TH,TT\}\}\supset \mathcal{F}_1\supset \mathcal{F}_0$
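
For a finite $\Omega$, the $\sigma$-algebra generated by a partition is just the collection of all unions of its atoms, so the lists above can be checked mechanically. Here is a small verification sketch (toy Python; the helper `sigma` is my own name, not a library function):

```python
# Toy sketch: for finite Omega, the sigma-algebra generated by a partition is
# just the collection of all unions of its atoms (including the empty union).
from itertools import combinations

OMEGA = frozenset({"HH", "HT", "TH", "TT"})

def sigma(partition):
    """All unions of atoms of `partition`, returned as a set of frozensets."""
    atoms = list(partition)
    events = set()
    for r in range(len(atoms) + 1):
        for combo in combinations(atoms, r):
            events.add(frozenset().union(*combo))
    return events

F0 = sigma([OMEGA])
F1 = sigma([frozenset({"HH", "HT"}), frozenset({"TH", "TT"})])
F2 = sigma([frozenset({w}) for w in OMEGA])

print(len(F0), len(F1), len(F2))  # 2 4 16, matching the lists above
print(F0 <= F1 <= F2)             # True: the filtration is increasing
```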