Let $\Omega = \{ \omega = (\omega_1, \omega_2, \ldots) : \omega_j = 1 \text{ or } 0 \}$. For each positive integer $n$, let $\Omega_n = \{ \omega = (\omega_1, \ldots, \omega_n) : \omega_j = 1 \text{ or } 0 \}$. We can consider $\Omega_n$ as a probability space with $\sigma$-algebra $2^{\Omega_n}$ and probability induced by $\mathbb{P}_n(\omega) = 2^{-n}$. We define $F_n$ to be the collection of all subsets $A$ of $\Omega$ such that there is an $E \in 2^{\Omega_n}$ with \begin{equation} A = \{(\omega_1, \omega_2, \ldots) : (\omega_1, \ldots, \omega_n) \in E\}. \tag{1} \end{equation} $F_n$ is a finite $\sigma$-algebra (containing $2^{2^n}$ subsets) and $F_1 \subset F_2 \subset F_3 \subset \cdots$ (i.e., an ascending sequence of $\sigma$-algebras). If $A$ is of the form $(1)$, we let $\mathbb{P}(A) = \mathbb{P}_n(E)$. This gives a function $\mathbb{P}$ on \begin{equation} F^{0} = \bigcup_{j=1}^{\infty} F_j. \end{equation}

This is followed by the proposition that $F^{0}$ is an algebra but not a $\sigma$-algebra. All of the above and most of the proof makes sense to me. But the proof of the proposition I just mentioned states: $\Omega \in F^{0}$ since $\Omega \in F_1$. I get a bit confused because $F_1$ should just consist of the first flip being tails and the first flip being heads. Does this mean that $F_1$ consists of sequences such as $(1, \omega_2, \ldots)$ and $(0, \omega_2, \ldots)$, where the $\omega_j$ for $j > 1$ are left unspecified? But then doesn't $F_1$ trivially contain all possible sequences? Maybe I'm a bit rusty on measure theory, or I'm missing something, but I'm confused.
3 Answers
Note that $F_n$ is not a subset of $\Omega$; it is a set of subsets of $\Omega$. So it doesn't make sense to ask whether $F_1$ contains all possible infinite sequences.
By definition, $F_1$ consists of four subsets of $\Omega$ (corresponding to the four subsets of $\Omega_1$). These four subsets are:
- $\Omega$ (corresponding to $E=\Omega_1$)
- $\emptyset$ (corresponding to $E=\emptyset$)
- all sequences in $\Omega$ that start with $0$ (corresponding to $E=\{(0)\}$)
- all sequences in $\Omega$ that start with $1$ (corresponding to $E=\{(1)\}$)
In general, $\Omega_n$ has size $2^n$ hence has $2^{2^n}$ subsets. These $2^{2^n}$ subsets of $\Omega_n$ correspond to $2^{2^n}$ subsets of $\Omega$, which together make up the collection $F_n$. The correspondence is: start with a subset $E$ of $\Omega_n$ (so this is some collection of sequences of $0$'s and $1$'s of length $n$). Then define the subset $A_E$ of $\Omega$ to consist of all infinite sequences whose first $n$ entries are a sequence in $E$. Now $F_n$ is the collection $\{A_E:E\subseteq \Omega_n\}$.
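Here is a minimal sketch (my own code, not from the text) that carries out this correspondence for a small $n$, representing each set $A_E$ by the finite prefix set $E \subseteq \Omega_n$ that defines it:

```python
# For small n, list F_n by listing the subsets E of Omega_n; each E stands
# for the set A_E of all infinite sequences whose first n entries lie in E.
from itertools import product, combinations

def omega_n(n):
    """All 0/1 sequences of length n, i.e. Omega_n."""
    return [tuple(bits) for bits in product((0, 1), repeat=n)]

def powerset(xs):
    """All subsets of the list xs, as frozensets."""
    return [frozenset(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

n = 2
F_n = powerset(omega_n(n))             # each E here represents A_E

print(len(F_n))                        # 16 == 2**(2**n), as claimed
print(frozenset(omega_n(n)) in F_n)    # True: E = Omega_n gives A_E = Omega
```

In particular, taking $E=\Omega_n$ yields $A_E=\Omega$, which is exactly the point the question was stuck on.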

- So because $F_n$ is the collection of all subsets $A$ of $\Omega$ such that there is an $E \in 2^{\Omega_n}$ with $(\omega_1, \omega_2, \ldots, \omega_n) \in E$, then $\Omega$ is in $A$ because all sequences in $\Omega$ have an $E$ corresponding to all of the first $n$ possibilities? – layabout Jul 25 '20 at 16:06
- I think you mean to say "then $\Omega$ is in $F_n$". But yes, if you let $E$ be all possibilities (i.e., $E=\Omega_n$) and then form the corresponding set $A$, you get precisely $\Omega$. – halrankard Jul 25 '20 at 16:12
- Yep, that's what I meant. Thanks a lot. – layabout Jul 25 '20 at 17:03
Say that instead of $n=\infty$ you had $n=2$, so that outcomes are pairs $(\omega_1,\omega_2)$. I believe that $$F_0=\{\emptyset,\{(0,0),(0,1),(1,0),(1,1)\}\}$$ and $$F_1=\{\emptyset,\{(0,0),(0,1)\},\{(1,0),(1,1)\},\{(0,0),(0,1),(1,0),(1,1)\}\}.$$ This means that, after the first flip, you know whether heads or tails turned up in that first flip, something you did not know before the flip. Compare to $F_2$, which consists of all $2^{2^2}=16$ subsets of $\Omega_2$, including $$\emptyset,\;\{(0,0)\},\;\{(0,1)\},\;\{(1,0)\},\;\{(1,1)\},\;\{(0,0),(0,1)\},\;\{(1,0),(1,1)\},\;\{(0,0),(0,1),(1,0),(1,1)\},$$ where you can know the full history of flips. For $n=\infty$, the number of sets in each $F_i$ is the same.
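As a complement, here is a small sketch (my own code, not part of the answer) that builds these collections for the length-$2$ model and confirms that they are nested, i.e. that $F_0\subset F_1\subset F_2$:

```python
# Finite model with outcomes (w1, w2): F_k is the collection of events
# determined by the first k coordinates of the outcome.
from itertools import product, combinations

outcomes = list(product((0, 1), repeat=2))   # (0,0), (0,1), (1,0), (1,1)

def F(k):
    """F_k: unions of 'cylinders' fixing the first k coordinates."""
    prefixes = list(product((0, 1), repeat=k))
    events = set()
    for r in range(len(prefixes) + 1):
        for E in combinations(prefixes, r):
            events.add(frozenset(w for w in outcomes if w[:k] in E))
    return events

F0, F1, F2 = F(0), F(1), F(2)
print(len(F0), len(F1), len(F2))   # 2 4 16
print(F0 < F1 < F2)                # True: an ascending chain of sigma-algebras
```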

This is to complement the answers others have posted to your question.
There are many probability spaces on which one can define precisely the random variables that model the tossing of a coin. Certainly, the product space $\{0,1\}^\mathbb{N}$ with the product $\sigma$-algebra is one.
Here is another one which can also be considered canonical:
Consider the unit interval in the real line with the Borel $\sigma$-algebra and the Lebesgue measure $\lambda$ on it, that is, $([0,1],\mathscr{B}[0,1],\lambda)$. Notice that in this space, the identity function $\theta(x)=x$ is a $U[0,1]$ (uniformly distributed) random variable.
Recall that every $x\in[0,1]$ has a unique binary expansion $$x=\sum_{n\geq1}r_n/2^n$$ where $r_n\in\{0,1\}$, provided we adopt the convention that $\sum_{n\geq1}r_n=\infty$ for $x>0$ (i.e., at dyadic rationals we always choose the expansion with infinitely many $1$'s, which makes the expansion unique). For each $n\in\mathbb{N}$, the $n$-th bit map $x\mapsto r_n(x)$ defines a measurable function from $([0,1],\mathscr{B}([0,1]))$ to $(\{0,1\},2^{\{0,1\}})$, where $2^{\{0,1\}}$ is the collection of all subsets of $\{0,1\}$.
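As a quick illustration (my own sketch; note that at dyadic rationals the formula below picks the terminating expansion rather than the tail-of-$1$'s convention above), the $n$-th bit map can be computed as $r_n(x)=\lfloor 2^n x\rfloor \bmod 2$:

```python
# n-th binary digit of x in [0,1): r_n(x) = floor(2^n x) mod 2.
from math import floor

def r(n, x):
    return floor(2**n * x) % 2

x = 0.3   # binary expansion 0.0100110011...
print([r(n, x) for n in range(1, 9)])             # [0, 1, 0, 0, 1, 1, 0, 0]
print(sum(r(n, x) / 2**n for n in range(1, 60)))  # ~0.3, recovering x
```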
We will see that
- The map $\beta:[0,1]\rightarrow\{0,1\}^{\mathbb{N}}$ given by $x\mapsto(r_n(x))$ is measurable, that is, it is a random variable taking values in $\{0,1\}^{\mathbb{N}}$.
- $\{r_n:n\in\mathbb{N}\}$ is an i.i.d. sequence of Bernoulli random variables.
Lemma 1: Suppose $\theta$ is a uniformly distributed random variable on $[0,1]$, defined on some probability space $(\Omega,\mathscr{F},\mathbb{P})$. Define $X_n=r_n\circ\theta$. Then $\{X_n\}$ is an i.i.d. Bernoulli sequence with rate $p=\tfrac12$. Conversely, if $(Y_n)$ is an i.i.d. Bernoulli sequence with rate $p=\tfrac12$, then $\theta=\sum_{n\geq1}2^{-n}Y_n\sim U[0,1]$.
Here is a short proof:
Suppose that $\theta\sim U(0,1)$. For any $N\in\mathbb{N}$ and $k_1,\ldots,k_N\in\{0,1\}$, $$\begin{align} \bigcap^N_{j=1}\{x\in(0,1]:r_j(x)=k_j\}&=\Big(\sum^N_{j=1}\tfrac{k_j}{2^j}, \sum^N_{j=1}\tfrac{k_j}{2^j}+\tfrac{1}{2^N}\Big]\tag{1}\label{one}\\ \{x\in(0,1]: r_N(x)=0\}&=\bigcup^{2^{N-1}-1}_{j=0}\big(\tfrac{2j}{2^N},\tfrac{2j+1}{2^N}\big]\tag{2}\label{two}\\ \{x\in(0,1]:r_N(x)=1\}&=\bigcup^{2^{N-1}-1}_{j=0} \big(\tfrac{2j+1}{2^N},\tfrac{2(j+1)}{2^N}\big]\tag{3}\label{three} \end{align}$$ It follows immediately that $x\mapsto (r_n(x):n\in\mathbb{N})$ is measurable, and that $\mathbb{P}\big[\bigcap^N_{j=1}\{X_j=k_j\}\big]=\tfrac{1}{2^N}=\prod^N_{j=1}\mathbb{P}[X_j=k_j]$. Hence $\{X_n\}$ is an i.i.d. Bernoulli($\tfrac12$) sequence.
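Identity $\eqref{one}$ can also be checked numerically. Here is a small sanity check (my own sketch) that the $x$'s whose first $N$ bits are $k_1,\ldots,k_N$ fill out the dyadic interval $\big(\sum_j k_j2^{-j},\ \sum_j k_j2^{-j}+2^{-N}\big]$:

```python
# Check identity (1) for N = 3: points with bits (1, 0, 1) fill (0.625, 0.75].
from math import floor

def bits(x, N):
    """First N binary digits of x via floor(2^j x) mod 2."""
    return tuple(floor(2**j * x) % 2 for j in range(1, N + 1))

k = (1, 0, 1)                                     # target bits
a = sum(kj / 2**j for j, kj in enumerate(k, 1))   # left endpoint, 0.625
xs = [(i + 0.5) / 1000 for i in range(1000)]      # grid avoiding dyadic points
match = [x for x in xs if bits(x, len(k)) == k]
print(a, min(match), max(match))                  # 0.625 0.6255 0.7495
```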
Conversely, suppose $\{Y_n:n\geq1\}$ is a Bernoulli sequence with rate $\tfrac12$. Let $\widetilde{\theta}$ be a $U(0,1)$-distributed random variable defined on some probability space $(\Omega,\mathscr{F},\mathbb{P})$ (for example, $\widetilde{\theta}(t)=t$ on $([0,1],\mathscr{B}([0,1]),\lambda)$). The first part shows that the sequence of bits $\widetilde{Y}_n:=r_n\circ\widetilde{\theta}$ satisfies $\{\widetilde{Y}_n\}\stackrel{law}{=}\{Y_n\}$. Therefore, $$ \theta:=\sum_{n\geq1}2^{-n}Y_n\stackrel{law}{=} \sum_{n\geq1}2^{-n}\widetilde{Y}_n=\widetilde{\theta}, $$ since $\theta$ and $\widetilde{\theta}$ are the same measurable function applied to $\{Y_n\}$ and $\{\widetilde{Y}_n\}$, respectively.
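For what it's worth, both directions of the lemma are easy to see in a quick Monte Carlo experiment (my own sketch, under the lemma's assumptions):

```python
# Direction 1: the bits of a U(0,1) sample behave like fair coin flips.
# Direction 2: theta' = sum 2^{-n} Y_n built from fair bits looks U(0,1).
import random
from math import floor

N = 100_000
samples = [random.random() for _ in range(N)]
for n in (1, 2, 8):
    freq = sum(floor(2**n * x) % 2 for x in samples) / N
    print(f"P(X_{n} = 1) ~ {freq:.3f}")           # each ~ 0.5

thetas = [sum(random.getrandbits(1) / 2**n for n in range(1, 53))
          for _ in range(N)]
print(f"P(theta' <= 1/4) ~ {sum(t <= 0.25 for t in thetas) / N:.3f}")  # ~ 0.25
```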
