After thinking about it for a few years, I still don't quite understand the precise link between sigma algebras and information. (Yes, I know there are already dozens of questions on MSE from other people who had exactly the same problem. It still remains unclear to me.)
The first problem I have is the following. Let $\mathcal G$ be a sigma algebra on some set and $X$ be a real-valued function defined on the same set. Then I read the following sentence: "Note that if $X$ is $\mathcal G$-measurable, then the information in $\mathcal G$ determines the value of $X$." What does this statement mean exactly?
Let's consider the following example. Suppose we do an experiment in which we flip a coin three times. The set of outcomes is $ \Omega = \{ HHH,HHT, HTH, HTT, THH, THT, TTH, TTT \}.$ Suppose only partial information is revealed; say only the information from the first toss. This partial information is given by the sigma algebra $\mathcal{F}_1 = \{ A_H, A_T , \emptyset , \Omega \}$ where $$ A_H = \{ HHH,HHT, HTH, HTT \} \quad \text{and} \quad A_T = \{ THH, THT, TTH, TTT \}.$$ In which exact sense does this sigma algebra "represent" this information?
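To make this concrete for myself, here is a small Python sketch (the helper name `sigma_from_partition` is just my own) that builds $\mathcal{F}_1$ by closing the blocks $A_H$ and $A_T$ under unions:

```python
from itertools import product, combinations

# Sample space: all eight outcomes of three coin tosses, written as strings like 'HTH'.
omega = [''.join(w) for w in product('HT', repeat=3)]

# Partition of omega according to the first toss only.
A_H = frozenset(w for w in omega if w[0] == 'H')
A_T = frozenset(w for w in omega if w[0] == 'T')

def sigma_from_partition(blocks):
    """Sigma algebra generated by a finite partition: all unions of blocks."""
    return {frozenset().union(*combo)
            for r in range(len(blocks) + 1)
            for combo in combinations(blocks, r)}

F_1 = sigma_from_partition([A_H, A_T])
print(len(F_1))                 # 4: emptyset, A_H, A_T, Omega
print(frozenset(omega) in F_1)  # True
```

This gives exactly the four sets listed above, but I still don't see in what sense this collection "is" the information from the first toss.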
Either way, we can of course continue and consider the (still partial) information after the first two coin tosses. This sigma algebra is $\mathcal{F}_2 = \{ A_H, A_T, A_{HH}, A_{HT}, A_{TH}, A_{TT} , \ldots , \emptyset , \Omega \}$, where $A_{HH}$, $A_{HT}$, etc. generalize the earlier notation in the obvious way. (I wrote dots for brevity; of course the collection also has to be closed under complements and countable unions to actually be a sigma algebra.) Same question: in which sense does this represent the information?
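Again, a sketch with my own naming: $\mathcal{F}_2$ can be generated from the four atoms $A_{HH}, A_{HT}, A_{TH}, A_{TT}$, which gives $2^4 = 16$ events in total, with $A_H$ and $A_T$ indeed among them.

```python
from itertools import product, combinations

omega = [''.join(w) for w in product('HT', repeat=3)]

# Atoms of F_2: outcomes grouped by the result of the first two tosses.
atoms = [frozenset(w for w in omega if w[:2] == pre)
         for pre in ('HH', 'HT', 'TH', 'TT')]

# Close the atoms under unions (complements come for free, since the atoms
# partition omega): this gives all of F_2.
F_2 = {frozenset().union(*combo)
       for r in range(len(atoms) + 1)
       for combo in combinations(atoms, r)}

A_H = frozenset(w for w in omega if w[0] == 'H')
print(len(F_2))     # 16 = 2**4 events in total
print(A_H in F_2)   # True: A_H = A_HH ∪ A_HT is one of these unions
```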
More generally, I do not understand the whole analogy in which some $\omega \in \Omega$ is drawn and the "information" consists of being able to ask, for each event $A$ in the sigma algebra, whether or not $\omega \in A$.
Now let's continue with the above example. Let $S_0, S_1, S_2, S_3 \colon \Omega \to \mathbb{R}$ be defined by $S_0 = 4$ and, for $i \in \{0,1,2\}$, $$ S_{i+1} = \begin{cases} 2 S_i & \text{if the $(i+1)$-th toss is } H,\\ S_i / 2 & \text{if the $(i+1)$-th toss is } T. \end{cases} $$ I want to find the sigma algebra $\sigma(S_2)$ generated by $S_2$ (note that it can attain only three values: $16$, $4$, or $1$). Clearly, $\sigma(S_2)$ is the sigma algebra generated by the events $(S_2 = 16)$, $(S_2 = 4)$, and $(S_2 = 1)$, which are equal to $A_{HH}$, $A_{HT} \cup A_{TH}$, and $A_{TT}$, respectively. Closing these under complements and unions, we find that $$ \sigma(S_2)= \{ A_{HH},\ A_{HT} \cup A_{TH},\ A_{TT},\ A_{HT} \cup A_{TH} \cup A_{TT},\ A_{HH}\cup A_{TT},\ A_{HH}\cup A_{HT}\cup A_{TH},\ \emptyset,\ \Omega \}. $$ Note that $\sigma(S_2) \subsetneq \mathcal{F}_2$, so $\mathcal{F}_2$ is a strictly finer sigma algebra.
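As a sanity check of that list, here is the same computation in Python (the function names `S` and `unions` are my own shorthand): the atoms of $\sigma(S_2)$ are the three level sets of $S_2$, closing them under unions gives eight events, and all of them lie in $\mathcal{F}_2$.

```python
from itertools import product, combinations

omega = [''.join(w) for w in product('HT', repeat=3)]

def S(w, n, s0=4):
    """Price after the first n tosses of outcome w: double on H, halve on T."""
    x = s0
    for toss in w[:n]:
        x = 2 * x if toss == 'H' else x / 2
    return x

def unions(blocks):
    """All unions of the given disjoint blocks."""
    return {frozenset().union(*combo)
            for r in range(len(blocks) + 1)
            for combo in combinations(blocks, r)}

# Atoms of sigma(S_2): the level sets (S_2 = 16), (S_2 = 4), (S_2 = 1).
values = sorted({S(w, 2) for w in omega})
sigma_S2 = unions([frozenset(w for w in omega if S(w, 2) == v) for v in values])

# F_2, generated by the four atoms A_HH, A_HT, A_TH, A_TT.
F_2 = unions([frozenset(w for w in omega if w[:2] == p) for p in ('HH', 'HT', 'TH', 'TT')])

print(len(sigma_S2))   # 8, matching the list above
print(sigma_S2 < F_2)  # True: sigma(S_2) is strictly coarser than F_2
```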
The following statement (which I don't understand) is made: "If we know in which element of $\mathcal{F}_2$ our outcome lies, then we know the value of $S_2$." For example, pick the element $A_{HH}\cup A_{HT}\cup A_{TH}$ (which is indeed an element of $\mathcal{F}_2$). If we know that our outcome lies in this set, how does that tell us the value of $S_2$? It could still be either $16$ or $4$. It doesn't make any sense whatsoever! So my conclusion is that there are fundamental things here that I'm just not getting.
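Just to double-check that I'm not misreading my own example, here is a small sketch (names `S2` and `E` are mine) confirming that $S_2$ is not constant on that set:

```python
from itertools import product

omega = [''.join(w) for w in product('HT', repeat=3)]

def S2(w, s0=4):
    """Value of S_2 for outcome w: double on H, halve on T, over the first two tosses."""
    x = s0
    for toss in w[:2]:
        x = 2 * x if toss == 'H' else x / 2
    return x

# The element of F_2 picked above: A_HH ∪ A_HT ∪ A_TH (everything except A_TT).
E = frozenset(w for w in omega if w[:2] != 'TT')

# Possible values of S_2 given only "the outcome lies in E".
print({S2(w) for w in E})   # two values, 16 and 4, so E alone does not pin down S_2
```

So knowing only that the outcome lies in $A_{HH}\cup A_{HT}\cup A_{TH}$ really does leave two possible values for $S_2$.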
All this talk about "information" seems more like "magic" than "mathematics".