While studying probability, the following question arose:
Let $H$ be an event and let $\mathcal{H}=\lbrace H_\lambda|\lambda\in\Lambda\rbrace$ be a family of events in a probability space $(\Omega,\mathcal{F},P)$ such that for every $\lambda\in\Lambda$ we have $P(H_\lambda\cap H) = P(H_\lambda)P(H)$, i.e. the events $H_\lambda$ and $H$ are independent. Let $\mathcal{G}=\sigma(\mathcal{H})$ be the $\sigma$-algebra generated by $\mathcal{H}$, and let $G\in\mathcal{G}$ be any event. Are $G$ and $H$ necessarily independent, i.e. does it follow that $P(G\cap H) = P(G)P(H)$?
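To make the hypotheses concrete, here is a tiny finite sketch (the names `H1`, `H2`, `H` are illustrative, not part of the question): two fair coin flips, with each member of the family independent of $H$; whether independence then extends to all of $\sigma(\mathcal{H})$ is exactly what is being asked.

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips, uniform probability on the four outcomes.
omega = list(product([0, 1], repeat=2))
P = lambda event: Fraction(len(event), len(omega))  # uniform measure

H1 = {w for w in omega if w[0] == 1}     # first flip is heads
H2 = {w for w in omega if w[1] == 1}     # second flip is heads
H  = {w for w in omega if w[0] == w[1]}  # the two flips agree

# The hypothesis of the question: each H_lambda is independent of H.
for Hl in (H1, H2):
    assert P(Hl & H) == P(Hl) * P(Hl ^ Hl | H)  # placeholder? no:
```

One can then experiment by testing various events of $\sigma(\lbrace H_1,H_2\rbrace)$ against $H$ in the same way.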
This would be quite a useful lemma, I think. I had an idea for a proof, but the last step didn't quite work as expected. The argument went like this:
We shall write $\mathcal{G}$ as the union of an increasing sequence of simpler families. Let $\mathcal{B}_0 = \mathcal{H}$. For every successor ordinal $\alpha+1$ define $\mathcal{A}_{\alpha+1} = \lbrace\bigcup\mathcal{J}|\mathcal{J}\subseteq\mathcal{B}_\alpha,\mathrm{card}(\mathcal{J})\leq\aleph_0\rbrace$, the family of all countable unions of sets from the previous stage, and $\mathcal{B}_{\alpha+1}=\lbrace A|A\in\mathcal{A}_{\alpha+1} \lor \Omega - A\in\mathcal{A}_{\alpha+1}\rbrace$, the same family with complements added. For limit ordinals $\beta$ we define $\mathcal{B}_\beta=\bigcup_{\alpha<\beta}\mathcal{B}_\alpha$. Finally define $\mathcal{B} = \bigcup_{\alpha<\omega_1}\mathcal{B}_\alpha$. I guess such a union should make sense, since at each step we stay inside $\mathcal{G}$ ...
Next we prove that $\mathcal{B}$ is a $\sigma$-algebra. Since every stage $\mathcal{B}_\alpha$ of the construction is contained in $\mathcal{G}$, and $\mathcal{B}$ is a $\sigma$-algebra containing $\mathcal{H}$, minimality of $\mathcal{G}$ then forces $\mathcal{B} = \mathcal{G}$.
The only tricky part in proving that $\mathcal{B}$ is a $\sigma$-algebra is closure under countable unions. Let $(A_n)_n$ be a sequence of events in $\mathcal{B}$. For each $n\in\mathbb{N}$ there is an ordinal $\alpha_n < \omega_1$ such that $A_n\in\mathcal{B}_{\alpha_n}$. Then there must be some ordinal $\gamma < \omega_1$ such that $\alpha_n \leq \gamma$ for all $n$: otherwise $\omega_1$ would be the union of countably many countable ordinals, hence itself a countable union of countable sets, which it cannot be (assuming the axiom of choice). So all the $A_n$ are elements of $\mathcal{B}_\gamma$, which implies that their countable union lies in $\mathcal{A}_{\gamma+1}\subseteq\mathcal{B}_{\gamma+1}$ and therefore in $\mathcal{B}$.
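The choice of $\gamma$ can be made explicit: since ordinals are the sets of their predecessors, one may take

```latex
\gamma \;=\; \sup_{n\in\mathbb{N}} \alpha_n \;=\; \bigcup_{n\in\mathbb{N}} \alpha_n \;<\; \omega_1 ,
```

which is a countable union of countable sets and hence countable (this is the step that uses the axiom of choice, or at least countable choice).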
I was hoping the rest would follow by transfinite induction: if $A\in\mathcal{B}_{\alpha+1}$ then either $A\in\mathcal{A}_{\alpha+1}$ or $\Omega-A\in\mathcal{A}_{\alpha+1}$, and the second case would follow from the first by taking complements. But the first case is problematic: $P(A\cap H) = P((\bigcup_{E\in\mathcal{J}}E)\cap H) = P((\bigcup_{\tilde{E}\in\mathcal{J}_0}\tilde{E})\cap H) = \sum_{\tilde{E}\in\mathcal{J}_0}P(\tilde{E}\cap H)$. Here $\mathcal{J}\subseteq\mathcal{B}_\alpha$ exists by the definition of $\mathcal{A}_{\alpha+1}$, and $\mathcal{J}_0$ is a countable family of mutually exclusive events with the same union. The problem is that such a family $\mathcal{J}_0$ can in this case only be proven to lie inside $\mathcal{B}_{\alpha+1}$, not $\mathcal{B}_\alpha$, so the induction hypothesis does not give $P(\tilde{E}\cap H) = P(\tilde{E})P(H)$.
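For reference, the standard disjointification producing $\mathcal{J}_0$ from an enumeration $\mathcal{J}=\lbrace E_n\rbrace_n$ is

```latex
\tilde{E}_n \;=\; E_n \setminus \bigcup_{k<n} E_k
\;=\; E_n \cap \bigcap_{k<n} (\Omega \setminus E_k),
\qquad
\bigcup_n \tilde{E}_n \;=\; \bigcup_n E_n ,
```

and it is precisely the complements and intersections on the right that push $\tilde{E}_n$ up a level in the hierarchy instead of keeping it in $\mathcal{B}_\alpha$, which is where the induction breaks.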
So the proof sadly fails at this last step.
Is this proof salvageable? (Perhaps by taking relative complements instead in the definition of $\mathcal{B}_{\alpha+1}$, or something like that?) Does such a lemma even hold, or do we have to modify it? Are such proofs by transfinite induction useful in probability?
It seems to me probabilists implicitly use lemmas like this all the time, so I am also wondering if such a lemma or a similar one would in fact be useful.
[Comment: The definition of $\mathcal{B}$ above originally used $\mathbf{On}$ which was a slight overkill, so I changed it to $\omega_1$, following the kind suggestion of Asaf Karagila.]
Added: In a comment below Dilip Sarwate suggests the following variation on the problem:
Let $H$ be an event and let $\mathcal{H}=\lbrace H_\lambda|\lambda\in\Lambda\rbrace$ be a family of events in a probability space $(\Omega,\mathcal{F},P)$ such that the family $\mathcal{H}\cup\lbrace H\rbrace$ is independent, i.e. for every finite $\mathcal{S}\subseteq\mathcal{H}\cup\lbrace H\rbrace$ we have $P(\bigcap_{E\in\mathcal{S}}E) = \prod_{E\in\mathcal{S}}P(E)$, where $\prod$ denotes the product of the probabilities, as usual. Let $\mathcal{G}=\sigma(\mathcal{H})$ be the $\sigma$-algebra generated by $\mathcal{H}$, and let $G\in\mathcal{G}$ be any event. Are $G$ and $H$ necessarily independent, i.e. does it follow that $P(G\cap H) = P(G)P(H)$?
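The stronger hypothesis can be checked mechanically on a finite example (again a sketch with illustrative names, not part of the question): three fair coin flips, where $\lbrace H_1, H_2, H\rbrace$ satisfies the product rule for every finite subfamily, not just pairwise.

```python
from itertools import product, combinations
from fractions import Fraction
from math import prod

# Three fair coin flips, uniform probability on the eight outcomes.
omega = list(product([0, 1], repeat=3))
P = lambda event: Fraction(len(event), len(omega))  # uniform measure

H1 = {w for w in omega if w[0] == 1}  # first flip is heads
H2 = {w for w in omega if w[1] == 1}  # second flip is heads
H  = {w for w in omega if w[2] == 1}  # third flip is heads

# Mutual independence: product rule for EVERY finite subfamily.
family = [H1, H2, H]
for r in range(1, len(family) + 1):
    for S in combinations(family, r):
        inter = set(omega)
        for E in S:
            inter &= E
        assert P(inter) == prod((P(E) for E in S), start=Fraction(1))
```

This is exactly the hypothesis of the variation, in contrast to the merely pairwise condition of the original question.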
This case actually interests me even more than the "original question" above, since it is this case that I actually needed. (I thought somehow that I could get more out of it by relaxing the conditions to those of the question above. Silly me.)