I don't think your proof is right because of the definitions of $A_n$ and $\mathscr F$. Are the flips independent?
Consider a probability space $(\Omega, \mathscr F, \mathbb P)$ where
- $\Omega = \{H,T\}^{\mathbb N}$
So we have $\omega = (\omega_1, \omega_2, ...)$ where $\omega_n \in \{H, T\} \ \forall n \in \mathbb N$
- $\mathscr{F} = \sigma\big(\{\omega \in \Omega : \omega_n = W\} : n \in \mathbb N, W \in \{H, T\}\big)$, the $\sigma$-algebra generated by the coordinate events (because $\mathscr{F} = 2^{\Omega}$ doesn't work here)
$P(\omega_n = H) = P(\omega_n = T) = 1/2$
where
$(\omega_n = H) := \{\omega \in \Omega : \omega_n = H\}$
$(\omega_n = T) := \{\omega \in \Omega : \omega_n = T\}$
In your case, $(\omega_n = H) = A_n$
So $P(A_n) = 1/2$ not $\frac{1}{2^n}$
Let $H_1, H_2, ...$ be events where $H_n$ = {nth flip is heads and 1st, ..., (n-1)th flips are tails}.
Thus, we have $$H_n = A_1^C \cap A_2^C \cap ... \cap A_{n-1}^C \cap A_n$$
$$= \Big(\bigcap_{k=1}^{n-1} A_k^C\Big) \cap A_n$$
Assuming independence of the flips i.e. independence of the $A_n$'s, $$P(H_n) = [\prod_{k=1}^{n-1} P(A_k^C)] \times P(A_n) = \frac{1}{2^n}$$
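As a sanity check on $P(H_n) = \frac{1}{2^n}$, here is a quick Monte Carlo sketch (the sampler, seed, and trial count are my own choices, not part of the problem):

```python
import random

def first_heads_index(max_flips, rng):
    """Flip a fair coin up to max_flips times; return the 1-based index of the first head."""
    for n in range(1, max_flips + 1):
        if rng.random() < 0.5:  # heads with probability 1/2
            return n
    return None  # no head at all in max_flips flips (probability 2**-max_flips)

rng = random.Random(0)
trials = 100_000
counts = {}
for _ in range(trials):
    n = first_heads_index(50, rng)
    if n is not None:
        counts[n] = counts.get(n, 0) + 1

# The empirical frequency of H_n should be close to 1/2**n
for n in range(1, 5):
    print(n, counts[n] / trials)
```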
Now for Q1, we want to show that at least one of the flips will be heads
Or:
$$P(\bigcup_{n=1}^{\infty} A_n) = 1$$
Or:
For almost every $\omega \in \Omega$,
$$\omega \in \bigcup_{n=1}^{\infty} A_n$$
Or:
$$P(\{\omega \in \Omega : \exists z \ge 1 \ \text{s.t.} \ \omega \in A_z\}) = 1$$
Or:
For almost every $\omega \in \Omega$,
$$\exists z \ge 1 \ \text{s.t.} \ \omega \in A_z$$
- One route: Now we can do what you attempted earlier because the $H_n$'s are pairwise disjoint (*): $$P(\bigcup_{n=1}^{\infty} H_n) = \sum_{n=1}^{\infty} P(H_n) = 1 \ \text{if the flips are independent}$$
Hence, for almost every $\omega \in \Omega$,
$$\omega \in \bigcup_{n=1}^{\infty} H_n$$
$\to \exists! \ q \in \mathbb N$ s.t. $\omega \in H_q$ (unique because the $H_n$'s are disjoint)
Observe that $H_q \subseteq A_q$.
Hence, $$\omega \in A_q \subseteq \bigcup_{n=1}^{\infty} A_n \ QED$$
Or prove that $$\bigcup_{n=1}^{\infty} H_n = \bigcup_{n=1}^{\infty} A_n$$
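The countable-additivity step can also be checked numerically: the partial sums $\sum_{n=1}^m 2^{-m}$ equal $1 - 2^{-m}$ and approach $1$ (a minimal sketch; the cutoffs are my own choices):

```python
# Partial sums of P(H_1) + ... + P(H_m) = sum_{n=1}^m 2**-n = 1 - 2**-m
ms = (5, 10, 20, 50)
partial = [sum(0.5 ** n for n in range(1, m + 1)) for m in ms]
for m, s in zip(ms, partial):
    print(m, s)  # approaches 1 as m grows
```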
(*) They are pairwise disjoint because:
$\forall m > n$,
Suppose $$\omega \in H_n \cap H_m$$
i.e. $\omega \in H_n$ and $\omega \in H_m$.
Then, $$\omega \in H_n \to \omega \in A_n$$
However, $$\omega \in H_m \to \omega \in A_n^c \ ↯ \ QED$$
- Another route:
$$P(\bigcup_{n=1}^{\infty} A_n) = 1 - P(\bigcap_{n=1}^{\infty} A_n^C)$$
$$= 1 - \prod_{n=1}^{\infty} P(A_n^C) \ \text{if the flips are independent}$$
$$= 1 - \prod_{n=1}^{\infty} (1/2)$$
$$= 1 - \lim_{m \to \infty} \prod_{n=1}^{m} (1/2)$$
$$= 1 - \lim_{m \to \infty} (1/2)^m = 1 - 0 = 1 \ QED$$
- Yet another route:
$$\because \sum_{n=1}^{\infty} P(A_n) = \infty,$$
by Borel-Cantelli Lemma 2, if the flips are independent, we have $P(\limsup A_n) = 1$
Observe that $$\limsup A_n \subseteq \bigcup_{n=1}^{\infty} A_n$$
Hence, by monotonicity of probability, $$P(\bigcup_{n=1}^{\infty} A_n) \ge P(\limsup A_n) = 1, \ \text{so} \ P(\bigcup_{n=1}^{\infty} A_n) = 1 \ QED$$
Or:
Hence, for almost every $\omega \in \Omega$: $\forall m \ge 1, \exists n \ge m$ s.t.
$$\omega \in A_n \subseteq \bigcup_{n=1}^{\infty} A_n$$
For Q2, let $B_{n,r}$ be a block of length $r$ starting at flip $n$:
$$B_{n,r} = \bigcap_{i=n}^{n+r-1} A_i^*$$
where each $A_i^* = A_i$ or $A_i^C$ (a fixed pattern of heads and tails)
Overlapping blocks share flips, so the $B_{n,r}$'s are not independent; instead, apply BCL2 to the non-overlapping (hence independent) blocks $B_{1+(k-1)r,\ r}$, $k = 1, 2, \dots$
$$\because \sum_{k=1}^{\infty} P(B_{1+(k-1)r,\ r}) = \sum_{k=1}^{\infty} \frac{1}{2^r} = \infty,$$
using BCL2 again gives us
$$P(\limsup_k B_{1+(k-1)r,\ r}) = 1$$
This means that for almost every $\omega \in \Omega$: $\forall m \ge 1, \exists n \ge m$ s.t. $\omega \in B_{n,r} \ QED$
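An empirical check of Q2, with an arbitrary concrete block (I pick the pattern HTH, so $r = 3$; any pattern behaves the same way): in a long simulated run, the block should occur roughly once every $2^r$ positions:

```python
import random

rng = random.Random(1)
r = 3
pattern = "HTH"  # an arbitrary fixed block of length r
flips = "".join(rng.choice("HT") for _ in range(100_000))

# Count (possibly overlapping) occurrences of the block in the simulated run
count = sum(1 for i in range(len(flips) - r + 1) if flips[i:i + r] == pattern)
print(count)  # roughly 100_000 / 2**r = 12_500
```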
To see why the statement doesn't hold for a block of infinite length, define $$B_{n, \infty} := \lim_{r \to \infty} B_{n, r} = \bigcap_{i=n}^{\infty} A_i^*$$
Since the $B_{n,r}$'s decrease in $r$, continuity of probability (from above) gives $P(B_{n, \infty}) = \lim_{r \to \infty} P(B_{n, r}) = \lim_{r \to \infty} \frac{1}{2^r} = 0$
$$\because \sum_{n=1}^{\infty} P(B_{n,\infty}) = \sum_{n=1}^{\infty} 0 < \infty,$$
BCL1 gives us
$$P(\limsup B_{n,\infty}) = 0$$
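Relatedly, a finite simulation hints at why an infinite block never appears: the longest run of heads grows only like $\log_2$ of the number of flips (a sketch; seed and length are my own choices):

```python
import random

rng = random.Random(2)
n_flips = 100_000

# Track the longest run of consecutive heads in the simulated sequence
best = cur = 0
for _ in range(n_flips):
    if rng.random() < 0.5:  # heads
        cur += 1
        best = max(best, cur)
    else:
        cur = 0
print(best)  # typically near log2(n_flips) ~ 17, nowhere close to infinite
```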
For Q3, Murphy's Law is 'Anything that can go wrong, will go wrong', which is technically false:
Flipping 10 coins is a 'thing'. If we define 'go wrong' as 'at least one head', then we may still get 10 tails.
Mathematically,
$P(\bigcup_{n=1}^{10} E_n) \ne 1$ even if $P(E_n) > 0$
or
$P(\bigcup_{n=1}^{\infty} E_n) \ne 1$ where $E_k = \emptyset$ for $k \ge 11$ even if $P(E_n) > 0$ for $n = 1, 2, ..., 10$
Even with infinitely many flips, if $P(E_k) = 0$ for $k \ge 11$, we still may not have $P(\bigcup_{n=1}^{\infty} E_n) = 1$
Let us add the condition that the $E_n$'s have positive probability infinitely often (i.e. it is not the case that all but finitely many of them have zero probability) to get Murphy's Law #2.
To put Murphy's Law #2 mathematically,
$$P(E_n) > 0 \ \text{i.o.} \to P(\bigcup_n E_n) = 1$$
Is Murphy's Law #2 true? If not, what are some sufficient conditions for Murphy's Law #2?
Case 0: $\exists z \in \mathbb N \ \text{s.t.} \ P(E_z) = 1$
Obviously, Murphy's Law #2 holds.
Case 1: $E_n$'s are independent with $\sum_n P(E_n) = \infty$, $P(E_n) < 1$
$$P(\bigcup_{n=1}^{\infty} E_n) = 1 - P(\bigcap_{n=1}^{\infty} E_n^C)$$
$$= 1 - \prod_{n=1}^{\infty} P(E_n^C) = 1$$
since, for independent events with $P(E_n) < 1$, $\prod_n (1 - P(E_n)) = 0$ iff $\sum_n P(E_n) = \infty$. (Without the divergence condition, Murphy's Law #2 can fail even under independence: take $P(E_n) = 2^{-n}$; the product is positive and the union has probability $< 1$.)
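A numeric sketch for the independent case (the helper name and cutoff $m = 200$ are my own): $1 - \prod_n (1 - P(E_n))$ reaches $1$ when $\sum_n P(E_n)$ diverges, but stays below $1$ when it converges:

```python
import math

def union_prob(p, m=200):
    """1 - prod_{n=1}^m (1 - p(n)): the union's probability for independent events."""
    return 1 - math.prod(1 - p(n) for n in range(1, m + 1))

print(union_prob(lambda n: 0.5))       # divergent sum: probability tends to 1
print(union_prob(lambda n: 0.5 ** n))  # convergent sum: probability stays below 1
```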
Case 2: $E_n$'s are not independent but disjoint and $\sum_n P(E_n) = 1$, $P(E_n) < 1$
$$P(\bigcup_{n=1}^{\infty} E_n) = \sum_n P(E_n) = 1$$
Case 3: $E_n$'s are disjoint, not independent but $\sum_n P(E_n) = \infty$, $P(E_n) < 1$
Impossible: for pairwise disjoint events, $\sum_n P(E_n) = P(\bigcup_n E_n) \le 1$.
Case 4: $E_n$'s are not independent but $\sum_n P(E_n) = \infty$, $P(E_n) < 1$
$$\sum_n P(E_n) = \infty \to P(\limsup E_n) = 1 \to P(\bigcup_{n=1}^{\infty} E_n) = 1$$
Just kidding. BCL2 needs independence.
Case 5: $E_n$'s are not independent but $1 < \sum_n P(E_n) < \infty$, $P(E_n) < 1$
Here, $\sum_n P(E_n) < \infty$, so BCL1 gives $P(\limsup E_n) = 0$, i.e. $P(\liminf E_n^C) = 1$. So almost surely, for some $m$, $\omega \in E_n^C$ for all $n \ge m$.
$\omega$ may or may not be in $\bigcup_{n=1}^{m-1} E_n$.
So Murphy's Law #2 does not hold in general.
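A concrete Case 5 counterexample (my own construction, assuming a uniform draw on $[0,1]$ and the nested events $E_n = [0,\ 0.75 \cdot 2^{1-n}]$):

```python
# Hypothetical Case 5 instance: uniform draw on [0,1], nested events
# E_n = [0, 0.75 * 2**(1 - n)], so P(E_n) = 0.75 * 2**(1 - n) > 0 for every n.
probs = [0.75 * 2 ** (1 - n) for n in range(1, 60)]
total = sum(probs)   # sum_n P(E_n) ~ 1.5, strictly between 1 and infinity
union = max(probs)   # the E_n's are nested, so the union is just E_1
print(total, union)  # total ~ 1.5, union = 0.75 != 1
```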
To sum up:
Case 0 is obvious. Case 1 corresponds to Q1. Case 2 corresponds to the disjoint $H_n$'s from Q1. Case 3 is impossible. Cases 4 and 5 suggest counterexamples.