In the section on the law of large numbers, my introductory probability book gives the following example:
Let $X_1, X_2, ...$ be i.i.d. $Bern(1/2)$. Interpreting the $X_j$ as indicators of Heads in a string of fair coin tosses, $\bar{X}_n$ is the proportion of Heads after $n$ tosses. The strong law of large numbers says that with probability $1$, when the sequence of random variables $\bar{X}_1, \bar{X}_2, \bar{X}_3$, ... crystallises into a sequence of numbers, the sequence of numbers will converge to $1/2$. Mathematically, there are bizarre outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$, but collectively they have zero probability of occurring.
I'm specifically interested in this statement:
... Mathematically, there are bizarre outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$, but collectively they have zero probability of occurring.
Intuitively, I can see where this is coming from: since the tosses go on forever, any particular pattern of outcomes becomes less and less probable, so its probability approaches $0$ as $n \to \infty$.
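To make that intuition concrete, here is the computation I have in mind for a single fixed pattern (say, all Heads), using only the fairness and independence of the tosses:

$$P(X_1 = 1, X_2 = 1, \dots, X_n = 1) = \left(\frac{1}{2}\right)^n \longrightarrow 0 \quad \text{as } n \to \infty,$$

and the probability that the first $n$ tosses match any other fixed sequence, such as $HHTHHTHHT...$, is the same $\left(\frac{1}{2}\right)^n$.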
But when I think about the implications of this, wouldn't the same be true for any other collection of outcomes that we select?
I'd appreciate it if someone could clarify why this is true. Why does the probability of outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$ occurring become $0$, while the probability of other outcomes does not?
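For context, here is a quick simulation sketch I used to check the convergence part of the statement (assuming Python with NumPy is available); $\bar{X}_n$ does appear to settle near $1/2$, so my question is only about the zero-probability claim for the bizarre outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulate n fair coin tosses: 1 = Heads, 0 = Tails (i.i.d. Bern(1/2)).
tosses = rng.integers(0, 2, size=n)

# Running proportion of Heads, i.e. X-bar_k for k = 1, ..., n.
running_mean = np.cumsum(tosses) / np.arange(1, n + 1)

for k in (10, 100, 1_000, 10_000, 100_000):
    print(f"proportion of Heads after {k:>6} tosses: {running_mean[k - 1]:.4f}")
```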