In the section on the law of large numbers, my introductory probability book gives the following example:

Let $X_1, X_2, ...$ be i.i.d. $Bern(1/2)$. Interpreting the $X_j$ as indicators of Heads in a string of fair coin tosses, $\bar{X}_n$ is the proportion of Heads after $n$ tosses. The strong law of large numbers says that with probability $1$, when the sequence of random variables $\bar{X}_1, \bar{X}_2, \bar{X}_3$, ... crystallises into a sequence of numbers, the sequence of numbers will converge to $1/2$. Mathematically, there are bizarre outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$, but collectively they have zero probability of occurring.

I'm specifically interested in this statement:

... Mathematically, there are bizarre outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$, but collectively they have zero probability of occurring.

Intuitively, I can see where this is coming from: as the number of tosses grows, this collection of outcomes becomes less and less probable, with its probability approaching $0$ as $n \to \infty$.

But when I think about the implications of this, why won't the same be true for any other group of outcomes that we select?

I'd appreciate it if someone could clarify why this is true. Why does the probability of outcomes such as $HHHHHH...$ and $HHTHHTHHTHHT...$ occurring become $0$, while the probability of other outcomes does not?
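(To make the comparison concrete, here is a quick sanity check, with a hypothetical helper `prefix_probability`, that every fixed length-$n$ prefix is equally improbable, whether it looks "bizarre" or "typical":)

```python
from fractions import Fraction

# Probability that the first n fair tosses match a given pattern:
# each toss contributes an independent factor of 1/2 regardless of
# what the pattern looks like, so HHHH and HTHT tie exactly.
def prefix_probability(pattern):
    return Fraction(1, 2) ** len(pattern)

print(prefix_probability("HHHH"))  # -> 1/16
print(prefix_probability("HTHT"))  # -> 1/16
```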

1 Answer

It is true that in this case, each individual outcome, and indeed any finite collection of outcomes, has probability zero of happening, regardless of how 'weird' it is. However, certain infinite collections of outcomes have a nonzero probability of occurring (and we call collections of outcomes with well-defined probabilities events, whether they are finite or infinite collections).

What the strong law of large numbers says is that the event that the sequence of sample proportions $\bar{X}_n$ converges to $1/2$ occurs with probability $1.$ So while every individual outcome is equally unlikely, you can be sure that the one you see at the end of the day will come from the set of outcomes where the sequence converges to $1/2.$ This is a long-winded, but hopefully clarifying, way of saying that you can be sure (or, in the technical language, almost sure, since outcomes outside this set are technically possible, although they occur so infrequently as to make up a negligible portion "in the long run") that the sequence converges to $1/2$.
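(As a rough illustration of this convergence, here is a minimal simulation sketch, with a hypothetical helper `running_proportion`; the running proportion of heads settles near $1/2$ as the number of tosses grows:)

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate n fair coin tosses and record the running proportion
# of heads after each toss; the SLLN says this sequence converges
# to 1/2 with probability 1.
def running_proportion(n_tosses):
    heads = 0
    props = []
    for n in range(1, n_tosses + 1):
        heads += random.randint(0, 1)  # one Bern(1/2) toss
        props.append(heads / n)
    return props

props = running_proportion(100_000)
print(props[9], props[999], props[-1])  # proportions after 10, 1000, 100000 tosses
```

The early proportions fluctuate wildly, but by 100,000 tosses the running proportion is typically within a fraction of a percent of $0.5$.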

A quick non-rigorous argument that you won't see any infinitely repeating outcome is to note that such outcomes correspond (as binary expansions) to rational numbers, which form a countable set, while the set of all outcomes corresponds to the real numbers, which is uncountable. So there are very few infinitely repeating outcomes. Strengthening this to show that the set of all outcomes whose proportion of heads doesn't converge to $1/2$ is similarly sparse (though that set happens to be uncountable... countable implies sparse in this scenario, but not the other way around) requires you to prove the SLLN, at least for this instance (which is very easy compared to the general case). The technical term for sparse is measure zero.
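(The first half of the countability argument can be sketched in two lines. Any single fixed outcome $s = (s_1, s_2, \dots)$ has probability zero, and countable additivity then kills any countable collection of outcomes:)

```latex
% A fixed outcome s is pinned down toss by toss:
P(X_1 = s_1, \dots, X_n = s_n) = \left(\tfrac{1}{2}\right)^{n} \longrightarrow 0
\quad \text{as } n \to \infty,
\text{ so } P(\{s\}) = 0.
% Countable additivity then gives, for any countable set of outcomes:
P\left(\bigcup_{k=1}^{\infty} \{s^{(k)}\}\right)
\le \sum_{k=1}^{\infty} P\bigl(\{s^{(k)}\}\bigr)
= \sum_{k=1}^{\infty} 0 = 0.
```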

  • Thanks. Can you please elaborate on what you mean by "certain infinite collections of outcomes have a nonzero probability of occurring"? What are "infinite collections of outcomes"? –  Feb 24 '18 at 06:57
  • @handler'shandle The event "the sequence is either all heads or all tails" is a finite set of outcomes, since only two sequences qualify. The event "the sequence has at least one H" is an infinite set of outcomes, since infinitely many sequences qualify: any sequence starting "HTTH...", "TTTTTHTTH...", or "THTTTHTHH..." will. In fact there's only one that won't: the sequence of all tails. So it shouldn't be surprising that this event occurs with probability one. – spaceisdarkgreen Feb 24 '18 at 07:03