
Recently, I posted the following question: Can Markov Chains be used to Disprove the St Petersburg Paradox?.

In one of the comments, it was written: "a variable with infinite mean, will always produce a finite value". I am trying to understand what this statement means and how to prove it.

As my original question was about the St Petersburg Paradox (https://en.wikipedia.org/wiki/St._Petersburg_paradox), I thought this comment was referring to the idea that "the St Petersburg Paradox has a 0 Probability of Infinite Reward".

I will now try to prove this.

Part 1: The St Petersburg game has infinite expected reward

  • A fair coin is tossed at each stage.
  • The initial stake begins at 2 dollars and is doubled every time tails appears.
  • The first time heads appears, the game ends and the player wins the current stake.

As we can see, this game has an infinite expected reward:

$$E(X) = \sum_{i=1}^{\infty} x_i \cdot p_i$$

$$E = \sum_{n=1}^{\infty} \frac{1}{2^n} \cdot 2^n = \frac{1}{2} \cdot 2 + \frac{1}{4} \cdot 4 + \frac{1}{8} \cdot 8 + \frac{1}{16} \cdot 16 + \cdots = 1 + 1 + 1 + 1 + \cdots = \sum_{n=1}^{\infty} 1 = \infty$$
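To see concretely how an infinite expected reward coexists with every individual play being finite, here is a minimal Python simulation of the game (my own sketch for illustration; the function name `st_petersburg_payoff` is mine, not from the problem statement):

```python
import random

def st_petersburg_payoff(rng):
    """Play one St Petersburg game: the stake starts at $2 and doubles on
    every tails; the first heads ends the game and pays the current stake."""
    stake = 2
    while rng.random() < 0.5:  # model a tails flip with probability 1/2
        stake *= 2
    return stake

rng = random.Random(0)  # fixed seed so the run is reproducible
payoffs = [st_petersburg_payoff(rng) for _ in range(100_000)]

# Every simulated game ends after finitely many flips, so every payoff is a
# finite power of 2 -- yet the sample mean keeps drifting upward as the
# sample grows, reflecting the infinite theoretical expectation.
print(min(payoffs), max(payoffs))
print(sum(payoffs) / len(payoffs))
```

Note that the sample mean is very unstable across seeds and sample sizes, which is exactly what a divergent expectation looks like in simulation.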

Part 2: There is zero probability of infinite reward

Here is my attempt to prove this.

Let $X$ be the random variable that represents the number of coin flips until the first heads. Then $X$ follows a geometric distribution with parameter $p = 1/2$, meaning that $P(X = k) = (1/2)^k$ for any positive integer $k$. The reward for the game is $Y = 2^X$, which is a function of $X$. We want to show that $P(Y = \infty) = 0$, meaning that there is 0 probability of getting infinite reward.

To do this, we can use a basic law of probability: for any event $A$, $P(A) = P(A \cap B) + P(A \cap B^c)$, where $B$ is any other event and $B^c$ is its complement.

Choose $B$ to be the event $\{X \leq n\}$, where $n$ is any positive integer. Then we have:

$$P(Y = \infty) = P(\{Y = \infty\} \cap \{X \leq n\}) + P(\{Y = \infty\} \cap \{X > n\})$$

We can see that the first term on the right-hand side is 0, because if $X \leq n$, then $Y = 2^X \leq 2^n$, which is finite. Therefore, we have:

$$P(Y = \infty) = P(\{Y = \infty\} \cap \{X > n\})$$

The remaining term is bounded above by $P(X > n)$, since the probability of an intersection is at most the probability of either of the intersected events. Therefore:

$$P(Y = \infty) \leq P(X > n)$$

Using the formula for the geometric distribution together with the geometric series, we can compute $P(X > n)$ explicitly:

$$P(X > n) = \sum_{k = n + 1}^{\infty} P(X = k) = \sum_{k = n + 1}^{\infty} (1/2)^k = (1/2)^{n+1} \sum_{j = 0}^{\infty} (1/2)^j = (1/2)^{n+1} \cdot \frac{1}{1 - 1/2} = (1/2)^n$$

(As a sanity check: $\{X > n\}$ is exactly the event that the first $n$ flips all come up tails, which indeed has probability $(1/2)^n$.)

This shows that $P(X > n)$ decreases exponentially in $n$ and approaches 0 as $n \to \infty$. The bound $P(Y = \infty) \leq P(X > n)$ holds for every positive integer $n$, and its left-hand side does not depend on $n$, so letting $n \to \infty$ gives:

$$P(Y = \infty) \leq \lim_{n \to \infty} P(X > n) = 0$$

Therefore $P(Y = \infty) = 0$, and the original claim is proven.
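The key analytic fact in the argument, that $P(X > n) \to 0$ as $n \to \infty$, can also be checked numerically. Here is a short Python sketch (my own addition, using exact rational arithmetic via the standard-library `fractions` module):

```python
from fractions import Fraction

def tail_prob(n, terms=200):
    """Partial sum of P(X = k) = (1/2)^k over k = n+1, ..., n+terms.
    This is a lower bound for P(X > n); with 200 terms the truncation
    error is below 2**-200, so it is numerically indistinguishable
    from the full tail."""
    return sum(Fraction(1, 2**k) for k in range(n + 1, n + terms + 1))

for n in (1, 5, 10, 20):
    print(n, float(tail_prob(n)))

# The tail halves with each extra flip (exactly, in rational arithmetic),
# so it decays to 0 exponentially in n.
assert all(tail_prob(n + 1) == tail_prob(n) / 2 for n in (1, 5, 10))
```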

My Question: Is the proof I have given in Part 2 correct?

  • It’s not true that a variable with an infinite mean will always produce a finite value (unless we are defining a random variable to be real-valued rather than extended-real-valued). What’s true is that it’s possible for a random variable that’s always finite to have an infinite mean, as is the case for the game payoff. – spaceisdarkgreen Feb 21 '24 at 16:33
  • @spaceisdarkgreen I suspect user:leonboy, who wrote the comment from which this phrase is extracted, was saying (a) in the particular case of the St Petersburg game the probability of a finite outcome is $1$ even if the expectation is infinite, and (b) in general any distribution supported on (a subset of) the real numbers will have a finite outcome whether or not it has a finite or infinite expectation (or no expectation). But only leonboy can give a definitive response – Henry Feb 21 '24 at 17:28
  • @spaceisdarkgreen: thank you for these clarifications! – Uk rain troll Feb 21 '24 at 18:14
  • @Henry: thank you for your reply! Yes, user:leonboy was the one who left this comment - perhaps user:leonboy will be able to provide clarifications. – Uk rain troll Feb 21 '24 at 18:15
  • @spaceisdarkgreen and Henry: is the math I have done in Part 2 correct? – Uk rain troll Feb 21 '24 at 18:16
  • @pnaxso yes it suffices to observe that the probability of more than $n$ tails in a row goes to $0$ as $n$ goes to infinity. – spaceisdarkgreen Feb 21 '24 at 18:20
  • @spaceisdarkgreen: thank you so much! I have a learning disability (as I grow older, I have grown more comfortable admitting it :) ) and I struggle a lot with even simple math proofs. But I am determined to keep trying to learn every day and push myself further. There is a poem by the Chilean poet Pablo Neruda where he writes, "I loved her, and sometimes she loved me too". If Mr Neruda would allow me, I would modify his poem about my life: "I love math, but math does not love me back!" lol! thank you so much for all your support in the last few days - I really appreciate it! :) – Uk rain troll Feb 21 '24 at 18:26

1 Answer


What precisely do you mean by "a variable with infinite mean will always produce a finite value"?

An ordinary real-valued random variable, by definition, can only take real values: it is a measurable function from a probability space $(\Omega, \mathcal{F}, P)$ to $\mathbb R$. And real numbers are finite. So any random variable, in this sense, will always produce a finite value.

On the other hand, you can have random variables with extended real values, where $+\infty$ and $-\infty$ are allowed. Such a random variable might not always produce a finite value. If $\mathbb P(X = +\infty)$ or $\mathbb P(X = -\infty)$ is nonzero, it won't even almost always produce a finite value.

Now, what about "infinite mean"? For simplicity, let's restrict ourselves to nonnegative discrete random variables, where the possible values are nonnegative integers and $+\infty$. The mean is $+\infty$ if $\mathbb P(X = +\infty) > 0$, otherwise $\sum_{x = 0}^\infty x \;\mathbb P(X = x)$ (which also might be $+\infty$). The latter is the case in the St. Petersburg paradox. So it's quite possible to have a random variable with infinite mean that always produces finite values, but it's also possible to have a random variable (in the extended sense) that has infinite mean and sometimes produces infinite values.
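To illustrate the two cases numerically, here is a small Python sketch (an addition for illustration, not part of the original answer):

```python
import math

# Case 1: the St Petersburg payoff Y = 2^X with P(X = n) = (1/2)^n.
# Y is finite with probability 1, yet its mean diverges: every term of
# E[Y] = sum_n 2^n * (1/2)^n contributes exactly 1.
def partial_mean(N):
    return sum(2**n * (1 / 2) ** n for n in range(1, N + 1))

print(partial_mean(10), partial_mean(1000))  # grows without bound: 10.0, 1000.0

# Case 2: an extended-real-valued variable Z with P(Z = +inf) = 1/2 and
# P(Z = 0) = 1/2.  Here the mean is infinite because of the atom at +inf,
# and Z itself is infinite with positive probability.
mean_Z = 0.5 * math.inf + 0.5 * 0
print(mean_Z)  # inf
```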

Robert Israel
  • @Robert Israel: thank you so much for your answer! Can you please provide comments - is the math that I did in Part 2 of my question correct? thank you so much for everything! – Uk rain troll Feb 21 '24 at 18:17
  • It is correct, but could be stated more simply: $\mathbb P(Y = \infty) = \mathbb P(X = \infty) = 0$ because $P(X < \infty) = \sum_{n=1}^\infty \mathbb P(X = n) = \sum_{n=1}^\infty 1/2^n = 1$. – Robert Israel Feb 22 '24 at 00:11
  • Wow, this is so elegant! Thank you so much for all your help and support! – Uk rain troll Feb 22 '24 at 02:38
  • The other day I had another idea: perhaps Markov Chains can be used to show that there is a definite Absorption Probability in the St Petersburg Paradox - therefore the game will never result an infinite reward? I showed my math analysis/simulation over here: https://math.stackexchange.com/questions/4865929/can-markov-chains-be-used-to-disprove-the-st-petersburg-paradox – Uk rain troll Feb 22 '24 at 02:40
  • I have been working on these questions here and I have been stuck for a while - can you please take a look at them if you have time? https://math.stackexchange.com/questions/4864953/understanding-the-st-petersburg-paradox and https://math.stackexchange.com/questions/4864377/why-is-the-brownian-motion-related-to-the-normal-distribution . Thank you so much for all your help! – Uk rain troll Feb 22 '24 at 02:41