Recently, I posted the following question: Can Markov Chains be used to Disprove the St Petersburg Paradox?
In one of the comments, it was written: "a variable with infinite mean, will always produce a finite value". I am trying to understand what this statement means and how to prove it.
As my original question was about the St Petersburg Paradox (https://en.wikipedia.org/wiki/St._Petersburg_paradox), I thought this comment was referring to the idea that "the St Petersburg Paradox has a 0 Probability of Infinite Reward".
I will now try to prove this.
Part 1: The St Petersburg Game Has Infinite Expected Reward
- A fair coin is tossed at each stage.
- The initial stake begins at 2 dollars and is doubled every time tails appears.
- The first time heads appears, the game ends and the player wins whatever the current stake is.
As we can see, this game has an infinite expected reward. Writing $Y$ for the reward paid out:
$$E(Y) = \sum_{i=1}^{\infty} y_i \cdot p_i$$
$$E(Y) = \sum_{n=1}^{\infty} \frac{1}{2^n} \cdot 2^n = \frac{1}{2} \cdot 2 + \frac{1}{4} \cdot 4 + \frac{1}{8} \cdot 8 + \frac{1}{16} \cdot 16 + \ldots = 1 + 1 + 1 + 1 + \ldots = \sum_{n=1}^{\infty} 1 = \infty$$
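To see what an infinite expected reward looks like in practice, here is a quick simulation sketch in Python (the function name is mine, and this is an illustration, not part of the proof): every individual payout is a finite power of 2, yet the sample mean keeps creeping upward as more games are played, because rare long runs of tails contribute enormous payouts.

```python
import random

def st_petersburg_payout(rng: random.Random) -> int:
    """Play one St Petersburg game: double a 2-dollar stake on each
    tails; stop and pay out the current stake at the first heads."""
    stake = 2
    while rng.random() < 0.5:  # tails with probability 1/2
        stake *= 2
    return stake

rng = random.Random(0)
payouts = [st_petersburg_payout(rng) for _ in range(100_000)]

# Every realized payout is finite, even though E[Y] = 1 + 1 + 1 + ... = infinity.
print(max(payouts), sum(payouts) / len(payouts))
```

The largest payout and the sample mean both grow (slowly) with the number of games, but no single game ever returns an infinite amount.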
Part 2: There is a 0 probability of getting Infinite Reward
Here is my attempt to prove this.
Let $X$ be the random variable that represents the number of coin flips until the first heads. Then $X$ follows a geometric distribution with parameter $p = 1/2$, meaning that $P(X = k) = (1/2)^k$ for any positive integer $k$. The reward for the game is $Y = 2^X$, which is a function of $X$. We want to show that $P(Y = \infty) = 0$, meaning that there is 0 probability of getting infinite reward.
To do this, we can use the law of total probability: for any events $A$ and $B$, $P(A) = P(A \cap B) + P(A \cap B^c)$, where $B^c$ is the complement of $B$.
If we choose $B$ to be the event $\{X \leq n\}$, where $n$ is any positive integer, then we have:
$$P(Y = \infty) = P(\{Y = \infty\} \cap \{X \leq n\}) + P(\{Y = \infty\} \cap \{X > n\})$$
We can see that the first term on the right-hand side is 0, because if $X \leq n$, then $Y = 2^X \leq 2^n$, which is finite. Therefore, we have:
$$P(Y = \infty) = P(\{Y = \infty\} \cap \{X > n\})$$
Also, the remaining term is bounded above by $P(X > n)$, because $\{Y = \infty\} \cap \{X > n\} \subseteq \{X > n\}$. Therefore:
$$P(Y = \infty) \leq P(X > n)$$
Using the formula for the geometric distribution, we can calculate $P(X > n)$ as follows (using a geometric series to simplify the sum):
$$P(X > n) = \sum_{k = n + 1}^{\infty} P(X = k) = \sum_{k = n + 1}^{\infty} (1/2)^k = (1/2)^{n+1} \sum_{k = 0}^{\infty} (1/2)^k = (1/2)^{n+1} \cdot \frac{1}{1 - 1/2} = (1/2)^{n}$$
As a sanity check, $X > n$ means the first $n$ flips were all tails, which indeed has probability $(1/2)^n$.
This shows that $P(X > n)$ decreases exponentially as $n$ increases, and approaches 0 as $n$ goes to infinity. Therefore, we have:
Since the left-hand side $P(Y = \infty)$ does not depend on $n$, letting $n \to \infty$ in the inequality gives:
$$P(Y = \infty) \leq \lim_{n \to \infty} P(X > n) = 0$$
Since probabilities are nonnegative, this forces $P(Y = \infty) = 0$, and thus the original claim is proven.
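As a numerical sanity check on the tail bound (a Python sketch with my own helper name, not part of the proof), we can estimate $P(X > n)$ by simulating the geometric variable $X$ and compare the estimates with the exact values:

```python
import random

def flips_until_heads(rng: random.Random) -> int:
    """Number of fair-coin flips until the first heads:
    geometric with p = 1/2, so P(X = k) = (1/2)^k for k >= 1."""
    k = 1
    while rng.random() < 0.5:  # tails: keep flipping
        k += 1
    return k

rng = random.Random(42)
trials = 200_000
samples = [flips_until_heads(rng) for _ in range(trials)]

# P(X > n) should be close to (1/2)^n for each n.
for n in (1, 3, 5):
    estimate = sum(x > n for x in samples) / trials
    exact = 0.5 ** n
    print(f"P(X > {n}): estimated {estimate:.4f}, exact {exact:.4f}")
```

Every simulated $X$ is a finite integer, and the empirical tail probabilities match $(1/2)^n$, which is exactly why the realized reward $Y = 2^X$ is finite with probability 1.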
My Questions:
- Is the analysis I have done in Part 2 mathematically correct?
- Does the analysis I have done correspond to the statement "a variable with infinite mean, will always produce a finite value"? Is there a different way to prove that such a variable is finite with probability 1? For example, the Cauchy distribution has an undefined mean (its tails are too heavy for the mean to exist), yet we can still simulate finite values from it: How to generate a Cauchy random variable , https://stat.ethz.ch/R-manual/R-devel/library/stats/html/Cauchy.html
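Regarding the Cauchy example: a minimal sketch (assuming Python and the standard Cauchy distribution, sampled by the inverse-CDF method from the question linked above) showing that every simulated draw is a finite number, even though the distribution has no mean:

```python
import math
import random

def cauchy_draw(rng: random.Random) -> float:
    """Standard Cauchy via the inverse CDF: tan(pi * (U - 1/2))
    for U uniform on [0, 1)."""
    return math.tan(math.pi * (rng.random() - 0.5))

rng = random.Random(1)
draws = [cauchy_draw(rng) for _ in range(100_000)]

# Every draw is a finite float, yet the sample mean never settles
# down as more draws accumulate, because the mean does not exist.
print(all(math.isfinite(x) for x in draws))
```

This mirrors the St Petersburg situation: "infinite (or undefined) mean" is a statement about the expectation over all outcomes, not about any single realized value, so each simulated value is still finite.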