
After some digging around, I need help figuring out how to glue together the pieces of information I have found.

Let $X_i \sim U(0,1)$ be i.i.d.

We also define our stopping time, $N$:

$$N=\min\Big\{n\in\mathbb{N} : \sum_{i=1}^{n}X_i\geq 1000\Big\}$$

We need to calculate $\mathbb{E}[N]$ , the expected stopping time.

I tried pure intuition first: since $\mathbb{E}[X_i] = \frac{1}{2}$, I expect something around $2{,}000$ observations before the sum exceeds $1{,}000$. To validate that intuition I ran $1$ million simulated summations and got:

$\mathbb{E}[\hat{N}] = 2000.668$ with $\sigma(\hat{N}) = 28.83$.

These findings can align with either $\mathbb{E}[N] = 2000$ or $\mathbb{E}[N] = 2001$.
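For reference, a stripped-down sketch of the kind of simulation I ran (the replication count here is kept small for speed, and the names are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def stopping_time(threshold=1000.0):
    """Count how many U(0,1) draws are needed for the running sum to reach the threshold."""
    total, n = 0.0, 0
    while total < threshold:
        total += rng.random()
        n += 1
    return n

# Far fewer replications than the actual run, just to keep it fast.
samples = np.array([stopping_time() for _ in range(10_000)])
print(samples.mean(), samples.std())
```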

While looking for this question I came across an almost identical one, except with $N=\min\Big\{n\in\mathbb{N} : \sum_{i=1}^{n}X_i\geq 1\Big\}$, which results in $\mathbb{E}[N]=e$:

related question
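If I read the linked argument correctly, it rests on the fact that $P\big(X_1+\dots+X_n<1\big)$ is the volume of the standard simplex, i.e. $\frac{1}{n!}$, so

$$\mathbb{E}[N]=\sum_{n=0}^{\infty}P(N>n)=\sum_{n=0}^{\infty}P\Big(\sum_{i=1}^{n}X_i<1\Big)=\sum_{n=0}^{\infty}\frac{1}{n!}=e.$$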

I was trying to use Wald's Lemma, defining:

$$N = N_1 + N_2 + \dots + N_{1000}$$

where $N_1$ is the number of observations it takes to exceed $1$ in the sum, and $N_2$ is the number of observations it takes to exceed $2$ given that the sum has already exceeded $1$.

I can use the result I found to claim that $\mathbb{E}[N_1] = e$, but since $\sum_{i=1}^{N_1}X_i\geq 1$ and is not exactly equal to $1$, the equality $\mathbb{E}[N_i] = e$ for $i\geq 2$ is not necessarily true anymore. That is, with $$N_2 =\min\Big\{n\in\mathbb{N} : \sum_{i={N_1+1}}^{N_1+n}X_i\geq 2-\sum_{i=1}^{N_1}X_i\Big\},$$ the equality $\mathbb{E}[N_2] = e$ is no longer necessarily true.
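To convince myself that the overshoot really matters, here is a quick sanity-check simulation (again just a sketch with illustrative names), counting the observations needed to cross $1$ and then the additional observations needed to cross $2$:

```python
import numpy as np

rng = np.random.default_rng(1)

def first_two_blocks():
    """Return (N1, N2): draws needed to cross 1, then additional draws to cross 2."""
    total, n = 0.0, 0
    while total < 1.0:
        total += rng.random()
        n += 1
    n1 = n
    while total < 2.0:
        total += rng.random()
        n += 1
    return n1, n - n1

blocks = np.array([first_two_blocks() for _ in range(100_000)])
# The first column should average close to e ~= 2.718, while the second
# averages strictly less, because the sum already overshoots 1 when N1 stops.
print(blocks.mean(axis=0))
```

The gap between those two averages is exactly why I cannot simply multiply $e$ by $1000$.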

I would like to get some direction on how to start. Since the uniform distribution is not memoryless, I don't see which direction to follow.

The answer can be found there: https://math.stackexchange.com/questions/3022309/expected-number-of-terms-needed-to-get-a-sum-greater-than-t-for-i-i-d-random Get into the habit of looking at the related problems mentioned on the right of the page. – Christophe Leuridan Aug 09 '22 at 19:36
