
Suppose we have $T>0$, and $(X_n)_{n \in \mathbb N}$ is a sequence of i.i.d. random variables, each uniformly distributed on $[0,1]$. Define the random variable: $$ N := \max \left\{ n \in \mathbb N_0 : S_n \leq T \right\} $$ where $S_n := X_1 + \cdots + X_n$ (and $S_0 := 0$). I want to compute $\mathbb E[N]$.

What I've done: I know $\mathbb E[N] = \sum_{k=0}^\infty k\mathbb P[N=k]$, so the problem amounts to computing $\mathbb P[N = k]$. Assume $0 < T \leq 1$ (I think the problem gets complicated for $T > 1$, and I want to figure out the easy case first). We see that $$ \{N = k\} = \{S_k \leq T < S_{k+1}\}, $$ so the probability can be expressed as the iterated integral \begin{align*} \mathbb P[N=k] &= \int_0^1\cdots \int_0^1 \mathbb 1_{\{S_k \leq T\}}\mathbb 1_{\{T < S_{k+1}\}} \, dx_{k+1} \cdots dx_1 \\ &= \int_0^T \int_0^{T-S_1} \cdots \int_0^{T-S_{k-1}} \int_{T-S_k}^1 dx_{k+1} \,dx_k \cdots dx_2 \, dx_1. \end{align*} I was able to show by induction that this integral evaluates to $$ \mathbb P[N=k] = \frac{(-1)^k (1-T)^{k+1}}{(k+1)!} + \sum_{i=1}^k \frac{(-1)^{i-1}T^{k-i}}{i!(k-i+1)!}, $$ but this expression for the probability is a serious mess, and applying $\mathbb E[N] = \sum_{k=0}^\infty k\mathbb P[N=k]$ doesn't seem to lead to anything nice.
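To sanity-check candidate formulas for the pmf, I also ran a small Monte Carlo simulation (a minimal sketch; the helper names `sample_N` and `pmf_estimate` are ad hoc, and the reference values in the comment come from $\mathbb P[N=k]=\mathbb P[S_k\le T]-\mathbb P[S_{k+1}\le T]$ with $\mathbb P[S_k\le T]=T^k/k!$ for $T\le 1$):

```python
import random
from collections import Counter

def sample_N(T: float) -> int:
    """One draw of N = max{n : X_1 + ... + X_n <= T} with X_i ~ Uniform(0,1)."""
    n, s = 0, 0.0
    while True:
        s += random.random()
        if s > T:            # S_{n+1} > T while S_n <= T, so N = n
            return n
        n += 1

def pmf_estimate(T: float, trials: int = 200_000) -> dict:
    """Empirical estimate of P[N = k] for each k observed in the sample."""
    counts = Counter(sample_N(T) for _ in range(trials))
    return {k: c / trials for k, c in sorted(counts.items())}

# For T = 0.5 the exact values are P[N=0] = 0.5, P[N=1] = 0.375, P[N=2] ~ 0.1042, ...
print(pmf_estimate(0.5))
```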

My question. Is there anything standard that this sum converges to? Or anything similar? Something binomial, perhaps? Alternatively, is there an easier way to compute $\mathbb E[N]$? I saw in some posts that $$ \mathbb E[N] = \sum_{k=0}^\infty \mathbb P[N > k], $$ and these probabilities are much easier to compute and work out much more cleanly, but it's not clear to me why this identity is true.
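(Numerically, the identity does check out for my $N$; continuing the sketch above — a sanity check, not a proof:)

```python
# Tail-sum identity: E[N] = sum_{j >= 0} P[N > j]; comparing both sides
# computed from the same empirical pmf (T = 0.5).
pmf = pmf_estimate(0.5)
mean_direct = sum(k * p for k, p in pmf.items())
mean_tails = sum(sum(p for k, p in pmf.items() if k > j) for j in range(max(pmf)))
print(mean_direct, mean_tails)   # the two agree (~0.649 here)
```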


3 Answers


$S_n$ follows the Irwin–Hall distribution. Therefore,

$$ \mathsf{P}(N> n)=\mathsf{P}(S_{n+1}\le T)=\frac{1}{(n+1)!}\sum_{k=0}^{\lfloor T\rfloor}(-1)^k\binom{n+1}{k}(T-k)^{n+1} $$ and, using $\mathsf{E}N=\sum_{n\ge 0}\mathsf{P}(N>n)$, \begin{align} \mathsf{E}N &=\sum_{n\ge 1}\frac{1}{n!}\sum_{k=0}^{\lfloor T\rfloor}(-1)^k\binom{n}{k}(T-k)^n \\ &=\sum_{k=0}^{\lfloor T\rfloor}\frac{(-1)^k}{k!}\sum_{n\ge 1\vee k}\frac{(T-k)^n}{(n-k)!} \\ &=\left[\sum_{k=0}^{\lfloor T\rfloor}\frac{(-1)^k}{k!}(T-k)^ke^{T-k}\right]-1. \end{align}

In particular, for $T\in [0,1]$, $\mathsf{E}N=e^T-1$.
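A quick numerical check of the last display (a sketch; `expected_N` is an ad hoc name, and the question's simulation can serve as a cross-check for $T>1$):

```python
import math

def expected_N(T: float) -> float:
    """E[N] = [sum_{k=0}^{floor(T)} (-1)^k / k! * (T-k)^k * e^(T-k)] - 1."""
    total = sum((-1) ** k * (T - k) ** k * math.exp(T - k) / math.factorial(k)
                for k in range(math.floor(T) + 1))
    return total - 1

print(expected_N(1.0))   # e - 1 ~ 1.71828
print(expected_N(2.5))   # ~ 4.666; a Monte Carlo estimate of E[N] agrees
```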


Alternatively, is there an easier way to compute $\mathbb E[N]$?

Indeed, there is. For every nonnegative $t$, consider $N_t=\inf\{ n : S_n>t\}$; then your $N$ is $N_T-1$, hence it suffices to compute every $E(N_t)$. Conditioning on $X_1$, one sees that, for $t<1$, $$E(N_t)=1+\int_0^tE(N_{t-s})\,ds=1+\int_0^tE(N_s)\,ds,$$ while, for $t>1$, $$E(N_t)=1+\int_0^1E(N_{t-s})\,ds=1+\int_{t-1}^tE(N_s)\,ds.$$

Thus, the function $n(t)=E(N_t)$ solves the differential equation $$n'(t)=n(t)$$ on $(0,1)$, and the delayed differential equation $$n'(t)=n(t)-n(t-1)$$ for $t>1$, with the initial condition $n(0)=1$. Alternatively, $m(t)=e^{-t}n(t)$ solves the differential equation $$m'(t)=0$$ on $(0,1)$, and the delayed differential equation $$m'(t)=-e^{-t}n(t-1)=-e^{-1}m(t-1)$$ for $t>1$, with the initial condition $m(0)=1$.

Thus, $m(t)=1$ on $(0,1)$, $m(t)=1-e^{-1}(t-1)$ on $(1,2)$, and one can recursively deduce similar formulas for $m(t)$, hence for $n(t)$, on each interval $(k,k+1)$ with $k$ a nonnegative integer. In the end, on $(k,k+1)$, $m(t)$ is a polynomial of degree $k$, and $n(t)=e^tm(t)$. The result is related to the Irwin–Hall distribution.

The case of the interval $(0,1)$ is especially pleasing since $m(t)=1$ on $(0,1)$, hence one gets simply $$E(N_t)=e^t$$ and, in particular, the a priori surprising value $$E(N_1)=e.$$
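Beyond the first few intervals, a direct numerical integration of the delayed equation is the quickest check. A rough forward-Euler sketch (`n_of_t` is an ad hoc name for $n(t)=E(N_t)$, not from the answer):

```python
import math

def n_of_t(t: float, h: float = 1e-4) -> float:
    """n(t) = E(N_t) = e^t * m(t), with m = 1 on [0,1] and
    m'(t) = -e^{-1} m(t-1) for t > 1, integrated by forward Euler."""
    steps = int(round(t / h))
    per_unit = int(round(1 / h))        # grid points per unit of time
    m = [1.0] * (steps + 1)             # m(t) = 1 on [0,1]
    for i in range(per_unit, steps):
        m[i + 1] = m[i] - h * math.exp(-1) * m[i - per_unit]
    return math.exp(t) * m[steps]

print(n_of_t(1.0))   # e ~ 2.71828
print(n_of_t(2.0))   # e^2 - e ~ 4.67077, since m(2) = 1 - e^{-1}
# n_of_t(T) - 1 recovers the question's E[N] for general T.
```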

  • @Song Indeed, asymptotics when $t\to\infty$ are much easier than exact values for some fixed $t$ (but the correct statement is that $E(N_t)-(t/\mu)\to c$ when $t\to\infty$, for some explicit $c$ depending on $\mu$ and $\sigma^2$, which is probably explained on the Wikipedia page). – Did Dec 02 '18 at 10:15
  • I'm not seeing yet why $E(N_t) = 1+\int_0^t E(N_{t-s}) ds$. What do you mean by "conditioning on $X_1$?" – D Ford Dec 02 '18 at 15:06
  • Use the fact that, for every $s$ in $(0,t)$, on $\{X_1=s\}$, $N_t=1+N'_{t-s}$ where $N'_{t-s}\stackrel{d}{=}N_{t-s}$. – Did Dec 02 '18 at 15:08

The simpler version in the post you mentioned is based on the fact that, for a non-negative integer-valued random variable $N$, $$\mathbb{E}[N]=\sum_{k=0}^{\infty}k\Pr\{N=k\}=\sum_{i=1}^{\infty}\Pr\{N\geq i\}.$$ (Refer to this question: Find the Mean for Non-Negative Integer-Valued Random Variable, for the proof.)

Take the simple case where $X_i\sim U(0,1)$ and $T\in [0,1]$ as an example. Since $\{N\geq n\}=\{S_n\leq T\}$, and $\Pr\{S_n\leq T\}=\frac{T^n}{n!}$ can be proved by induction (it is the volume of the simplex $\{x_1+\cdots+x_n\leq T\}$), $$ \mathbb{E}[N]=\sum_{i=1}^\infty \Pr\{N\geq i\}=\sum_{i=1}^\infty \frac{T^i}{i!}=e^T-1, $$ which matches the answers above.
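As a two-line numeric check of the partial sums (plain arithmetic):

```python
import math

T = 0.7
tail_sum = sum(T ** i / math.factorial(i) for i in range(1, 40))
print(tail_sum, math.exp(T) - 1)   # both ~ 1.01375
```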