Suppose we have $T>0$, and $(X_n)_{n \in \mathbb N}$ is a sequence of i.i.d. random variables, each uniformly distributed on $[0,1]$. Define the random variable $$ N := \max \left\{ n \in \mathbb N_0 : S_n \leq T \right\}, $$ where $S_n := X_1 + \cdots + X_n$ and $S_0 := 0$. I want to compute $\mathbb E[N]$.
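For intuition, here is a small Monte Carlo sketch of the setup in Python (the helper names `sample_N` and `estimate_EN` are my own, just for illustration). As a sanity check for $T = 1$: it is a classic fact that the expected number of uniform draws needed for the sum to first exceed $1$ is $e$, so the estimate should hover near $e - 1 \approx 1.718$.

```python
import random

def sample_N(T, rng):
    """Draw one realization of N: keep adding Uniform(0,1) steps
    until the partial sum first exceeds T; N is the largest n
    with S_n <= T."""
    s, n = 0.0, 0
    while True:
        s += rng.random()
        if s > T:
            return n
        n += 1

def estimate_EN(T, trials=200_000, seed=0):
    """Monte Carlo estimate of E[N]."""
    rng = random.Random(seed)
    return sum(sample_N(T, rng) for _ in range(trials)) / trials

# For T = 1, the classic "sum of uniforms exceeding 1" fact
# suggests this should be close to e - 1 ≈ 1.718.
print(estimate_EN(1.0))
```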
What I've done: I know $\mathbb E[N] = \sum_{k=0}^\infty k\,\mathbb P[N=k]$, so the problem amounts to computing $\mathbb P[N = k]$. Assume $0 < T \leq 1$ (I think the problem gets complicated for $T > 1$, so I want to settle the easy case first). We see that $$ \{N = k\} = \{S_k \leq T < S_{k+1}\}, $$ so the probability can be written as the iterated integral \begin{align*} \mathbb P[N=k] &= \int_0^1\cdots \int_0^1 \mathbb 1_{\{S_k \leq T\}}\mathbb 1_{\{T < S_{k+1}\}} \, dx_{k+1} \cdots dx_1 \\ &= \int_0^T \int_0^{T-S_1} \cdots \int_0^{T-S_{k-1}} \int_{T-S_k}^1 dx_{k+1} \,dx_k \cdots dx_2 \, dx_1. \end{align*} I was able to show by induction that this integral evaluates to $$ \mathbb P[N=k] = \frac{(-1)^k (1-T)^{k+1}}{(k+1)!} + \sum_{i=1}^k \frac{(-1)^{i-1}T^{k-i}}{i!(k-i+1)!}, $$ but this expression for the probability is a serious mess, and plugging it into $\mathbb E[N] = \sum_{k=0}^\infty k\,\mathbb P[N=k]$ doesn't seem to lead to anything nice.
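As a sanity check on any closed form, one can estimate $\mathbb P[N=k]$ empirically (a Python sketch; `empirical_pmf` is a hypothetical helper name of mine). Two things are easy to verify independently of the formula: $\mathbb P[N=0] = \mathbb P[X_1 > T] = 1-T$ exactly, and the probabilities must sum to $1$.

```python
import random
from collections import Counter

def empirical_pmf(T, trials=200_000, seed=1):
    """Empirical distribution of N for a given T, via simulation."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        s, n = 0.0, 0
        while True:
            s += rng.random()
            if s > T:
                break
            n += 1
        counts[n] += 1
    return {k: c / trials for k, c in sorted(counts.items())}

# For T = 0.5: P(N = 0) = 1 - T = 0.5 exactly, and the
# empirical probabilities must sum to 1 by construction.
pmf = empirical_pmf(0.5)
print(pmf)
```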
My question. Is there anything standard that this sum converges to? Or anything similar, perhaps something binomial? Alternatively, is there an easier way to compute $\mathbb E[N]$? I've seen in other posts that, for a nonnegative integer-valued random variable, $$ \mathbb E[N] = \sum_{k=0}^\infty \mathbb P[N > k], $$ and these probabilities are much easier to compute and work out much more cleanly, but it's not clear to me why this identity holds.
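I did convince myself numerically that the identity is plausible: on any finite sample the two estimators coincide exactly, because summing the tail counts $\#\{n_i > k\}$ over $k$ counts each sample value $n_i$ once for every $k < n_i$, i.e. $n_i$ times in total. A Python sketch (the helper name `tail_sum_check` is mine):

```python
import random

def tail_sum_check(T, trials=100_000, seed=2):
    """Estimate E[N] two ways from the same sample:
    directly as the sample mean, and via the tail sum
    sum_{k>=0} P(N > k)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        s, n = 0.0, 0
        while True:
            s += rng.random()
            if s > T:
                break
            n += 1
        samples.append(n)
    direct = sum(samples) / trials
    kmax = max(samples)
    # Each sample n is counted once for every k < n, so the
    # numerators of the two estimates are the same integer.
    tail = sum(sum(1 for n in samples if n > k)
               for k in range(kmax + 1)) / trials
    return direct, tail

direct, tail = tail_sum_check(0.8)
print(direct, tail)  # the two estimates agree exactly
```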