
I came across the following question, whose answer is $e$. I was sort of amazed, since I didn't see a reason why $e$ should be making an appearance. So, phrasing the main question in the title another way: should it have been possible to predict that the answer would involve $e$ in some nontrivial way?

Question: Suppose you draw random variables $x_i \in [0,1]$ one at a time, where the $x_i$ are independent and uniformly distributed. If $x_i > x_{i-1}$, we draw again; otherwise, we stop. What's the expected total number of draws when we stop?

Solution: The probability that we take exactly $n$ draws is $\frac{n-1}{n!}$: there are $n!$ equally likely orderings of $x_1,\ldots, x_n$, and for the process to stop at draw $n$ we need $x_1 < x_2 < \cdots < x_{n-1}$ but $x_n < x_{n-1}$, which leaves exactly $n-1$ choices for where $x_n$ sits in the ordering $x_1 < x_2 < \cdots < x_{n-1}$. Thus, the expected value is given by $$ E(\text{number of draws}) = \sum_{n = 2}^\infty n\cdot\frac{n-1}{n!} = \sum_{n = 2}^\infty \frac{1}{(n-2)!} = \fbox{e} $$
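
For what it's worth, here is a small Monte Carlo sketch of the process (just a sanity check, not a proof; the helper name `draws_until_decrease` is mine), and its average does land near $e$:

```python
import random

def draws_until_decrease():
    """Draw uniform [0,1) values until a draw fails to exceed its
    predecessor; return the total number of draws made."""
    count = 1
    prev = random.random()
    while True:
        x = random.random()
        count += 1
        if x > prev:
            prev = x       # still increasing, keep drawing
        else:
            return count   # stopped: this draw did not exceed the last

trials = 200_000
avg = sum(draws_until_decrease() for _ in range(trials)) / trials
print(avg)  # typically prints something close to e ≈ 2.718
```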

P.S. It's also possible I simply made a mistake, and if that's the case please point it out and I can edit or delete the question accordingly.

Derek Allums
  • 3,121

2 Answers


First: I would take a slightly different route, but that's a matter of taste: Use that $$ E[X]=\sum_{n=0}^\infty nP(X=n)=\sum_{n=0}^\infty P(X> n).$$ Similar to your reasoning, $X> n$ occurs iff (up to almost impossible equalities) $x_1<x_2<\ldots <x_n$, which happens with probability $\frac 1{n!}$. Hence, (again) $$\tag1 E[X]=\sum_{n=0}^\infty\frac1{n!}=e.$$
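
(As a quick numeric illustration, not part of the argument: estimating $P(x_1<x_2<\ldots<x_n)$ by simulation does reproduce $\frac1{n!}$ for small $n$; the helper `prob_increasing` below is ad hoc.)

```python
import math
import random

def prob_increasing(n, trials=200_000):
    """Estimate P(x_1 < x_2 < ... < x_n) for iid uniform draws."""
    hits = 0
    for _ in range(trials):
        xs = [random.random() for _ in range(n)]
        if all(a < b for a, b in zip(xs, xs[1:])):
            hits += 1
    return hits / trials

# Empirical tail probabilities P(X > n) against 1/n!
for n in range(1, 6):
    print(n, round(prob_increasing(n), 4), round(1 / math.factorial(n), 4))
```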

But regarding your main question: could we expect $e$ to raise its head? Well, perhaps. The problem is about random order, hence about permutations, and (hand-wave) $e$ very often occurs in the context of permutations - which is of course owed to it being the sum of reciprocals of factorials. Then again, before one follows this thought further, one has already written down $(1)$ ...


I'll give a different way of approaching the problem that shows another way that $e$ can arise.

As with Hagen von Eitzen's answer, let $X$ be the random variable that represents the number of draws.

Consider the closely related problem where the $x_i$ are drawn uniformly from $\{1, 2, \ldots, N\}$ instead (note that a tie also stops the process). At most $N + 1$ draws can happen in this case. We can calculate the expectation using the binomial theorem.

$$E[X] = \sum_{k=0}^N P(X > k) = \sum_{k=0}^{N}\frac{\binom{N}{k}}{N^k} = \left(1 + \frac1{N}\right)^N$$

Here, $\binom{N}{k}$ is the number of ways of choosing $x_1, \ldots, x_k$ so that they are strictly increasing, which is then divided by the total number of ways, $N^k$, of choosing them to get the probability.

Of course then, the expectation approaches $e$ in the limit as $N \to \infty$. (I don't think this is a particularly good way to solve the problem, as one still has to justify how this limit is applicable, but hopefully it provides some insight into the appearance of $e$ regardless.)
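
A rough simulation of this discrete version (the helper `discrete_draws` is mine, purely for illustration) tracks $\left(1 + \frac1{N}\right)^N$ and drifts toward $e$ as $N$ grows:

```python
import random

def discrete_draws(N):
    """Run the discrete version: draw uniformly from {1, ..., N} until a
    draw fails to exceed the previous one; return the number of draws."""
    count = 1
    prev = random.randint(1, N)
    while True:
        x = random.randint(1, N)
        count += 1
        if x > prev:
            prev = x
        else:
            return count

trials = 200_000
for N in (2, 5, 20, 100):
    avg = sum(discrete_draws(N) for _ in range(trials)) / trials
    print(N, round(avg, 4), round((1 + 1 / N) ** N, 4))  # empirical vs exact
```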

As for the fundamental question, I don't really see a way of knowing that $e$ will appear until it has more or less appeared, but $e$ does seem to arise here quite naturally, so maybe it should not be so surprising in general when $e$ arises in a combinatorial context.

tehtmi
  • 1,063
  • Very cool - and I suppose this further illustrates the point that I shouldn't be surprised by the appearance of $e$, if my definition of $e$ is, for example, $\lim_{n \to \infty} \left(1+\frac{1}{n}\right)^n$. – Derek Allums Jul 26 '17 at 13:04