
Can someone provide an alternative explanation to why $e$ can be thought of as an infinite summation of the likelihood of picking any one arrangement out of $n$ arrangements?

$$\frac{1}{0!} + \frac{1}{1!} + \frac{1}{2!} + ... + \frac{1}{n!}$$

Though I am not averse to axiomatic arguments, I am seeking a "wordier" explanation or analogy; why should $e$ somehow be related to the idea of "the (summed) probability of choosing a particular arrangement of $n$ arrangements, as $n$ approaches infinity."

  • What is your definition of $e$? – saulspatz Jan 08 '20 at 20:51
  • $$\sum_{n=0}^\infty \frac{1}{n!}=e$$ – silian-rail Jan 08 '20 at 20:58
  • If that's the definition then it is just a matter of identifying $n!$ with the number of permutations of $n$ objects. If you use a different definition then there is a bit more to it. – Ian Jan 08 '20 at 21:02
  • @ian can you give me an example of one of the other definitions you might prefer? – silian-rail Jan 08 '20 at 21:06
  • The usual alternative is $\lim_{n \to \infty} (1+1/n)^n$, whose connection to $\sum_{n=0}^\infty \frac{1}{n!}$ is shown in the link in Maximilian's comment. – Ian Jan 08 '20 at 21:34
  • That's the usual 'combinatorial' definition, which is the one that makes the most sense in this context; there's also, e.g., 'the value at $x=1$ of the function $f$ satisfying $f(0)=1$, $f'(x)=f(x)$', which is perhaps the canonical 'analytical' definition. These definitions do all define the same number, of course, but that still needs to be proven. – Steven Stadnicki Jan 08 '20 at 21:40
  • Probabilistically speaking, I actually rather like $e^{-1}=\lim_n(1-1/n)^n$, because the right hand side can be given a clean probabilistic interpretation: 'what is the probability that if we roll an $n$-sided die $n$ times, we will get no rolls of $1$?'. – Steven Stadnicki Jan 08 '20 at 21:42
  • @steven's $e^{-1}$ comment is what I'm after; just additional ways, analogies, or metaphors to conceptualize $e$. This line of contemplation might be called the 3b1b method. – silian-rail Jan 08 '20 at 21:47
  • "the likelihood of picking any one arrangement out of n arrangements": what exactly do you mean by that? If by definition $e = \sum \frac 1{n!}$, and there are $n!$ arrangements of $n$ objects (you know why that is, don't you?), then the sum of the inverses of the numbers of arrangements is $\sum \frac 1{n!} = e$. What, then, is actually your question? How does this property fit with other expected properties of $e$? Well, what other properties of $e$ do you have in mind? – fleablood Jan 08 '20 at 21:53
  • @steven Where did you come across this particular conceptualization? A book or article, perhaps? Or invented just now? – silian-rail Jan 08 '20 at 21:55
  • I think perhaps a better way of phrasing this question would be 'what are some probabilistic interpretations of $e$ or $e^{-1}$, and how can I show the linkage between them?' — correct me if I'm wrong, but it sounds like that's roughly what you're trying to get at here? For instance, $e$ can also be thought of as an 'average stopping time' for various processes — 'if I keep doing this thing until some condition happens, how many times on average will I do it?'. – Steven Stadnicki Jan 08 '20 at 21:57
  • @pgayed It was actually a practical matter for me for several years; I was a game developer up until just recently, and knowing things like 'if the player takes 20 tries at a 1-in-20 thing, they have about a 1/3 chance of not succeeding' is pragmatically useful there. – Steven Stadnicki Jan 08 '20 at 21:58
  • @steven Very interesting. Thank you! And, as for the goal of the question, it's not so much I'm seeking a particular explanation but the many ways people might explain the concept ("intuitively" or by other means). I know SE is focused on providing "terminating answers" but because there are so many smart and curious people here, I've always found it to be fertile ground to plant an open-ended question. Thanks for taking the time. I appreciate the $n$-sided die way of seeing it! – silian-rail Jan 08 '20 at 22:07
  • Although ambiguously worded, the question might be "Why does $e$ have all the properties that it does?" Why is it 1) the only value of $b>0;b\ne 1$ where the rate of change of $b^x$, in other words $\lim_{t\to 0}\frac{b^{x+t}-b^x}t$, is always equal to $b^x$ itself; why is the only value where that is true equal to 2) $\lim_{n\to \infty}(1+\frac 1n)^n$; and why is that number equal to 3) $\sum_{k=0}^\infty\frac 1{k!}$? Why should those three different concepts have the same value? A basic first-year calculus course will cover those. – fleablood Jan 08 '20 at 22:10
  • .... a fourth rather meaningless question would be why are all of those numbers equal to an irrational number somewhere between $2.7182$ and $2.7183$. But the answer to that fourth question is just.... why not? – fleablood Jan 08 '20 at 22:13
  • @fleablood Yes! I think I see that what you're saying is that $e$ "naturally" arises from definitions of calculus. I will have to revisit my old calculus textbook! Thank you. – silian-rail Jan 08 '20 at 22:19
  • If $f(x) = b^x$ and $b\ne 1$ then $f'(x)=\lim\frac {b^{x+h}-b^x}h=\lim\frac {b^x(b^h-1)}h=b^x\cdot \lim \frac{b^h-1}h$. Go to the calculus book where $e$ is defined, where $\ln x$ and $b^x$ are defined, where $f'(x)=b^x\lim \frac{b^h-1}h=b^x\ln b$ is determined, and where $e$ is shown to be the real number such that $\frac {de^x}{dx} = e^x$. Then skip ahead a few chapters to Taylor series, where $f(x) = \sum \frac{f^{(k)}(a)(x-a)^k}{k!}$, and apply $f(x)=e^x$ (so $f^{(k)}(x) = e^x$) with $a=0$ and $x =1$ to get $e^1 = \sum\frac 1{k!}$. – fleablood Jan 08 '20 at 22:44

2 Answers


To answer the question I suggested in the comments(!), here are a few probabilistic interpretations for $e$ and $e^{-1}$.

The two canonical probabilistic interpretations for $e^{-1}$ that I tend to see are the question of 'meeting expectations' and the derangements problem. The meeting expectations question is simple:

If I have a one-in-$n$ probability of doing something and I try it (independently) $n$ times, what's the chance that none of them succeeds?

Since the probability of success is $1/n$, the probability of failure is $(1-1/n)$, and since the tries are independent the probability they all fail is $(1-1/n)^n$; as $n$ gets larger and larger, then, this approaches $e^{-1}$.
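As a quick numerical check of this limit (a sketch I'm adding here, not part of the original answer), one can tabulate $(1-1/n)^n$ for increasing $n$ and compare it against $e^{-1}$:

```python
import math

# (1 - 1/n)^n should approach e^{-1} ≈ 0.3679 as n grows
for n in (10, 100, 10_000, 1_000_000):
    print(n, (1 - 1 / n) ** n)

print("1/e =", math.exp(-1))
```

Even at $n=20$ (the 'game developer' example from the comments), $(1-1/20)^{20}\approx 0.358$ is already close to $1/e\approx 0.368$.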

The derangements problem is similar but subtly different:

If I take $n$ different hats off of $n$ people, shuffle them randomly, and redistribute them, what's the probability that nobody gets their own hat back?

Here it can be shown by various means (for instance, inclusion-exclusion) that this probability is $1-1/(1!)+1/(2!)-1/(3!)+1/(4!)-\ldots+(-1)^{n}/(n!)$, and so once again the limit as we approach infinitely many hats is $e^{-1}$.
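The alternating sum converges extremely fast. As a small added sketch (my own illustration, using the standard recurrence $D(n)=(n-1)\bigl(D(n-1)+D(n-2)\bigr)$ for the number of derangements), the hat probabilities home in on $1/e$ almost immediately:

```python
import math

def derangement_probability(n):
    """Probability that a random shuffle of n hats returns none to its owner."""
    # Standard recurrence for derangement counts:
    # D(n) = (n - 1) * (D(n-1) + D(n-2)), with D(0) = 1 and D(1) = 0.
    if n == 0:
        return 1.0
    d_prev, d_curr = 1, 0  # D(0), D(1)
    for k in range(2, n + 1):
        d_prev, d_curr = d_curr, (k - 1) * (d_prev + d_curr)
    return d_curr / math.factorial(n)

for n in (2, 5, 10, 20):
    print(n, derangement_probability(n))
print("1/e =", math.exp(-1))
```

By $n=10$ the probability already agrees with $1/e$ to about seven decimal places.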

While $e$ itself can't be interpreted as a probability per se (since it's larger than 1), it can be interpreted as a stopping time or expected value for a process. For instance, suppose that we follow this process:

Start with a deck of one card (an ace). Shuffle it randomly, and see whether the top card after it's shuffled is the ace. If so, add a different card (a deuce) to the deck. Repeat until we get a shuffle that doesn't have the ace at the top.

Then we can ask how many cards the deck will have in it when we stop, on average. Now, the probability that we stop after $n$ steps is the probability that we've succeeded on all the previous steps ($1\cdot\frac12\cdot\frac13\cdots\frac1{n-1} = \frac1{(n-1)!}$) times the probability that we fail this time ($\frac{n-1}n$), or in other words $\frac1{n\cdot(n-2)!}$. (This requires $n\geq 2$, but it's easy to see that the probability we stop after the first shuffle is zero.) Since the number of cards in the deck then is $n$, we can compute the expected number by multiplying that count by the stopping probability and summing over all of the possible choices: $\sum_{n\geq 2}n\cdot\frac1{n\cdot(n-2)!}$ $=\sum_{n\geq 2}\frac1{(n-2)!}$ $=\sum_{m\geq 0}\frac1{m!}$ $=e$.
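Since the stopping-time claim is easy to simulate, here is a minimal Monte Carlo sketch of the card process (my own illustration; the function name, seed, and trial count are arbitrary choices):

```python
import random

def deck_size_at_stop(rng):
    """One run of the process: shuffle, grow the deck while the ace is on top."""
    n = 1  # start with just the ace
    while True:
        # In a uniform shuffle of n cards, the ace is on top with probability 1/n.
        if rng.random() < 1 / n:
            n += 1        # ace on top: add another card and reshuffle
        else:
            return n      # ace not on top: stop; the deck has n cards

rng = random.Random(0)    # fixed seed so the run is reproducible
trials = 200_000
avg = sum(deck_size_at_stop(rng) for _ in range(trials)) / trials
print(avg)  # should land near e ≈ 2.71828
```

Note that with a one-card deck the "ace on top" test always succeeds, matching the observation above that we never stop after the first shuffle.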


Not a proof but a handwaving argument:

If $f(x) = b^x$ (assume $b>0; b\ne 1$) then $f'(x)=\lim_{h\to 0}\frac {b^{x+h}-b^x}h =b^x\lim_{h\to 0} \frac {b^h-1}{h}$.

$e$ can either be defined or derived to be the number such that if $f(x) =e^x$ then $f'(x) =e^x =f(x)$. (This also implies $\lim_{h\to 0} \frac {b^h-1}h = \ln b$.)

(In actual fact this is a bit backwards, as most calculus courses define $\ln x := \int_1^x \frac 1t dt$ and $e^x$ as the inverse of that, and $b^x := e^{x\ln b}$, and then prove that for rational numbers this works exactly how we expect exponents to work and that $e^1 = \lim_{n\to\infty}(1+\frac 1n)^n =: e$. Intuitively completely backwards, but rigorously sound.)

Now we have Newton's approximation:

We can estimate $f(x)$, where $x$ is just slightly larger than $a$, as $f(x) \approx f(a)+ f'(a)(x-a)$. This makes sense if $f'(a)$ is considered the slope of a tangent line to the graph: the estimate just follows where that line would continue.

If $f(x)$ is infinitely differentiable (which $e^x =(e^x)' = (e^x)''= \cdots$ has to be) then we can recursively and inductively apply Newton's approximation to create a Taylor series.

For any $a$, $f(x) = f(a) + \frac {f'(a)}{1!}(x-a) + \frac {f''(a)}{2!}(x-a)^2 + \cdots$.

And if we let $f(x) = e^x$ and $a = 0$ then $f^{(k)}(x) = e^x$, so $e^x = e^0 + \frac {e^0}{1!}(x-0) + \frac {e^0}{2!}(x-0)^2+\cdots= 1 + \frac 1{1!}x + \frac 1{2!}x^2 + \frac 1{3!}x^3 +\cdots$

And if we take $x = 1$ we have $e =e^1 = \frac 1{0!} + \frac 1{1!} + \frac 1{2!} + \cdots$
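As a closing numerical check (an added sketch, not part of the argument above), the partial sums of $\sum 1/k!$ reach machine-precision agreement with $e$ after only a dozen or so terms:

```python
import math

# Accumulate the partial sums of 1/0! + 1/1! + 1/2! + ...
total = 0.0
for k in range(18):
    total += 1 / math.factorial(k)

print(total)   # partial sum through k = 17
print(math.e)  # math.e = 2.718281828459045
```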

fleablood