With help from Ian and lulu, I will try to answer my own question. After some searching, I learned this idea is known as generating *random bits*. For example, if we want an event of probability $1/2^2$ instead of $1/e$, we can toss a fair coin $n=2$ times to get bits $X_i$, $i=1,\dots,n$, and compute $\sum_{i=1}^n X_i/2^i$. The four equally likely outcomes are $\{0, 0.25, 0.50, 0.75\}$, using exactly $2$ tosses. So the question becomes how to use $\sum_{i=1}^n X_i/2^i$ to approximate $1/e$.
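Here is a quick sketch (variable names are mine) enumerating the $n=2$ case to confirm the four equally likely outcomes:

```python
import itertools

# Enumerate all outcomes of n = 2 fair coin tosses and the value
# sum(X_i / 2**i); each bit pattern, hence each value, is equally likely.
n = 2
outcomes = sorted(
    sum(x / 2**i for i, x in enumerate(bits, start=1))
    for bits in itertools.product([0, 1], repeat=n)
)
print(outcomes)  # [0.0, 0.25, 0.5, 0.75]
```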
As Ian pointed out, the generating process can return an answer as soon as we know whether the number will land in $[0,e^{-1})$ or $[e^{-1},1]$. This works because after $n$ flips the partial sum $S_n=\sum_{i=1}^n X_i/2^i$ can grow by at most $\sum_{i=n+1}^\infty 2^{-i}=2^{-n}$ from the remaining terms, so once the interval $[S_n, S_n+2^{-n}]$ lies entirely on one side of $e^{-1}$, the outcome is decided.
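A minimal sketch of this stopping rule in Python (the function name `bernoulli_from_coin` is my own): flip fair bits until the interval $[S_n, S_n+2^{-n}]$ excludes $p=e^{-1}$, then report which side the number falls on.

```python
import math
import random

def bernoulli_from_coin(p=math.exp(-1)):
    """Build U = 0.X1X2... in binary from fair coin flips; stop as soon as
    the interval [S_n, S_n + 2**-n] containing U lies entirely on one side
    of p. Returns 1 with probability p (the boundary case has probability 0)."""
    s, n = 0.0, 0
    while True:
        n += 1
        s += random.getrandbits(1) / 2**n   # partial sum S_n
        if s >= p:
            return 0                        # U >= S_n >= p for sure
        if s + 2**-n <= p:
            return 1                        # U <= S_n + 2**-n <= p for sure

# Monte Carlo check that the success rate is close to 1/e.
random.seed(0)
trials = 100_000
estimate = sum(bernoulli_from_coin() for _ in range(trials)) / trials
print(estimate, math.exp(-1))  # estimate should be close to 0.3679
```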
In terms of the expected number of tosses: the process stops at the first toss $i$ for which $X_i$ differs from the $i$-th binary digit of $e^{-1}$, because that is exactly the moment the interval $[S_i, S_i+2^{-i}]$ falls entirely on one side of $e^{-1}$. Each toss matches the corresponding digit with probability $1/2$, independently, so the number of tosses $N$ is geometric with success probability $1/2$, and $E[N]=\sum_{i=1}^\infty i/2^i=2$. Please correct me if I am wrong. Thanks~
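A quick Monte Carlo sanity check (the helper `tosses_until_decided` is my own name) that the average number of tosses comes out near $2$:

```python
import math
import random

def tosses_until_decided(p=math.exp(-1)):
    """Count fair-coin tosses until [S_n, S_n + 2**-n] excludes p."""
    s, n = 0.0, 0
    while True:
        n += 1
        s += random.getrandbits(1) / 2**n
        if s >= p or s + 2**-n <= p:
            return n

random.seed(1)
trials = 100_000
avg = sum(tosses_until_decided() for _ in range(trials)) / trials
print(avg)  # should be close to 2
```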