Update: For fixed $n$, the function $f(x)=1-(1-q^x)^n$ is decreasing in $x>0$, so comparing the sum $\epsilon_n=\sum_{k\geq 0}f(k)$ with the integral of $f$, and noting that $f(0)=1$, gives
$$\epsilon_n\geq \int_0^\infty \left[1-(1-q^x)^n\right]dx\geq \epsilon_n-1.$$
Now, with the change of variables $x=w \log(n)$, the integral becomes
$$\int_0^\infty \left[1-(1-q^x)^n\right]dx=\log(n) \int_0^\infty \left[1-\left(1-{1\over n^{w \log(1/q)}}\right)^n\right]dw.$$
As $n\to\infty$, the integrand converges to $1$ if $w < 1/\log(1/q)$ and to $0$ if $w > 1/\log(1/q)$.
Since $1-(1-a)^n\leq na$ for $0\leq a\leq 1$, the integrand is bounded by $\min\{1,\,n^{1-w\log(1/q)}\}\leq \min\{1,\,2^{1-w\log(1/q)}\}$ for all $n\geq 2$, and the latter is integrable. By dominated convergence,
$$\int_0^\infty \left[1-\left(1-{1\over n^{w \log(1/q)}}\right)^n\right]dw\to{1\over \log(1/q)},$$
and we deduce that
$$\epsilon_n\sim {\log(n)\over \log(1/q)}\quad\text{as } n\to\infty.$$
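As a quick sanity check on this asymptotic, here is a small numerical sketch (the function name, the tolerance, and the choice $q=1/2$ are my own, not from the source): it sums the series for $\epsilon_n$ directly and compares it with $\log(n)/\log(1/q)$.

```python
import math

def eps(n, q, tol=1e-12):
    """Sum eps_n = sum_{k>=0} [1 - (1 - q^k)^n], stopping once terms are tiny."""
    total, k = 0.0, 0
    while True:
        term = 1.0 - (1.0 - q ** k) ** n
        if k > 0 and term < tol:
            return total
        total += term
        k += 1

q = 0.5
for n in (10, 100, 1000, 10000):
    print(n, eps(n, q), math.log(n) / math.log(1 / q))
```

For $q=1/2$ the ratio of the two printed columns tends to $1$, consistent with the asymptotic above.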
Start with $n$ independent coins, each with probability $p$ of showing heads.
Toss them all, and set aside those that show heads. Re-toss the remaining
coins, and repeat until all coins show heads.
Your $\epsilon_n$ is the expected number of trials in this experiment.
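For illustration, here is a minimal Monte Carlo sketch of this experiment (the function name, $p=1/2$, and the number of trials are arbitrary choices of mine); averaging the number of rounds over many runs estimates $\epsilon_n$.

```python
import random

def rounds_until_all_heads(n, p):
    """Toss n coins, set aside the heads, re-toss the rest; count the rounds."""
    remaining, rounds = n, 0
    while remaining > 0:
        rounds += 1
        # Each remaining coin shows tails (and must be re-tossed) with probability 1 - p.
        remaining = sum(random.random() >= p for _ in range(remaining))
    return rounds

n, p, trials = 100, 0.5, 20_000
print(sum(rounds_until_all_heads(n, p) for _ in range(trials)) / trials)
```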
Here is an explicit formula:
$$\epsilon_n=\sum_{j\geq 1}{n\choose j}{(-1)^{j+1}\over 1-q^j}=\sum_{k\geq 0}\left[1-(1-q^k)^n\right],$$
where $q=1-p$. Perhaps the asymptotics can be derived from this expression.
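The two expressions can at least be checked against each other numerically. The sketch below is my own; it uses exact rational arithmetic via `fractions.Fraction` to avoid the heavy cancellation in the alternating sum, and evaluates both sides for a small $n$:

```python
from fractions import Fraction
from math import comb

def eps_alternating(n, q):
    # sum_{j>=1} C(n,j) (-1)^{j+1} / (1 - q^j); terms with j > n vanish.
    return sum(comb(n, j) * (-1) ** (j + 1) / (1 - q ** j)
               for j in range(1, n + 1))

def eps_series(n, q, terms=200):
    # sum_{k>=0} [1 - (1 - q^k)^n], truncated; the tail is at most n * q**terms / (1 - q).
    return sum(1 - (1 - q ** k) ** n for k in range(terms))

q, n = Fraction(1, 2), 10
print(float(eps_alternating(n, q)), float(eps_series(n, q)))  # the two values agree
```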
In the reference given below, the author says "A graph of the data in TABLE 2
strongly suggests that E[Y] (i.e., your $\epsilon_n$) is a logarithmic function of $n$."
But no proof is offered.
John Kinney, "Tossing Coins Until All Are Heads", Mathematics Magazine, Vol. 51, No. 3 (May 1978), pp. 184-186.