Joriki's method in that other link is interesting; I would have approached it differently (via Wald), and (hopefully?) that would also give an exact expression for the exponential case. For the problem at hand, a Wald-based approach is as follows:
Suppose that $\{Y_i\}_{i=1}^{\infty}$ are iid, nonnegative, and have finite expectation $E[Y]>0$. Define $N$ as the smallest time at which the sum of the $Y_i$ values meets or exceeds a positive threshold $C$, so that $\sum_{i=1}^N Y_i \geq C$. Define
$$Overshoot = \sum_{i=1}^N Y_i - C$$
Then $Overshoot \geq 0$ and
$$ E\left[\sum_{i=1}^N Y_i \right] = C + E[Overshoot] $$
On the other hand, we can write $\sum_{i=1}^N Y_i = \sum_{i=1}^{\infty} Y_iI_i$ for indicator functions $I_i=1\{i \leq N\}$, so the indicators $I_i$ keep only the terms that are actually used. Then, by a Wald-type technique:
\begin{align}
E\left[\sum_{i=1}^N Y_i\right] &=E\left[\sum_{i=1}^{\infty}Y_iI_i\right]\\
&\overset{(a)}{=}\sum_{i=1}^{\infty}E\left[Y_iI_i \right]\\
&\overset{(b)}{=}\sum_{i=1}^{\infty}E[Y_i]E[I_i] \\
&=E[Y]\sum_{i=1}^{\infty}E[I_i]\\
&= E[Y]\sum_{i=1}^{\infty}P[i\leq N]\\
&\overset{(c)}{=}E[Y]E[N]
\end{align}
where (a) holds because we can always pass expectations through an infinite sum of nonnegative random variables (monotone convergence); (b) holds by the Wald observation that $I_i = 1\{N \geq i\}$ is determined by $Y_1, \dots, Y_{i-1}$ alone and so is independent of $Y_i$; (c) holds by the tail-sum identity $E[N] = \sum_{i=1}^{\infty} P[N \geq i]$ for positive integer-valued random variables $N$.
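For completeness, the tail-sum identity used in (c) follows by swapping the order of summation over the nonnegative terms:
$$ \sum_{i=1}^{\infty}P[i \leq N] = \sum_{i=1}^{\infty}\sum_{n=i}^{\infty}P[N=n] = \sum_{n=1}^{\infty}\sum_{i=1}^{n}P[N=n] = \sum_{n=1}^{\infty}n\,P[N=n] = E[N] $$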
Equating these two different expressions for $E[\sum_{i=1}^NY_i]$ gives:
$$ E[N] = \frac{C + E[Overshoot]}{E[Y]} $$
This holds for any iid nonnegative random variables $Y_i$ with $E[Y]>0$, which includes your case, and $E[Y]$ is easy to compute there. The only difficulty is computing $E[Overshoot]$, but you could perhaps bound it above and/or below by considering the worst-case distribution on the last step.
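As a sanity check on this identity, here is a minimal Monte Carlo sketch in Python. The choice $Y_i \sim \text{Uniform}(0,1)$ and threshold $C=10$ are purely illustrative assumptions (not part of your problem), and the function name is hypothetical. It estimates $E[N]$ directly and compares it with $(C + E[Overshoot])/E[Y]$, where $E[Overshoot]$ is estimated from the same runs:

```python
import random

def stopping_time_stats(sample_y, mean_y, C, trials=100_000):
    """Estimate E[N] and E[Overshoot] for the rule: stop at the first N
    with Y_1 + ... + Y_N >= C, then form (C + E[Overshoot]) / E[Y]."""
    total_n = 0.0
    total_overshoot = 0.0
    for _ in range(trials):
        s, n = 0.0, 0
        while s < C:
            s += sample_y()
            n += 1
        total_n += n
        total_overshoot += s - C          # overshoot = sum - C >= 0
    est_n = total_n / trials
    est_overshoot = total_overshoot / trials
    return est_n, (C + est_overshoot) / mean_y

# Illustrative example: Y_i ~ Uniform(0, 1), so E[Y] = 0.5, with threshold C = 10.
simulated, predicted = stopping_time_stats(random.random, 0.5, 10.0)
print("simulated E[N]:           ", simulated)
print("(C + E[Overshoot]) / E[Y]:", predicted)
```

The two printed values should agree up to Monte Carlo error.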
Using the above technique on the problem from the previous link you gave, I get a slightly different answer (I suspect that previous answer had a minor mistake because the constant $c$ used there should have depended on time, i.e., $c(t)$). That link was:
Average number of terms required in a sum of exponential variables to reach a specific limit
There, it looks like we have $Y_i = \max[X_i-t, 0]$ for $X_i$ exponential with rate $\lambda$. So:
$$ E[Y] = \int_{t}^{\infty} (x-t)\lambda e^{-\lambda x}dx = \frac{e^{-\lambda t}}{\lambda}$$
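To see the value of the integral, substitute $u = x - t$, which factors out $e^{-\lambda t}$ and leaves the mean of an Exponential($\lambda$) variable:
$$ \int_{t}^{\infty}(x-t)\lambda e^{-\lambda x}\,dx = e^{-\lambda t}\int_{0}^{\infty}u\,\lambda e^{-\lambda u}\,du = e^{-\lambda t}\cdot\frac{1}{\lambda} $$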
By memorylessness of the exponential, the amount by which the crossing step exceeds the threshold is again exponential with rate $\lambda$, so
$$ E[Overshoot] = 1/\lambda $$
Then, using threshold $C=L$:
$$E[N] = \frac{L+E[Overshoot]}{E[Y]} = (L + 1/\lambda)\lambda e^{\lambda t} = L\lambda e^{\lambda t} + e^{\lambda t} $$
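For what it's worth, a short simulation sketch of this specific setup, with arbitrary illustrative parameters $\lambda = 1$, $t = 0.5$, $L = 5$ (not taken from the original question), agrees with the formula above:

```python
import math
import random

def simulate_mean_n(lam, t, L, trials=100_000):
    """Estimate E[N] when Y_i = max(X_i - t, 0) with X_i ~ Exponential(lam),
    stopping at the first N whose running sum reaches L."""
    total_n = 0
    for _ in range(trials):
        s, n = 0.0, 0
        while s < L:
            s += max(random.expovariate(lam) - t, 0.0)
            n += 1
        total_n += n
    return total_n / trials

lam, t, L = 1.0, 0.5, 5.0   # arbitrary illustrative values
print("simulated E[N]:", simulate_mean_n(lam, t, L))
print("formula       :", L * lam * math.exp(lam * t) + math.exp(lam * t))
```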
Edit: Joriki has fixed his answer and indeed gets the final "constant term" as $c=e^{\lambda t}$, consistent with my answer.