I have a sum $Y=\sum_{i=1}^{\infty}(X_i-t)\,u(X_i-t)$, where the $X_i$ are i.i.d. exponential random variables with parameter $\lambda$, $t$ is a constant, and $u(\cdot)$ is the unit step function. I want to know how many terms, on average, are required in the sum so that the sum is at least $L$. For this I can use the following two strategies.
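To make "number of terms required" precise (this is my reading, assuming the terms are added in the order $i=1,2,\dots$), let $$K=\min\Big\{n\ge 1:\ \sum_{i=1}^{n}(X_i-t)\,u(X_i-t)\ \ge\ L\Big\},$$ so that the quantity I am after is $\overline{K}=E[K]$.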
First Strategy:
I find the probability that the number of terms required is $k$, for $k=1,2,3,\dots$, and then use the formula $$\overline{K}=\sum_{k=1}^{\infty}k\,\Pr(\text{number of terms required is } k)=\sum_{k=1}^{\infty}k\,\Pr(K=k).$$
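For a purely numerical check of this formula, here is a minimal Monte Carlo sketch in Python (the values $\lambda=1$, $t=0.5$, $L=10$, the number of runs, and the helper name `terms_needed` are assumptions for illustration, not part of the question) that estimates $\Pr(K=k)$ from repeated runs and then evaluates the sum:

```python
import random
from collections import Counter

# Assumed example values (not part of the original question)
LAM = 1.0       # rate parameter lambda of the exponential X_i
T = 0.5         # threshold t
L = 10.0        # level the running sum must reach
N_RUNS = 100_000

def terms_needed(lam, t, level):
    """Count the terms (X_i - t) u(X_i - t), added in order i = 1, 2, ...,
    until the running sum first reaches `level`."""
    total, k = 0.0, 0
    while total < level:
        x = random.expovariate(lam)   # X_i ~ Exp(lam)
        total += max(x - t, 0.0)      # (X_i - t) u(X_i - t)
        k += 1
    return k

counts = Counter(terms_needed(LAM, T, L) for _ in range(N_RUNS))

# Strategy 1 evaluated numerically: sum over k of k * Pr(K = k), with the
# probabilities replaced by their empirical estimates (this is simply the
# sample mean of K).
k_bar = sum(k * n for k, n in counts.items()) / N_RUNS
print("estimated average number of terms:", k_bar)
```

Note that in this sketch every draw counts as a term, including those with $X_i \le t$ that contribute $0$ to the sum; that is how I read the sum as written.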
As computing these probabilities analytically can be complex, I show another strategy below.
Second Strategy:
I find the average of an individual term (which is the same for every term because of the i.i.d. assumption) and then divide $L$ by that average. Hence my final answer looks like $$\overline{K}=\frac{L}{\mu},$$ where $\mu$ is the average of $X_i-t$ given that $X_i>t$. I want to know whether this strategy is right and whether the answer $\overline{K}=\frac{L}{\mu}$ is correct.
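For reference, by the memoryless property of the exponential distribution, $$\mu=E[X_i-t\mid X_i>t]=E[X_i]=\frac{1}{\lambda},$$ so this strategy would evaluate to $\overline{K}=\frac{L}{\mu}=\lambda L$. What I am unsure about is whether dividing by this conditional average is valid, given that the terms with $X_i\le t$ contribute $0$ to the sum but still count as terms.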
Thanks in advance.