Suppose you have $n$ i.i.d. samples $X_1, \dots, X_n$ from a Pareto distribution:
$$f(x; \alpha, \lambda) = \begin{cases}\dfrac{\lambda \alpha^\lambda}{x^{\lambda+1}} & x\geq \alpha;\\ 0 & \text{otherwise};\end{cases}$$
You can't derive the MLE of $\alpha$ by setting the derivative of the likelihood to zero. Instead you have to use a "trick" and realise that the likelihood is increasing in $\alpha$ on the allowed range $\alpha \le \min_i X_i$, so it is maximised at the boundary: $\hat\alpha = Y = \min_i X_i.$
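Spelling the argument out: the joint likelihood of the sample is
$$L(\alpha, \lambda) = \prod_{i=1}^n \frac{\lambda \alpha^\lambda}{x_i^{\lambda+1}} = \frac{\lambda^n \alpha^{n\lambda}}{\prod_{i=1}^n x_i^{\lambda+1}}, \qquad \alpha \le \min_i x_i,$$
and since $\alpha^{n\lambda}$ grows with $\alpha$, the likelihood is maximised by taking $\alpha$ as large as the constraint allows.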
Is there a trick to work out the expectation $E[Y]$?
I tried to derive the CDF and PDF of $Y$ and integrate, using an approach similar to here:
Since $Y > x$ exactly when all $n$ observations exceed $x$, we have $1 - F_Y(x) = (1 - F(x))^n = (\alpha/x)^{n\lambda}$, so
$$F_Y(x) = \begin{cases} 0 & x\lt\alpha; \\ 1 - \left(\dfrac{\alpha}{x}\right)^{n\lambda} & x\ge\alpha; \end{cases}$$
$$f_Y(x) = F_Y'(x) = \begin{cases} 0 & x\lt\alpha; \\ \dfrac{n\lambda\, \alpha^{n\lambda}}{x^{n\lambda+1}} & x\ge\alpha; \end{cases}$$
so $Y$ is itself Pareto, with scale $\alpha$ and shape $n\lambda$.
Integrating (the integral converges for $n\lambda > 1$):
$$ E[Y] = \int_\alpha^\infty x f_Y(x)\,dx = n\lambda\,\alpha^{n\lambda} \int_\alpha^\infty x^{-n\lambda}\,dx = \frac{n\lambda\,\alpha}{n\lambda - 1} $$
so $Y$ is not an unbiased estimate of $\alpha$, but it is close: $E[Y] = \frac{n\lambda}{n\lambda - 1}\alpha \to \alpha$ as $n \to \infty$, and scaling $Y$ by $\frac{n\lambda - 1}{n\lambda}$ gives an exactly unbiased estimator.
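As a sanity check, here is a minimal Monte Carlo sketch (plain Python with inverse-CDF sampling; the function names are mine) comparing the simulated mean of $Y = \min_i X_i$ against the closed form $\frac{n\lambda}{n\lambda-1}\alpha$:

```python
import random

def sample_pareto(alpha, lam, rng):
    # Inverse-CDF sampling: F(x) = 1 - (alpha/x)**lam  =>  x = alpha * u**(-1/lam)
    # Use 1 - rng.random() so u lies in (0, 1], avoiding 0**negative.
    u = 1.0 - rng.random()
    return alpha * u ** (-1.0 / lam)

def mean_min(alpha, lam, n, trials, seed=1):
    """Monte Carlo estimate of E[Y], Y = min of n Pareto(alpha, lam) draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(sample_pareto(alpha, lam, rng) for _ in range(n))
    return total / trials

alpha, lam, n = 2.0, 1.5, 10
theory = n * lam * alpha / (n * lam - 1)          # n*lam*alpha / (n*lam - 1)
estimate = mean_min(alpha, lam, n, trials=100_000)
print(f"theory={theory:.4f}  simulated={estimate:.4f}")
```

With $n\lambda = 15$ the bias is small but visible: both numbers come out near $30/14 \approx 2.143$, not $\alpha = 2$.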