Problem: Let $X_1,...,X_n$ be a sample from a distribution with CDF
$$F(x)=1-\frac{1}{(1+x)^\mu}, \quad x>0, \quad \mu > 0.$$
Find the maximum likelihood estimator $\hat{\mu}$ of $\mu.$ Also, determine whether this estimator is unbiased.
I have that
$$\hat{\mu}=\frac{n}{\sum_{k=1}^n\ln(1+X_k)}.$$
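For reference, here is how the estimator drops out of the likelihood: the density is $f(x;\mu)=F'(x)=\frac{\mu}{(1+x)^{\mu+1}}$ for $x>0,$ so the log-likelihood is
$$\ell(\mu)=n\ln\mu-(\mu+1)\sum_{k=1}^n\ln(1+X_k),$$
and solving $\ell'(\mu)=\frac{n}{\mu}-\sum_{k=1}^n\ln(1+X_k)=0$ gives the expression above; since $\ell''(\mu)=-\frac{n}{\mu^2}<0,$ this critical point is indeed a maximum.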
This matches the answer given in the solution. For $\hat{\mu}$ to be an unbiased estimator, it must be the case that $E[\hat{\mu}]=\mu.$ The solution argues as follows:
It is not an unbiased estimator. To show this, we can let $n=\mu=1.$ We then have
$$E[\hat{\mu}]=\int_0^{\infty}P(X>x)\ dx=\int_0^{\infty}\frac{1}{1+x} \ dx=\infty\neq\mu.$$
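If the step being invoked here is the tail-sum formula for a nonnegative random variable $Y,$
$$E[Y]=\int_0^{\infty}P(Y>x)\ dx,$$
then I would expect it to be applied with $Y=\hat{\mu},$ producing $P(\hat{\mu}>x)$ rather than $P(X>x).$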
But I thought that, by LOTUS (with $n=\mu=1$ the density is $f(x)=\frac{1}{(1+x)^2}$),
$$E[\hat{\mu}]=E\left[\frac{1}{\sum_{k=1}^1\ln(1+X_k)}\right]=E\left[\frac{1}{\ln{(1+X_1)}}\right]=\int_0^{\infty}\frac{1}{\ln{(1+x)}}\cdot\frac{1}{(1+x)^2} \ dx.$$
Where does he get $P(X>x)$ all of a sudden?
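As a sanity check of my own (not part of the quoted solution), a quick Monte Carlo simulation in Python supports the divergence: with $\mu=1,$ inverse-transform sampling gives $X=(1-U)/U$ for $U\sim\mathrm{Unif}(0,1),$ and the sample mean of $\hat{\mu}=1/\ln(1+X)$ keeps drifting upward as the sample size grows instead of settling near $\mu=1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# With mu = 1 we have F(x) = 1 - 1/(1+x), so F^{-1}(p) = p/(1-p).
# Since 1-U is also Uniform(0,1), X = (1-U)/U has the target law.
for m in (10**4, 10**5, 10**6, 10**7):
    u = rng.uniform(size=m)
    x = (1.0 - u) / u
    mu_hat = 1.0 / np.log1p(x)  # hat{mu} for a sample of size n = 1
    print(f"m = {m:>8}: average of mu_hat over m draws = {mu_hat.mean():.2f}")

# The averages grow (roughly like log m) rather than converging,
# consistent with E[hat{mu}] being infinite.
```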