Let $X_1, \dots, X_n$ be iid with density $f(x) = \theta x^{\theta - 1}$ for $0 < x < 1$, where $\theta > 0$, and let $Y_i = -\log(X_i)$.
Then it can be shown that $Y_i \sim \text{Exp}(\theta)$, and hence, by independence, $W = \sum_{i=1}^{n} Y_i \sim \text{Gamma}(n, \theta)$ (shape $n$, rate $\theta$).
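(For reference, the first claim follows from the survival function: for $y > 0$,
$$P(Y_i > y) = P(X_i < e^{-y}) = \int_0^{e^{-y}} \theta x^{\theta - 1}\, dx = e^{-\theta y},$$
which is exactly the $\text{Exp}(\theta)$ tail.)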
I have also already computed the MLE $\hat{\theta} = \frac{-n}{\sum_{i=1}^n \log(X_i)} = \frac{n}{W}$.
How can I use the moment formula
$$\mathbb{E}[W^k] = \frac{\Gamma(n+k)}{\theta^k\, \Gamma(n)}, \qquad k > -n$$
(which equals $\frac{(n+k-1)!}{\theta^k (n-1)!}$ for nonnegative integer $k$; the Gamma form is needed since we will take $k = -1$) to obtain an unbiased estimator of $\theta$ of the form $\bar{\theta} = c\, \hat{\theta}$ for a constant $c$?
My idea was: since $\hat{\theta} = n W^{-1}$,
$$\mathbb{E}[\hat{\theta}] = n\, \mathbb{E}[W^{-1}] = n \cdot \frac{\Gamma(n-1)}{\theta^{-1}\, \Gamma(n)} = \frac{n\, \theta}{n-1},$$
implying that $c = \frac{n-1}{n}$, so that $\bar{\theta} = \frac{n-1}{n}\, \hat{\theta} = \frac{-(n-1)}{\sum_{i=1}^n \log(X_i)}$ is unbiased.
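If it helps, here is a small Monte Carlo sanity check of that conclusion (a minimal sketch in Python assuming NumPy; the values of `theta`, `n`, and `reps` are arbitrary choices, not from the problem):

```python
import numpy as np

# Monte Carlo check that (n-1)/n * MLE is unbiased (hypothetical parameters).
rng = np.random.default_rng(0)
theta, n, reps = 2.5, 10, 200_000

# Sample X ~ f(x) = theta * x^(theta - 1) on (0, 1) by inversion:
# F(x) = x^theta, so F^{-1}(u) = u^(1/theta).
x = rng.random((reps, n)) ** (1.0 / theta)

theta_hat = -n / np.log(x).sum(axis=1)   # MLE
theta_bar = (n - 1) / n * theta_hat      # bias-corrected estimator

print("mean of theta_hat:", theta_hat.mean())  # should be near theta * n/(n-1) = 2.78
print("mean of theta_bar:", theta_bar.mean())  # should be near theta = 2.5
```

The empirical mean of $\hat{\theta}$ should sit near $\theta \cdot \frac{n}{n-1}$, while that of $\bar{\theta}$ should sit near $\theta$ itself, matching the calculation above.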