
Let $X_1, \dots, X_n$ be iid with density $f(x) = \theta x^{\theta - 1}$, where $0 < x < 1$, $\theta > 0$. Let $Y_i = -\log(X_i)$.

Then it can be shown that $Y_i \sim \text{Exp}(\theta)$ (rate $\theta$). Further, by independence, $W = \sum_{i = 1}^{n} Y_i \sim \text{Gamma}(n, \theta)$.
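
For completeness, one way to see the first claim is via the survival function of $Y_i$:

$$P(Y_i > y) = P\left(X_i < e^{-y}\right) = \int_0^{e^{-y}} \theta x^{\theta - 1}\, dx = e^{-\theta y}, \qquad y > 0,$$

which is the survival function of an $\text{Exp}(\theta)$ distribution; the Gamma claim then follows because a sum of $n$ independent $\text{Exp}(\theta)$ variables is $\text{Gamma}(n, \theta)$.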

I have also already calculated $\hat{\theta} = \frac{-n}{\sum_{i=1}^n \log (X_i)}$.
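
For reference, this $\hat{\theta}$ is the maximum-likelihood estimate (the "ML estimate" mentioned in the answer below): maximizing the log-likelihood gives

$$\ell(\theta) = n \log \theta + (\theta - 1) \sum_{i=1}^{n} \log x_i, \qquad \ell'(\theta) = \frac{n}{\theta} + \sum_{i=1}^{n} \log x_i = 0 \;\Longrightarrow\; \hat{\theta} = \frac{-n}{\sum_{i=1}^{n} \log x_i} = \frac{n}{W}.$$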

How can I use the expression for the moment

$\mathbb{E} [ W^k ] = \frac{ (n+k-1)! }{\theta^k (n-1)!} $, $k > -n$

to obtain an unbiased estimator of $\theta$ of the form $\bar{\theta} = c\, \hat{\theta}$, for some constant $c$?

My idea was:

$\mathbb{E} [\hat{\theta}] = n \cdot \mathbb{E} [ W^{-1} ] = n \cdot \frac{(n-2)!}{\theta^{-1} (n-1)!} = \frac{n\,\theta}{n-1}$, implying that $c = \frac{(n-1)}{n}$.

Reb2000

1 Answer


Your approach, as well as the answer, is perfectly correct. Note that we only need $E(W^{-1})$; in cases where you don't have a general expression for the moments, you can compute the expected value of just the particular moment you need. We can also show directly from Jensen's inequality that the ML estimate is biased: since $t \mapsto 1/t$ is strictly convex on $(0, \infty)$, $\mathbb{E}[\hat{\theta}] = n\, \mathbb{E}[W^{-1}] > \frac{n}{\mathbb{E}[W]} = \theta$.
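
As a quick numerical sanity check (not part of the original derivation), here is a short Monte Carlo sketch; the values $\theta = 2$, $n = 5$, and the replication count are arbitrary illustrative choices. It samples $X_i$ by inverse CDF ($X = U^{1/\theta}$) and compares the average of $\hat{\theta}$ with that of $\bar{\theta} = \frac{n-1}{n}\hat{\theta}$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000

# Sample X_i with density theta * x^(theta - 1) on (0, 1) via inverse CDF: X = U^(1/theta)
x = rng.uniform(size=(reps, n)) ** (1.0 / theta)

w = -np.log(x).sum(axis=1)     # W = sum_i -log(X_i) ~ Gamma(n, rate = theta)
theta_mle = n / w              # hat(theta) = n / W        (biased upward)
theta_bar = (n - 1) / w        # bar(theta) = (n-1)/n * hat(theta)

print("mean of hat(theta):", theta_mle.mean())
print("mean of bar(theta):", theta_bar.mean())
```

With these settings the first average should land near $\theta \cdot \frac{n}{n-1} = 2.5$, while the second should land near $\theta = 2$, in line with the computation above.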

Shiv Tavker