
The first part of a question I am trying to solve asks to find the maximum likelihood estimator of $\theta$ for the pdf $f_X(x)=\frac{2x}{\theta^2}$, $0 < x \le \theta$, and $0$ otherwise, where $X_1, X_2, \ldots, X_n$ are iid.

What I've got so far is as follows:

$$L(\theta) = \begin{cases} \dfrac{2^n \prod X_i}{\theta^{2n}}, & \max X_i \le \theta, \\[8pt] 0, & \max X_i > \theta. \end{cases} $$


Since $\frac{2^n \prod X_i}{\theta^{2n}}$ is decreasing as a function of $\theta$, $L(\theta)$ is maximized at $\theta = \max X_i$, i.e., $\hat{\theta} = \max X_i$.

Since $\hat\theta = \max X_i$, its CDF is $$F_{\hat\theta}(x) = P\left(\max X_i \le x\right) = \left[F_X(x)\right]^n = \left(\frac{x^2}{\theta^2}\right)^n, \mbox{ for } 0 < x \le \theta. $$

$$E[\hat{\theta}] = \int_0^\theta x \cdot \frac{2n\,x^{2n-1}}{\theta^{2n}} \, dx = \frac{2n}{2n+1}\, \theta.$$
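As a quick sanity check of this expectation, here is a small simulation sketch (assuming NumPy; the values $\theta = 2$ and $n = 10$ are arbitrary choices):

```python
# Monte Carlo check that E[max X_i] = 2n/(2n+1) * theta when f(x) = 2x/theta^2.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 100_000    # arbitrary example values

# Inverse-transform sampling: F(x) = x^2/theta^2 on (0, theta], so X = theta*sqrt(U).
x = theta * np.sqrt(rng.random((reps, n)))
theta_hat = x.max(axis=1)            # the MLE for each simulated sample

print("simulated mean:", theta_hat.mean())                          # ~1.905
print("theoretical 2n/(2n+1)*theta:", 2 * n / (2 * n + 1) * theta)  # 1.9047...
```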

Now I would like to find the Cramér–Rao lower bound for any unbiased estimator in the above problem. I believe I understand the theory of the CRLB, i.e., what it is doing, but I am having trouble applying it to this problem.

samp1920
  • Your piecewise expression has $<0$ and $>0$ where it should have $<\theta$ and $>\theta$. – Michael Hardy Aug 17 '15 at 15:09
  • You also have $f_x(x)$ and $F_x(x)$ where you need $f_X(x)$ and $F_X(x)$. The distinction between $X$ and $x$ exists for an obvious reason and without it you will get confused. – Michael Hardy Aug 17 '15 at 15:12
  • @MichaelHardy Thank you for pointing out. I have made those edits. – samp1920 Aug 17 '15 at 15:19
  • Your last paragraph abruptly changes the subject. Before your last paragraph you said nothing about unbiased estimators but found the (biased) MLE. – Michael Hardy Aug 17 '15 at 16:03
  • CRLB is not applicable here as the support of the distribution depends on the parameter of interest. – StubbornAtom Apr 28 '21 at 16:12

1 Answer


Assuming $X_i \le \theta$ for all $i$, the log-likelihood is $$ n \log 2 + \sum \log X_i - 2n \log \theta.$$

The Fisher information is the expectation of the squared derivative (with respect to $\theta$) of the log-likelihood. For a single observation the score is $\frac{\partial}{\partial\theta}\log f_X(X_1) = -\frac{2}{\theta}$, so for the sample

$$I_n(\theta) = n\,\mathbb E\!\left[\left(\frac{2}{\theta}\right)^2\right] = \frac{4n}{\theta^2}.$$

Then the C-R lower bound says that the variance (equivalently, the MSE) of any unbiased estimator $T$ is at least as large as the reciprocal of this, i.e. $$\mathrm{Var}(T) \geq \frac{\theta^2}{4n}.$$
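A caveat, echoing the comment under the question: the standard derivation of this bound requires the score to have mean zero, which fails here because the support $(0,\theta]$ depends on $\theta$:

$$\mathbb E\!\left[\frac{\partial}{\partial\theta}\log L(\theta)\right] = -\frac{2n}{\theta} \neq 0.$$

So $\frac{\theta^2}{4n}$ is what the C-R formula gives formally, not a bound that genuinely constrains unbiased estimators in this model.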

If you compute the MSE of an estimator built from $\max X_i$ straight from the definition, you face an $n$-fold integral involving a $\max$, which looks daunting; however, the CDF of $\max X_i$ that you already derived reduces it to a single integral, so it is in fact tractable (see the calculation below). Either way, the C-R figure gives a quick benchmark to compare against, which is the point of this exercise.
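Concretely, using the density $f_{\hat\theta}(x) = \frac{2n\,x^{2n-1}}{\theta^{2n}}$ implied by the CDF in the question, the bias-corrected estimator $\tilde\theta = \frac{2n+1}{2n}\max X_i$ is unbiased, and

$$\mathbb E\big[\hat\theta^{\,2}\big] = \int_0^\theta x^2\,\frac{2n\,x^{2n-1}}{\theta^{2n}}\,dx = \frac{n}{n+1}\,\theta^2, \qquad \mathrm{Var}(\hat\theta) = \frac{n\,\theta^2}{(n+1)(2n+1)^2},$$

so

$$\mathrm{Var}(\tilde\theta) = \left(\frac{2n+1}{2n}\right)^{\!2}\mathrm{Var}(\hat\theta) = \frac{\theta^2}{4n(n+1)} < \frac{\theta^2}{4n},$$

strictly below the formal C-R figure, consistent with the caveat above.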

(In general, if you compute the MSE of an unbiased estimator and find that it equals the C-R bound, then you know that the estimator you have is the 'best' in the sense of minimising the MSE.)
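A quick numerical sanity check of the variance calculation above, as a sketch (assuming NumPy; the values $\theta = 2$ and $n = 10$ are arbitrary choices):

```python
# Monte Carlo check: variance of the bias-corrected estimator
# theta_tilde = (2n+1)/(2n) * max(X_i), versus the exact value
# theta^2/(4n(n+1)) and the formal C-R figure theta^2/(4n).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000    # arbitrary example values

# Inverse-transform sampling: F(x) = x^2/theta^2 on (0, theta], so X = theta*sqrt(U).
x = theta * np.sqrt(rng.random((reps, n)))
theta_tilde = (2 * n + 1) / (2 * n) * x.max(axis=1)

print("simulated Var:", theta_tilde.var())                        # ~0.0091
print("exact theta^2/(4n(n+1)):", theta**2 / (4 * n * (n + 1)))   # 0.00909...
print("formal C-R figure theta^2/(4n):", theta**2 / (4 * n))      # 0.1
```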

Ken Wei