
Suppose that $Y_1,\dots,Y_n$ is a random sample from the density function given by

$$f(y\mid\theta)=\begin{cases}\frac1\theta, & y\in(0,\theta),\\ 0, & \text{otherwise},\end{cases}$$

for some $\theta>0$. Let $\hat\theta$ be the Maximum Likelihood Estimator for $\theta$. Which of the following statements holds true?

  • (A) $\hat\theta=(y_1+\dots+y_n)/n$.
  • (B) $\hat\theta=\min_{i=1,\dots,n} y_i$.
  • (C) $\hat\theta=\max_{i=1,\dots,n} y_i$.
  • (D) $\hat\theta=1/\bar\theta$.
  • (E) None of the above.

The solution says C is correct and gives a brief explanation (plot the likelihood). I tried to work it out by the following method:

$L(y\mid\theta)=\frac{1}{\theta^n}=\theta^{-n}$. Taking logarithms gives $-n\log\theta$, and differentiating this with respect to $\theta$ gives $\frac{-n}{\theta}=0$.

Can my method be used? If not, why not? How should I go about getting the correct answer?

1 Answer


You have to maximize $L$, or equivalently, since $\theta^{-n}$ is decreasing, minimize $\theta$. The expression of $L$ imposes no other condition, so any value of $\theta$ is admissible, provided it is possible given the data.

The least possible value is $\max_i y_i$, since any smaller value would be impossible: you cannot get samples outside of $(0,\theta)$.
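Written out with the support constraint made explicit (a sketch, taking the density on the closed interval $[0,\theta]$, which gives the same distribution), the likelihood is

$$L(\theta\mid y_1,\dots,y_n)=\prod_{i=1}^{n}\frac{1}{\theta}\,\mathbf{1}\{0\le y_i\le\theta\}=\theta^{-n}\,\mathbf{1}\!\left\{\theta\ge\max_i y_i\right\},$$

which is zero for $\theta<\max_i y_i$ and strictly decreasing for $\theta\ge\max_i y_i$, so the maximum is attained at the boundary, $\hat\theta=\max_i y_i$.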

Note that maximizing $L$ does not necessarily mean finding a zero of its derivative. This applies to all optimization problems: it is not uncommon for the optimum to lie on a boundary rather than at an interior critical point.
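As a quick numerical sanity check (a sketch only; the sample size, seed, and grid below are arbitrary illustrative choices, not part of the problem), you can evaluate the likelihood over a grid of candidate values of $\theta$ and confirm that it peaks at the largest observation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (hypothetical values): a sample from Uniform(0, theta_true).
theta_true = 3.0
y = rng.uniform(0.0, theta_true, size=20)

def likelihood(theta, y):
    # L(theta) = theta^(-n) when theta >= max(y_i), and 0 otherwise.
    theta = np.asarray(theta, dtype=float)
    return np.where(theta >= y.max(), theta ** (-len(y)), 0.0)

grid = np.linspace(0.01, 2 * theta_true, 10_000)
L = likelihood(grid, y)

print("max(y)      =", y.max())
print("argmax of L =", grid[np.argmax(L)])  # agrees with max(y) up to grid resolution
```

The argmax lands on the first grid point at or above $\max_i y_i$, which is exactly the boundary behaviour described above.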