
I understand that we use the maximum/minimum of the $x$'s as the MLE of $\theta$. But no one so far has explained why differentiation won't work. Please explain.

  • For the same reason we cannot use calculus for $U(0,\theta)$ and, in general, whenever the support of the distribution depends on the parameter of interest. See also https://math.stackexchange.com/questions/649678/how-do-you-differentiate-the-likelihood-function-for-the-uniform-distribution-in. – StubbornAtom Oct 06 '18 at 06:54
  • We can use calculus. In elementary math one learns to look at critical points, which include the zeros of the first derivative, if they exist, but are not restricted to them. – Math-fun Oct 07 '18 at 07:46

1 Answer


Differentiation won't work because the likelihood function, as a function of the parameter $\theta$, doesn't achieve its maximum where its derivative is zero.

The density function of the Uniform$[-\theta,\theta]$ distribution is $$ f_\theta(x) =\frac1{2\theta}I(|x|\le\theta), $$ so the likelihood function for a sample $x_1,x_2,\ldots,x_n$ is $$ L(\theta)=\prod_{i=1}^nf_\theta(x_i)=\frac1{(2\theta)^n}\prod_{i=1}^n I(|x_i|\le\theta) =\frac1{(2\theta)^n}I(\theta \ge\max_i |x_i|). $$

As a function of $\theta$, the likelihood is nonzero only on the interval $I:=[\max_i|x_i|,\infty)$. Over this interval the function is strictly decreasing, so its derivative is never zero there; outside of $I$ the function is identically zero, so its maximum doesn't occur there. Thus calculus is no help in finding the maximum of $L$. On the other hand, since $L$ is nonzero and decreasing over $I$, it achieves its maximum at the left endpoint of $I$, namely $\hat\theta:=\max_i |x_i|$, and this is the MLE of $\theta$.
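A quick numerical illustration of the argument above (a sketch in Python/NumPy; the true parameter, sample size, and seed are arbitrary choices for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 3.0                         # hypothetical "true" parameter
x = rng.uniform(-theta_true, theta_true, size=100)

# The MLE is the left endpoint of the interval where L is nonzero.
theta_hat = np.max(np.abs(x))

def log_likelihood(theta, x):
    """Log-likelihood of Uniform[-theta, theta]: -inf if any |x_i| > theta,
    otherwise -n * log(2*theta)."""
    if theta < np.max(np.abs(x)):
        return -np.inf
    return -len(x) * np.log(2 * theta)

# L is zero (log-likelihood -inf) just below theta_hat, and strictly
# decreasing just above it, so the maximum sits at theta_hat itself,
# not at a zero of the derivative.
assert log_likelihood(theta_hat - 0.1, x) == -np.inf
assert log_likelihood(theta_hat, x) > log_likelihood(theta_hat + 0.1, x)
```

Plotting `log_likelihood` over a grid of $\theta$ values shows the same picture: a jump up to the maximum at $\hat\theta=\max_i|x_i|$, then a monotone decline.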

grand_chat