
Let $\theta\gt 0$ and let $X_1, \dots, X_n$ be independent and identically distributed with probability density function $f_\theta(x) = \frac{1}{2\theta} \chi_{x\in[-\theta,\theta]}$, where $\chi$ is the indicator function. What is the maximum likelihood estimator for $\theta$?

This came up while studying for an exam and I would like some verification on my work:

I compute the likelihood function first: $p_x(\theta)=\prod_{i=1}^nf_\theta(x_i)=\frac{1}{(2\theta)^n}\prod_{i=1}^n\chi_{x_i\in[-\theta,\theta]}=\frac{1}{(2\theta)^n}\chi_{\theta\ge \max|x_i|}$.
In the case $\max|x_i|=0$ the likelihood equals $\frac{1}{(2\theta)^n}$ for every $\theta>0$, which is unbounded as $\theta\to 0$, so it has no maximizer; this case can be ignored since it occurs with probability $0$.
On the other hand, if $\max|x_i|\gt0$ then $p_x(\theta)$ vanishes for $\theta<\max|x_i|$ and is strictly decreasing for $\theta\ge\max|x_i|$, so $\hat\theta=\max|x_i|$ is the unique maximizer and hence the desired estimator.
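The argument above is easy to sanity-check numerically. Here is a minimal sketch (the true $\theta$ and sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0                        # true parameter (assumed for illustration)
x = rng.uniform(-theta, theta, size=100)

# The likelihood is zero unless theta >= max|x_i|, and strictly
# decreasing beyond that point, so the MLE is the smallest feasible value:
theta_hat = np.max(np.abs(x))
print(theta_hat)  # close to, but never above, the true theta
```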

blst

1 Answer


Your answer is correct, and it is simple to verify using ordinary calculus methods. It is usually simpler to express this with the log-likelihood rather than the likelihood, since it is easier to differentiate the former. Given $n$ IID observations from this distribution, the log-likelihood function is:

$$\ell_{\boldsymbol{x}}(\theta) = -n \ln(2 \theta) \quad \quad \text{for }\max |x_i| \leqslant \theta.$$

The corresponding score function is:

$$\frac{d \ell_{\boldsymbol{x}}}{d \theta}(\theta) = - \frac{n}{ \theta} \quad \quad \text{for }\max |x_i| < \theta.$$

Since this is strictly negative for all $\theta >0$, the log-likelihood is a strictly decreasing function of $\theta$, and so the maximum occurs at the smallest allowable value, which is $\hat{\theta} = \max |x_i|$. It can be shown that this estimator is biased, with $\mathbb{E}(\hat{\theta}) = \tfrac{n}{n+1} \cdot \theta$. Hence, it is usual to impose a bias-correction on the MLE to get the estimator $\tilde{\theta} = \tfrac{n+1}{n} \cdot \max |x_i|$.
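Both the bias formula and the correction can be checked by simulation. A rough sketch (the values of $\theta$, $n$, and the replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
theta, n, reps = 3.0, 10, 200_000

# reps independent samples of size n from Uniform(-theta, theta)
x = rng.uniform(-theta, theta, size=(reps, n))

mle = np.abs(x).max(axis=1)        # theta_hat = max |x_i| per sample
corrected = (n + 1) / n * mle      # bias-corrected estimator

print(mle.mean())        # should be near n/(n+1) * theta = 30/11 ~ 2.727
print(corrected.mean())  # should be near theta = 3.0
```

The Monte Carlo means land close to $\tfrac{n}{n+1}\theta$ and $\theta$ respectively, in line with the bias result quoted above.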

Ben