
Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with the following pdf:

$$f(x\mid\theta) = \begin{cases} \dfrac{1}{\theta_2 - \theta_1} & \text{for } \theta_1 \leq x \leq \theta_2, \\ 0 & \text{otherwise.} \end{cases}$$ Suppose that $\theta_1$ and $\theta_2$ are unknown.

How would I go about writing the likelihood function for this distribution in terms of $\theta_1$ and $\theta_2$?

2 Answers


The easiest way might be to begin by writing the density as it should be written, that is, as $$ f(x\mid\theta)=\frac{\mathbf 1_{\theta_1\leqslant x\leqslant\theta_2}}{\theta_2-\theta_1}, $$ where $\theta=(\theta_1,\theta_2)$ with $\theta_1<\theta_2$. Then the likelihood of an i.i.d. sample $\mathbf x=(x_1,\ldots,x_n)$ is $$ f(\mathbf x\mid\theta)=\prod_{k=1}^n f(x_k\mid\theta)=\frac{\mathbf 1_{\theta_1\leqslant m_n(\mathbf x),\;s_n(\mathbf x)\leqslant\theta_2}}{(\theta_2-\theta_1)^n}, $$ where $$ m_n(\mathbf x)=\min\{x_k\mid 1\leqslant k\leqslant n\}, \qquad s_n(\mathbf x)=\max\{x_k\mid 1\leqslant k\leqslant n\}. $$

For every fixed $\mathbf x$, $f(\mathbf x\mid\theta)$ is maximal when the indicator equals $1$ and $\theta_2-\theta_1$ is as small as possible, that is, when $\theta_1$ is as large and $\theta_2$ as small as the constraints $\theta_1\leqslant m_n(\mathbf x)$ and $s_n(\mathbf x)\leqslant\theta_2$ allow; hence the MLE for $\theta=(\theta_1,\theta_2)$ based on $\mathbf x$ is $$ \widehat\theta(\mathbf x)=(m_n(\mathbf x),s_n(\mathbf x)). $$
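As a quick numerical sanity check of this derivation, here is a minimal Python sketch (assuming NumPy; the true parameter values are made up). It simulates a sample, forms the MLE $(m_n(\mathbf x), s_n(\mathbf x))$, and confirms that widening the interval only lowers the likelihood, while shrinking it makes the likelihood vanish:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters and a simulated sample.
theta1, theta2 = 2.0, 5.0
x = rng.uniform(theta1, theta2, size=100)

def log_likelihood(t1, t2, x):
    """Log-likelihood of Uniform(t1, t2); -inf when some x_k falls outside [t1, t2]."""
    if t1 < t2 and t1 <= x.min() and x.max() <= t2:
        return -len(x) * np.log(t2 - t1)
    return -np.inf

# The MLE derived above: the sample minimum and maximum.
t1_hat, t2_hat = x.min(), x.max()
print(t1_hat, t2_hat)  # close to (2.0, 5.0), approached from inside the interval

# Widening the interval strictly lowers the likelihood; shrinking it kills it.
assert log_likelihood(t1_hat, t2_hat, x) > log_likelihood(t1_hat - 0.1, t2_hat + 0.1, x)
assert log_likelihood(t1_hat + 0.1, t2_hat, x) == -np.inf
```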

Did
  • Would the max be the MLE for $\theta_2$? – cheeseman123 Nov 10 '12 at 17:59
  • See Edit. – Did Nov 10 '12 at 18:28
  • I think I got you. Just to confirm, in my notation, would it then be valid to write that the MLE for $\theta_1$ is $\widehat\theta_1 = \max\{x_k\mid 1\leqslant k\leqslant n\}$? – cheeseman123 Nov 10 '12 at 18:38
  • In general, the MLE is an estimator of the full parameter of the distribution, here $\theta$. The notion of an MLE for a sub-parameter such as $\theta_1$ is not well defined. What one can say, though, is that the $\theta_1$ part of the MLE is the minimum (and certainly not the maximum) of the sample. An additional subtlety here is that for every fixed $\theta_2$, the same value of $\theta_1$ yields the maximum (restricted) likelihood. – Did Nov 10 '12 at 19:16
  • I'm sorry for prolonging this. I copied and pasted the wrong bit of what you typed, as I was checking how you type it the way you do; I meant min instead of max. So, the MLE for $\theta_1$ is $\widehat\theta_1 = \min\{x_k\mid 1\leqslant k\leqslant n\}$.

    On the next point, I don't think $\theta_2 = \theta_1$

    – cheeseman123 Nov 10 '12 at 22:28
  • ... On the next point, I don't think $\theta_2 = \theta_1$ is valid, but I've probably misunderstood what you wrote, so if you don't mind, just explain that bit again. – cheeseman123 Nov 10 '12 at 22:37
  • ?? How one can extract from what I wrote anything resembling the assertion that $\theta_2=\theta_1$, in any sense, is beyond me. – Did Nov 10 '12 at 22:40
  • I'm confused about the following: "for every fixed $ \theta_2 $, the same value of $ \theta_1 $ yields the maximum (restricted) likelihood." – cheeseman123 Nov 10 '12 at 22:49
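To make the last point in the comments concrete, here is a minimal Python sketch (assuming NumPy; the data are simulated with made-up parameters). For each fixed $\theta_2 \geq \max x_k$, scanning the restricted likelihood over $\theta_1$ finds the same maximizer, namely $\theta_1 = \min x_k$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(2.0, 5.0, size=50)   # simulated data, made-up parameters
n, m, s = len(x), x.min(), x.max()

def likelihood(t1, t2):
    # Positive only when t1 <= min(x) and max(x) <= t2, as in the answer above.
    return (t2 - t1) ** (-n) if (t1 < t2 and t1 <= m and s <= t2) else 0.0

# For each fixed theta2 >= max(x), scan theta1 for the maximizer.
for t2 in (s, s + 1.0, s + 5.0):
    grid = np.linspace(m - 2.0, m, 401)          # candidate theta1 values
    best_t1 = grid[np.argmax([likelihood(t1, t2) for t1 in grid])]
    print(f"theta2 = {t2:.3f} -> best theta1 = {best_t1:.3f} (min(x) = {m:.3f})")
```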

$$ L(\theta_1,\theta_2) = \begin{cases} \frac{1}{(\theta_2-\theta_1)^n} & \text{for }\theta_2\ge\max\text{ and }\theta_1 \le\min \\[10pt] 0 & \text{for other values of }(\theta_1,\theta_2) \end{cases} $$ where $\max=\max\{X_1,\ldots,X_n\}$ and $\min=\min\{X_1,\ldots,X_n\}$.

Draw the picture in the $(\theta_1,\theta_2)$ plane: the likelihood is positive exactly on the quadrant where $\theta_1\le\min$ and $\theta_2\ge\max$, and it increases toward the corner $(\min,\max)$, which is therefore the MLE.
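For a numerical version of that picture, here is a minimal Python sketch (assuming NumPy and Matplotlib; the sample is simulated with made-up parameters). It evaluates $L$ on a grid of $(\theta_1,\theta_2)$ values and plots it, showing the support quadrant with the MLE at its corner:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = rng.uniform(2.0, 5.0, size=20)   # simulated data, made-up parameters
n, m, s = len(x), x.min(), x.max()

# Likelihood on a grid of (theta1, theta2); zero outside the feasible quadrant.
t1 = np.linspace(0.0, 6.0, 300)
t2 = np.linspace(1.0, 7.0, 300)
T1, T2 = np.meshgrid(t1, t2)
L = np.zeros_like(T1)
ok = (T1 <= m) & (T2 >= s)           # here T1 < T2 automatically, since m < s
L[ok] = (T2[ok] - T1[ok]) ** float(-n)

plt.pcolormesh(T1, T2, L, shading="auto")
plt.plot(m, s, "r.", markersize=10)  # the MLE sits at the corner (min, max)
plt.xlabel(r"$\theta_1$")
plt.ylabel(r"$\theta_2$")
plt.title("Likelihood is positive on the upper-left quadrant of (min, max)")
plt.show()
```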