36

Let $X_1, \ldots, X_n$ be a sample of independent random variables with uniform distribution on $(0, \theta)$. Find an estimator $\widehat\theta$ for $\theta$ using the maximum likelihood estimation method, better known as MLE.

Daniel
  • 3,053
  • 1
    If you want to find the maximum likelihood estimate, you first need to derive the likelihood. Did you get that far? Here is a primer: http://en.wikipedia.org/wiki/Maximum_likelihood_estimator – Emre Jul 05 '11 at 04:57
  • 3
    You asked this question for the method of moments, but you wanted the MLE. I am assuming in that time you've come up with something... surely... what have you tried? What is your effort? I'll write something that will guide you, but I don't want to just write the solution. – mathmath8128 Jul 05 '11 at 04:59
  • The following video really helped me: https://www.youtube.com/watch?v=XaAtkCzdjLE – Dor Aug 31 '15 at 18:06
  • 2
    I see no reason why this question is off-topic. – Yaroslav Nikitenko Mar 30 '21 at 19:53

2 Answers

72

First note that $f\left(x\mid\theta\right)=\frac{1}{\theta}$, for $0\leq x\leq\theta$ and $0$ elsewhere.

Let $x_{\left(1\right)}\leq x_{\left(2\right)}\leq\cdots\leq x_{\left(n\right)}$ be the order statistics. Then it is easy to see that the likelihood function is given by $$L\left(\theta|{\bf x}\right) = \prod^n_{i=1}\frac{1}{\theta}=\theta^{-n}\,\,\,\,\,(*)$$ for $0\leq x_{(1)}$ and $\theta \geq x_{(n)}$ and $0$ elsewhere.
Now taking the derivative of the log-likelihood with respect to $\theta$ gives:

$$\frac{\text{d}\ln L\left(\theta|{\bf x}\right)}{\text{d}\theta}=-\frac{n}{\theta}<0.$$ So we can say that $L\left(\theta|{\bf x}\right)=\theta^{-n}$ is a decreasing function for $\theta\geq x_{\left(n\right)}.$ Using this information and (*) we see that $L\left(\theta|{\bf x}\right)$ is maximized at $\theta=x_{\left(n\right)}.$ Hence the maximum likelihood estimator for $\theta$ is given by $$ \hat{\theta}=x_{\left(n\right)}.$$
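The derivation above can be checked numerically: the likelihood $\theta^{-n}$ is decreasing in $\theta$ but requires $\theta \geq x_{(n)}$, so the maximizer is the sample maximum. A minimal Monte Carlo sketch (parameter values $\theta = 5$, $n = 1000$ are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0          # true parameter (unknown in practice)
n = 1000             # sample size

# Draw a sample from Uniform(0, theta)
x = rng.uniform(0.0, theta, size=n)

# The MLE derived above is the largest order statistic x_(n)
theta_mle = x.max()

print(theta_mle)     # close to, but never exceeding, theta
```

Note that $\hat{\theta} = x_{(n)} < \theta$ with probability one, so the MLE systematically underestimates $\theta$ slightly; for large $n$ the gap shrinks, since $x_{(n)} \to \theta$ in probability.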

Nana
  • 8,351
  • 1
    I think you forgot the d theta in the denominator. but good answer! :) – mathmath8128 Jul 05 '11 at 05:41
  • Thanks aengle...its fixed...:) – Nana Jul 05 '11 at 05:50
  • 3
    @Nana Very old question, but still. Isn't there a problem with the endpoints of the given interval? If they were included your solution would be perfectly fine, but they are not. How do you deal with it? – Caran-d'Ache Jun 04 '13 at 17:19
  • I have another question. I got an unbiased estimator $\frac{n+1}{n}X_{(n)}$; if given that $\theta$ is greater than 1, will the estimator be changed? – nanopotato Oct 12 '14 at 17:05
  • 5
    How is differentiating valid here?? – StubbornAtom May 25 '18 at 21:22
  • 1
    https://math.stackexchange.com/questions/649678/how-do-you-differentiate-the-likelihood-function-for-the-uniform-distribution-in?rq=1 – StubbornAtom Oct 05 '18 at 18:07
  • @Nana I think the derivative of the log likelihood in this particular case is in fact $- \frac{n^2}{\theta}$ if my calculations are correct. I think you forgot that: $ln(L(\theta | x)) = \sum_{k=1}^n ln((\frac{1}{\theta})^n) = \sum_{k=1}^n -n ln(\theta) = -n^2 ln(\theta)$. –  Jan 05 '20 at 06:58
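One comment above observes that the MLE is biased and that $\frac{n+1}{n}X_{(n)}$ is unbiased. This follows from $E[X_{(n)}] = \frac{n}{n+1}\theta$ for the maximum of $n$ uniforms, and can be verified by simulation (the values $\theta = 5$, $n = 10$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 5.0, 10, 100_000

# reps independent samples of size n from Uniform(0, theta)
samples = rng.uniform(0.0, theta, size=(reps, n))

# MLE for each sample: the sample maximum
mle = samples.max(axis=1)

print(mle.mean())                    # approx theta * n/(n+1), i.e. biased low
print(((n + 1) / n * mle).mean())    # approx theta: bias-corrected estimator
```

The correction factor $\frac{n+1}{n}$ does not depend on the value of $\theta$, only on $n$, which is consistent with the comment's question: restricting $\theta > 1$ does not change the estimator.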
11

This example is worked out in detail here (pages 13-14).

Shai Covo
  • 24,077