
Question:

Let $X_1,\ldots,X_n$ be a sample from the distribution with density

$p_{\theta}(x)=\theta x^{-2}$, for $x \geq \theta$,

and $p_{\theta}(x)=0$ for $x < \theta$, where $\theta > 0$ is unknown.

Determine the maximum likelihood estimator of $\theta$.

Answer:

We can try the log-likelihood function, taking its derivative and setting it equal to $0$ in order to find a maximum. Doing this gives

$\frac{n}{\theta} = 0$

I have no idea where to go from here since this doesn't have a solution for $\theta$.

Vydai
    Some funny stuff happens because your parameter defines the range. https://en.wikipedia.org/wiki/Maximum_likelihood_estimation#Data_boundary_parameter-dependent – T.J. Gaffney Nov 20 '16 at 23:46
  • This means it has no maximum? The answers say that $X_1$ is what we are looking for. I have no clue why though. – Vydai Nov 20 '16 at 23:49
  • I think you should look for the minimum of $X_i$, rather than $X_1$ – Med Nov 20 '16 at 23:55
  • The solution set probably mentions $X_{(1)}$, not $X_1$. – Did Nov 21 '16 at 00:01
  • @Did You are right. What does $X_{(1)}$ mean in this context? – Vydai Nov 21 '16 at 00:08
  • Your textbook probably explains that each $X_{(k)}$ is the $k$th smallest value in $\{X_1,X_2,\ldots,X_n\}$. Thus, $X_{(1)}$ is the minimum. – Did Nov 21 '16 at 00:11
  • https://math.stackexchange.com/questions/2659520/fx-theta-frac-thetax2-with-x-geq-theta-and-theta0-find-the-m?noredirect=1&lq=1, https://math.stackexchange.com/questions/2949033/maximum-likelihood-when-usual-procedure-doesnt-work?noredirect=1&lq=1 – StubbornAtom Oct 09 '19 at 06:50

2 Answers


We want to maximise the probability

$P(X_1,X_2,...,X_n|\theta)$

By the chain rule of probability,

$P(X_1,X_2,...,X_n|\theta)=P(X_1|\theta)P(X_2|X_1,\theta)\cdots P(X_n|X_1,X_2,...,X_{n-1},\theta)$

Since the observations are independent, this simplifies to

$P(X_1,X_2,...,X_n|\theta)=P(X_1|\theta)P(X_2|\theta)...P(X_n|\theta)$

Taking the logarithm, we get the log-likelihood

$\sum_{i}\log P(X_i|\theta)$

Now, there is one thing to pay attention to. If $\theta$ is chosen to be greater than even one of the $X_i$, then the sum contains a term $\log P(X_i|\theta)=\log 0=-\infty$, according to the probability model that you defined. Therefore, we need $\theta$ to be no greater than every $X_i$. In that case the sum can be written as

$\sum_{i}\log P(X_i|\theta)=\sum_{i}\log (\theta X_i^{-2})=n\log\theta-2\sum_{i}\log X_i$

If there were no constraint on $\theta$, the term $n\log\theta$ would increase without bound and there would be no maximum. But we established that $\theta$ must be at most the minimum of the $X_i$, and since the log-likelihood is increasing in $\theta$ on that range, it is maximised at the boundary: $\hat\theta=\min_i X_i = X_{(1)}$.
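The boundary argument above can be checked numerically. A minimal sketch (assuming NumPy; `theta_true`, the sample size, and the seed are arbitrary choices for illustration), drawing from $p_\theta$ by inverse transform and comparing the log-likelihood at a few candidate values of $\theta$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0  # hypothetical true parameter for the simulation
n = 500

# Inverse-transform sampling: the CDF is F(x) = 1 - theta/x for x >= theta,
# so X = theta / U with U ~ Uniform(0, 1) has density theta * x**(-2).
x = theta_true / rng.uniform(size=n)

def log_likelihood(theta, x):
    # log L(theta) = n*log(theta) - 2*sum(log x_i), valid only when
    # 0 < theta <= min(x_i); otherwise the likelihood is 0.
    if theta <= 0 or theta > x.min():
        return -np.inf
    return len(x) * np.log(theta) - 2 * np.log(x).sum()

mle = x.min()  # the claimed MLE: the sample minimum X_(1)

# The log-likelihood is increasing on (0, min x_i], so the boundary wins,
# and any theta past the minimum has likelihood zero:
for theta in [0.5 * mle, 0.9 * mle, mle, 1.01 * mle]:
    print(f"theta = {theta:.4f}, log L = {log_likelihood(theta, x)}")
```

Every candidate below $X_{(1)}$ scores worse than $X_{(1)}$ itself, and any candidate above it drops to $-\infty$, matching the argument above.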

Med

Your likelihood function is $$ L(\theta) = \theta^n \left[ \prod_{i=1}^n x_i^{-2} \cdot \mathbb{1}_{[x_i \ge \theta]} \right] =\theta^n \cdot \mathbb{1}_{[x_{(1)} \ge \theta]} \left[ \prod_{i=1}^n x_i^{-2} \right] $$ Now, $L(\theta)$ is increasing in $\theta$ as long as $\theta \le x_{(1)}$, and so the MLE is $\hat \theta_{MLE} = X_{(1)}$.
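Since $P(X_i > t \mid \theta) = \theta/t$ for $t \ge \theta$, the estimator $X_{(1)}$ is also consistent: $P(X_{(1)} > \theta+\epsilon) = \big(\theta/(\theta+\epsilon)\big)^n \to 0$. A quick numerical sketch of this (assuming NumPy; `theta` and the seed are arbitrary choices, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 3.0  # hypothetical true parameter

# Draw via inverse transform (X = theta / U, U ~ Uniform(0, 1)) and
# watch the MLE X_(1) tighten onto theta as n grows.
for n in [10, 100, 10_000]:
    x = theta / rng.uniform(size=n)
    print(f"n = {n:6d}, X_(1) = {x.min():.5f}")
```

$X_{(1)}$ always sits at or above $\theta$ (the support constraint), and the gap shrinks roughly like $\theta/n$.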

user365239