
Consider a random sample $x_1, x_2, \ldots, x_n$ from a distribution with PDF

$$f(x) = \begin{cases} \frac{1}{2}(1+\theta x), & -1 \leq x \leq 1\\ 0, & \text{otherwise.} \end{cases} $$

Find the maximum likelihood estimator of $\theta$.

I know that

$$ L(\theta) = f(x_{1};\theta)\cdot f(x_{2};\theta)\cdots f(x_{n};\theta)$$ Therefore, $$ L(\theta) = \frac{1}{2}(1+\theta x_{1})\cdot \frac{1}{2}(1+\theta x_{2})\cdots\frac{1}{2}(1+\theta x_{n}) = \left(\frac{1}{2}\right)^{n}(1+\theta x_{1})(1+\theta x_{2})\cdots(1+\theta x_{n}) $$

Now, how can I handle the remaining product so that I can obtain the MLE of $\theta$?

Thank you for your help.

  • Sorry, c=1/2, I already found that. Correcting this. – cecemelly Mar 23 '15 at 18:39
  • Now take the log-likelihood, which turns the product into a sum: $$\ln L(\theta)=n \ln \tfrac{1}{2}+\sum_{k=1}^{n}\ln(1+\theta x_k)$$ The log-likelihood is maximized at the same $\hat{\theta}$ as the likelihood (by monotonicity of $\ln$), so it suffices to find the argmax of the log-likelihood by differentiating and setting the derivative equal to zero: $$\frac{\partial}{\partial \theta} \ln L(\theta)=\sum_{k=1}^{n}\frac{x_k}{1+\theta x_k}\overset{!}{=}0$$ But I do not know how to proceed from here. – Jimmy R. Mar 23 '15 at 18:45
  • Thank you for your help I will try to proceed from here. – cecemelly Mar 23 '15 at 19:21
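The score equation $\sum_{k} x_k/(1+\theta x_k)=0$ has no closed-form solution for general $n$, so in practice $\hat{\theta}$ is found numerically. Here is a minimal sketch (not from the thread; the sample data is made up) that solves the score equation by bisection, using the fact that the score is strictly decreasing in $\theta$ on $(-1, 1)$, the range where the density stays non-negative for all $x \in [-1,1]$:

```python
# Hypothetical numerical illustration: solve the score equation
#   sum_k x_k / (1 + theta * x_k) = 0
# by bisection. The score is strictly decreasing in theta (its derivative
# is -sum x_k^2/(1+theta*x_k)^2 < 0), so a sign change brackets the root.

def score(theta, xs):
    """Derivative of the log-likelihood with respect to theta."""
    return sum(x / (1 + theta * x) for x in xs)

def mle_theta(xs, lo=-0.999, hi=0.999, tol=1e-10):
    """Bisection for the root of the score on (-1, 1).

    If the score has the same sign at both endpoints, the MLE sits at a
    boundary of the parameter space and this returns a value near lo or hi.
    """
    for _ in range(200):
        mid = (lo + hi) / 2
        if score(mid, xs) > 0:   # root lies to the right (score decreasing)
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

sample = [0.8, -0.5, 0.3, -0.6, -0.1]   # made-up data in [-1, 1]
theta_hat = mle_theta(sample)
```

For this sample the score changes sign on $(-1,1)$, so the bisection converges to an interior stationary point of the log-likelihood.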
