
Let $X_1, X_2, \ldots, X_n$ represent a random sample from this pdf: $$f(x\mid\theta)= \frac{3x^2}{\theta^3}, \quad 0\leq x\leq \theta$$ (with 0 elsewhere).

Could someone explain how I would find the maximum likelihood estimator (MLE) of $\theta$? I know I have to write down the likelihood function $L$, then the log-likelihood, then differentiate, but I am getting lost in the calculations. Thanks!

  • Please show us your workings. What I'd expect is that the log-likelihood is a decreasing function of $\theta$. Note that part of the log-likelihood is $\log \mathbb{1}\{x \in [0,\theta]^n\}$, so the maximum must lie in $\theta \geq \max_{1 \leq i \leq n}(x_i)$. – fGDu94 Feb 05 '20 at 20:08
  • Look at this similar question: https://math.stackexchange.com/q/3527189/ – NCh Feb 05 '20 at 20:24
  • There are tons of questions here dealing with problems where the support depends on the parameter. Please search the site and you will find that differentiation is not the way to go. – StubbornAtom Feb 05 '20 at 20:36
  • See https://math.stackexchange.com/q/383587/321264, https://math.stackexchange.com/q/1992115/321264 for example. – StubbornAtom Feb 06 '20 at 19:49

1 Answer


$$ \mathcal{L}(\theta; X_1,\dots, X_n) = f(X_1)\cdot\ldots\cdot f(X_n) $$ $$= \begin{cases}\frac{3X_1^2}{\theta^3}, & 0\leq X_1\leq \theta \cr 0, & \text{elsewhere}\end{cases}\;\times\;\ldots\;\times\; \begin{cases}\frac{3X_n^2}{\theta^3}, & 0\leq X_n\leq \theta \cr 0, & \text{elsewhere}\end{cases} $$ $$ =\begin{cases}\frac{3^n X_1^2\cdot\ldots\cdot X_n^2}{\theta^{3n}}, & 0\leq X_1,\ldots,X_n\leq \theta \cr 0, & \text{elsewhere}\end{cases} $$ $$ =\begin{cases}\frac{3^n X_1^2\cdot\ldots\cdot X_n^2}{\theta^{3n}}, & \max(X_1,\ldots,X_n)\leq \theta \cr 0, & \text{elsewhere}\end{cases} $$ $$ =\begin{cases}\frac{3^n X_1^2\cdot\ldots\cdot X_n^2}{\theta^{3n}}, & \theta\geq \max(X_1,\ldots,X_n) \cr 0, & \theta< \max(X_1,\ldots,X_n) \end{cases} $$
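For the log-likelihood route mentioned in the question: on the region $\theta \geq \max(X_1,\ldots,X_n)$ where the likelihood is positive, $$\ell(\theta)=\log\mathcal{L}(\theta; X_1,\dots,X_n)= n\log 3 + 2\sum_{i=1}^{n}\log X_i - 3n\log\theta,$$ and $$\frac{d\ell}{d\theta} = -\frac{3n}{\theta} < 0 \quad\text{for all }\theta>0,$$ so the derivative never vanishes and the log-likelihood is strictly decreasing in $\theta$. The maximum is therefore attained at the boundary of the constraint, not at a stationary point.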

Since $\frac{3^nX_1^2\cdot\ldots\cdot X_n^2}{\theta^{3n}}$ decreases as $\theta$ increases, the highest value of $\mathcal{L}(\theta; X_1,\dots, X_n)$ is attained at the smallest value of $\theta$ satisfying the inequality $\theta\geq \max(X_1,\ldots,X_n)$. So the MLE is $\hat\theta = \max(X_1,\ldots,X_n)$.
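A quick numerical check of this (a minimal sketch, assuming NumPy is available; the sample is drawn by inverting the CDF $F(x)=(x/\theta)^3$, and `theta_true` and `n` are hypothetical simulation settings):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0   # hypothetical true parameter for the simulation
n = 50             # hypothetical sample size

# Inverse-CDF sampling: F(x) = (x/theta)^3 on [0, theta], so X = theta * U^(1/3).
x = theta_true * rng.uniform(size=n) ** (1.0 / 3.0)

def log_likelihood(theta, x):
    """Log-likelihood of theta; -inf when theta < max(x), i.e. outside the support."""
    if theta < x.max():
        return -np.inf
    m = len(x)
    return m * np.log(3.0) + 2.0 * np.sum(np.log(x)) - 3.0 * m * np.log(theta)

theta_hat = x.max()  # the MLE derived above
# Any theta larger than the sample maximum gives a strictly smaller log-likelihood:
print(theta_hat, log_likelihood(theta_hat, x), log_likelihood(theta_hat + 0.1, x))
```

Note that $\hat\theta \leq \theta$ always, so the estimate approaches the true value from below as $n$ grows.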

NCh