
Assume the standard situation, that is, let $X_1, \ldots , X_n$ be independent and identically distributed with $X_k \sim p(x;\theta)$, where $p(x;\theta) = 2x/\theta^2$ if $0\le x \le\theta$ and $0$ otherwise.

It is required to estimate $\theta$. Show that the maximum likelihood estimator for $\theta$ is $\hat{\theta} = \max\{X_1, \ldots , X_n\}$ and then show that the cumulative distribution function of $\hat{\theta}$ is $F_\theta(z) = z^{2n}/\theta^{2n}$.

Here's what I did so far:

Maximum likelihood estimator: $L_x(\theta) = \prod_{i=1}^{n} p(x_i;\theta)$, since by independence the joint density factorises: $p(x_1,\ldots,x_n;\theta) = p(x_1;\theta)\,p(x_2;\theta)\cdots p(x_n;\theta)$.

Likelihood: $L_x(\theta) = \dfrac{2x_1}{\theta^2}\cdot\dfrac{2x_2}{\theta^2}\cdots\dfrac{2x_n}{\theta^2} = \dfrac{2^n\prod_{i=1}^{n}x_i}{\theta^{2n}}$

Log-likelihood: $\log L_x(\theta) = \sum_{i=1}^{n}\log\dfrac{2x_i}{\theta^2} = n\log 2 + \sum_{i=1}^{n}\log x_i - 2n\log\theta$

Is this correct so far? I'm still not sure how to get to $\hat{\theta} = \max\{X_1, \ldots , X_n\}$.

As for the cumulative distribution part, to show $F_\theta(z) = z^{2n}/\theta^{2n}$:

$F(z) = P(\max_k\{x_k\}<z) = P(x_1<z)\cdot P(x_2<z)\cdots P(x_n<z) = \dfrac{2x_1}{\theta^2}\cdot\dfrac{2x_2}{\theta^2}\cdots\dfrac{2x_n}{\theta^2} = \dfrac{2^n\prod_{i=1}^{n}x_i}{\theta^{2n}}$

Not sure if this is correct. Would really appreciate some help.

Edit: From the answers below, we can deduce that the estimator is biased. What estimator would be unbiased, and how can I find it?

    Argue that the likelihood is a decreasing function of $\theta$, so it is maximised for the minimum possible value of $\theta$. For the second part, you have the right idea but the probabilities are not correct. – StubbornAtom Feb 14 '19 at 19:52
  • https://math.stackexchange.com/q/383587/321264 – StubbornAtom Jan 07 '21 at 18:27

1 Answer

Given a sample $x\equiv \{x_i\}_{i=1}^n$, the likelihood is $$ L(\theta\mid x)=\left(\frac{2}{\theta^{2}}\right)^n\prod_{i=1}^n x_i \times1\{\theta\ge M(x),m(x)\ge 0\}, $$ where $M(x):=\max_{1\le i\le n}x_i$ and $m(x):=\min_{1\le i\le n}x_i$. The indicator suggests that $\hat{\theta}_n(x)\ge M(x)$ ($\because$ $L=0$ otherwise). However, taking values larger than $M(x)$ decreases $L$ because of the first term (assuming that $m(x)> 0$). Thus, $\hat{\theta}_n(x)= M(x)$.
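This argument can be checked numerically (a Python sketch, with an arbitrary choice of $\theta$ and $n$; sampling uses the inverse CDF $F^{-1}(u)=\theta\sqrt{u}$): the likelihood is zero for $\theta < M(x)$ and strictly decreasing for $\theta \ge M(x)$, so its maximum sits exactly at the sample maximum.

```python
import random

random.seed(0)
theta_true = 2.0
n = 5
# Draw from the density 2x/theta^2 on [0, theta] via the inverse CDF:
# F(x) = x^2/theta^2, so F^{-1}(u) = theta * sqrt(u).
xs = [theta_true * random.random() ** 0.5 for _ in range(n)]

def likelihood(theta, xs):
    # L(theta | x) = (2/theta^2)^n * prod(x_i) when theta >= max(x), else 0
    if theta < max(xs):
        return 0.0
    prod = 1.0
    for x in xs:
        prod *= x
    return (2.0 / theta ** 2) ** n * prod

M = max(xs)
# Evaluate L on a grid starting at the sample maximum: it should be
# strictly decreasing from theta = M onward.
grid = [M + 0.05 * k for k in range(40)]
vals = [likelihood(t, xs) for t in grid]
```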


As for the distribution of $\hat{\theta}_n$, for $z\in [0,\theta]$, $$ F(z)=\mathsf{P}(\hat{\theta}_n\le z)=\prod_{i=1}^n\mathsf{P}(X_i\le z)=\prod_{i=1}^n \left(\frac{z}{\theta}\right)^{2}=\left(\frac{z}{\theta}\right)^{2n}. $$
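A quick Monte Carlo check of this CDF (a sketch; the values $\theta=1$, $n=4$, $z=0.8$ and the number of trials are arbitrary): the empirical frequency of $\{\hat{\theta}_n \le z\}$ should be close to $(z/\theta)^{2n}$.

```python
import random

random.seed(1)
theta, n, trials = 1.0, 4, 200_000
z = 0.8

# Count how often the sample maximum falls at or below z;
# each X_i is drawn via the inverse CDF theta * sqrt(U).
hits = sum(
    1
    for _ in range(trials)
    if max(theta * random.random() ** 0.5 for _ in range(n)) <= z
)
empirical = hits / trials
exact = (z / theta) ** (2 * n)  # z^{2n} / theta^{2n}
```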


Since $\mathsf{E}X_i=2\theta/3$, examples of an unbiased estimator are $$ \hat{\theta}_n'=\frac{3}{2}\times \frac{1}{n}\sum_{i=1}^n x_i, \quad \hat{\theta}_n''=\frac{2n+1}{2n}\hat{\theta}_n. $$
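Both corrections can be verified by simulation (a sketch with arbitrary $\theta=3$, $n=5$): the raw MLE averages to roughly $2n\theta/(2n+1) < \theta$, while $\hat{\theta}_n'$ and $\hat{\theta}_n''$ average to roughly $\theta$.

```python
import random

random.seed(2)
theta, n, trials = 3.0, 5, 100_000
sum_max = 0.0
sum_mean = 0.0
for _ in range(trials):
    xs = [theta * random.random() ** 0.5 for _ in range(n)]
    sum_max += max(xs)       # raw MLE for this sample
    sum_mean += sum(xs) / n  # sample mean for this sample

mle_avg = sum_max / trials                   # approx 2n*theta/(2n+1), below theta
unbiased1 = 1.5 * (sum_mean / trials)        # (3/2) * sample mean, uses E X_i = 2*theta/3
unbiased2 = (2 * n + 1) / (2 * n) * mle_avg  # bias-corrected MLE
```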

  • Very helpful. Thank you – Neels Feb 14 '19 at 20:17
  • I have another question if you don't mind answering: how can it be deduced that the estimator is biased, and what would be an unbiased estimator (how do I find it)? – Neels Feb 14 '19 at 21:06
  • The first part I'm assuming is quite obvious; it's simply that $M(x)$ does not equal $\theta$. Is this correct? – Neels Feb 14 '19 at 21:06
  • Using the expression for $F(z)$: $$ \mathsf{E}\hat{\theta}_n=\int_0^{\theta}(1-F(z))dz=\frac{2n\theta}{2n+1}\ne\theta. $$ However, it's asymptotically unbiased since $\mathsf{E}\hat{\theta}_n\to \theta$. –  Feb 14 '19 at 21:09
  • Thanks a lot. May I ask, is there a formula that says $\mathsf{E}(\hat{\theta}_n) = \int(1-F(z))\,dz$? And is this the standard method to find unbiased estimators? – Neels Feb 14 '19 at 21:13
  • https://math.stackexchange.com/questions/172841/explain-why-ex-int-0-infty-1-f-x-t-dt-for-every-nonnegative-rando –  Feb 14 '19 at 21:16
  • It is a method to find the expectation of $X$ given its cdf. –  Feb 14 '19 at 21:24
  • Got it. Thanks a lot. Really appreciate the help! – Neels Feb 14 '19 at 21:40