
Let $Y_{(n)}=\max(Y_{1},Y_{2},\ldots,Y_{n})$, where $Y_{1},Y_{2},\ldots,Y_{n}$ is a random sample from the uniform distribution on $(0,\theta)$. Find the MVUE for $\theta$.

My approach:

  1. I know that $U=Y_{(n)}$ is a sufficient statistic for $\theta$. I could prove this using the Neyman factorization theorem.
  2. I know that $\displaystyle h(U)=\left(\frac{n+1}{n}\right)U$ satisfies $\mathbb{E}[h(U)]=\theta$.

So, $$\left(\frac{n+1}{n}\right)Y_{(n)}$$ is a (or the?) MVUE for $\theta$.
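A quick Monte Carlo sketch of step 2 (my own addition, not part of the original post; $\theta=5$ and $n=10$ are arbitrary choices): $Y_{(n)}$ alone underestimates $\theta$, while the rescaled estimator $\frac{n+1}{n}Y_{(n)}$ is unbiased.

```python
import random

# Sanity check: E[Y_(n)] = n/(n+1) * theta for a Uniform(0, theta) sample,
# so (n+1)/n * Y_(n) has expectation theta.
random.seed(0)

theta = 5.0   # true parameter (arbitrary choice for the simulation)
n = 10        # sample size (arbitrary choice)
reps = 200_000

max_mean = 0.0
for _ in range(reps):
    y_max = max(random.uniform(0, theta) for _ in range(n))
    max_mean += y_max
max_mean /= reps

print(f"E[Y_(n)]           ~ {max_mean:.3f}  (theory: {n/(n+1)*theta:.3f})")
print(f"E[(n+1)/n Y_(n)]   ~ {(n+1)/n*max_mean:.3f}  (theory: {theta:.3f})")
```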


Is my approach correct? Can I find another MVUE for $\theta$?

Adam Zalcman
  • Rao-Blackwell is a useful tool! – Math-fun Dec 18 '20 at 19:54
  • You can't claim that is the MVUE for $\theta$ unless you can show $Y_{(n)}$ is BOTH complete and sufficient. Once you do, use Lehmann–Scheffé. – Clarinetist Dec 18 '20 at 19:54
  • @Clarinetist But I can prove that $Y_{(n)}$ is sufficient for $\theta$; so is $h(Y_{(n)})=\left(\frac{n+1}{n}\right)Y_{(n)}$ the MVUE for $\theta$, given that $\mathbb{E}h(Y_{(n)})=\theta$, or is this proof not valid? –  Dec 18 '20 at 19:59
  • @Math-fun Can you explain how I can use the Rao–Blackwell theorem here? I read about it, but I don't understand well how to use it. I would appreciate it if you could write an answer with some explanation of that approach. –  Dec 18 '20 at 20:01
  • I was just opting for a comment, not a solution :-) ... I think you should check the Rao–Blackwell theorem. Check also this https://math.stackexchange.com/questions/69398/whats-the-difference-between-rao-blackwell-theorem-and-lehmann-scheff%c3%a9-theorem?rq=1 – Math-fun Dec 18 '20 at 20:10
  • @Math-fun okis :), I will read that link. –  Dec 18 '20 at 20:12
  • Enjoy the process! – Math-fun Dec 18 '20 at 20:13
  • https://math.stackexchange.com/q/2941489/321264 – StubbornAtom Jan 27 '21 at 10:25

1 Answer


As suggested, to confirm that $\frac{n+1}{n}Y_{(n)}$ is the UMVUE, you only have to prove that $Y_{(n)}$ is a CSS (complete and sufficient statistic).

  • Sufficiency is easily proved by observing that

$$L(\theta)=\underbrace{1}_{=h(\mathbf{y})}\times\underbrace{\frac{1}{\theta^n}\mathbb{1}_{(y_{(n)};\infty)}(\theta)}_{g[t(\mathbf{y}),\theta]}$$

  • To prove completeness, since the model does not belong to the exponential family, you must use the definition. That is, you have to prove that, for every measurable $g$ and for all $\theta$,

$$\mathbb{E}_{\theta}g(T)=0\rightarrow \mathbb{P}_{\theta}[g(T)=0]=1$$

In your case you have

$$0=\mathbb{E}_{\theta}g(T)=\int_0^{\theta}g(t)\frac{nt^{n-1}}{\theta^n}dt=\frac{1}{\theta^n}\underbrace{\int_0^{\theta}g(t)nt^{n-1}dt}_{=0}$$

Now take the derivative w.r.t. $\theta$ of the underbraced integral (which is identically zero), obtaining

$$0=\frac{1}{\theta^n}g(\theta)n\theta^{n-1}=\frac{ng(\theta)}{\theta}$$

Thus $\mathbb{E}_{\theta}g(T)=0$ for all $\theta$ forces $g=0$, and $Y_{(n)}$ is complete.

Now you only have to apply the following theorem (Lehmann–Scheffé):

[Image: statement of the Lehmann–Scheffé theorem — an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE]
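As a numerical illustration of "minimum variance" (my own sketch, not from the original answer; $\theta=5$, $n=10$ are arbitrary): both $\frac{n+1}{n}Y_{(n)}$ and the method-of-moments estimator $2\bar{Y}$ are unbiased for $\theta$, but the estimator built from the complete sufficient statistic has visibly smaller variance.

```python
import random

# Compare two unbiased estimators of theta for Uniform(0, theta):
#   UMVUE:  (n+1)/n * max(sample)
#   MoM:    2 * mean(sample)
random.seed(1)

theta, n, reps = 5.0, 10, 100_000
umvue_vals, mm_vals = [], []
for _ in range(reps):
    ys = [random.uniform(0, theta) for _ in range(n)]
    umvue_vals.append((n + 1) / n * max(ys))
    mm_vals.append(2 * sum(ys) / n)

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(f"UMVUE : mean {mean(umvue_vals):.3f}, var {var(umvue_vals):.4f}")
print(f"2*Ybar: mean {mean(mm_vals):.3f}, var {var(mm_vals):.4f}")
```

Theory gives $\operatorname{Var}\!\big(\frac{n+1}{n}Y_{(n)}\big)=\frac{\theta^2}{n(n+2)}\approx 0.208$ versus $\operatorname{Var}(2\bar{Y})=\frac{\theta^2}{3n}\approx 0.833$ here.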

tommik
  • That was an excellent explanation. Thanks so much! –  Dec 19 '20 at 06:50
  • If the model belongs to an exponential family, how could I solve a similar problem? –  Dec 20 '20 at 10:42
  • In this case the exercise is easier because the canonical statistic is CSS – tommik Dec 20 '20 at 10:46
  • Oh, thanks! Can you recommend any text that addresses these topics of inferential statistics from the point of view of mathematical statistics? Thanks a lot –  Dec 20 '20 at 11:03
  • First best: Casella Berger; second best: Mood Graybill Boes (IMHO) – tommik Dec 20 '20 at 11:07
  • Consider the density $f(x)=\theta e^{-\theta x}$. $S=\Sigma x$ is CSS and $T=\frac{n-1}{\Sigma x}$ is unbiased for $\theta$ and function of $S$. Thus T is UMVUE for $\theta$ – tommik Dec 20 '20 at 11:23
  • Thanks so much tommik :-) –  Dec 21 '20 at 06:53
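Tommik's exponential example in the comments can also be checked numerically (my own sketch; $\theta=2$ and $n=8$ are arbitrary): since $S=\sum x_i\sim\operatorname{Gamma}(n,\theta)$, we have $\mathbb{E}[1/S]=\theta/(n-1)$, so $T=(n-1)/S$ is unbiased for $\theta$.

```python
import random

# Check that T = (n-1)/sum(x) is unbiased for theta when
# x_1,...,x_n are i.i.d. with density f(x) = theta * exp(-theta * x).
random.seed(2)

theta, n, reps = 2.0, 8, 200_000
t_mean = 0.0
for _ in range(reps):
    s = sum(random.expovariate(theta) for _ in range(n))  # S ~ Gamma(n, theta)
    t_mean += (n - 1) / s
t_mean /= reps

print(f"E[(n-1)/S] ~ {t_mean:.3f}  (theory: {theta})")
```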