
I am having trouble understanding how to compute $\operatorname E[\bar{X}\mid X_{(n)}]$ in the following problem.

Compute the UMVUE using the Rao–Blackwell theorem for the following: $X_1,X_2, \ldots ,X_n$ i.i.d. $\operatorname{Uniform}(0,\theta)$.

I am able to derive that $\hat{\theta}_\text{MM}=2\bar{X}$ and $\hat{\theta}_\text{MLE}=X_{(n)}$.

Since $\hat{\theta}_\text{MM}$ is unbiased and $X_{(n)}$ is a sufficient statistic, I know that $$\operatorname E[2\bar{X}\mid X_{(n)}]$$ must give us the UMVUE.

However, I have no idea how to proceed from here. I appreciate your help.

hyg17
  • You wrote: "Since $\hat{\theta}_\text{MM}$ is unbiased and $X_{(n)}$ is a sufficient statistic, I know that $$\operatorname E[2\bar{X}\mid X_{(n)}]$$ must give us the UMVUE." That is not correct as it stands: it's true if the sufficient statistic is complete. A full solution would prove completeness. – Michael Hardy Oct 04 '18 at 04:21
  • UMVUE is just $E[2X_1\mid X_{(n)}]$ and for that you can refer to https://math.stackexchange.com/q/261530/321264. – StubbornAtom Feb 14 '20 at 20:52

2 Answers


Note that $X_{(n)}$ is a complete sufficient statistic for $\theta$. By the Lehmann–Scheffé theorem, the UMVUE of $\theta$ is the function of $X_{(n)}$ that is unbiased for $\theta$. So the UMVUE must be $\left(\frac{n+1}{n}\right)X_{(n)}$ as shown here.

By Lehmann–Scheffé, the UMVUE is equivalently given by $E\left[2X_1\mid X_{(n)}\right]$ or $E\left[2\overline X\mid X_{(n)}\right]$. As the UMVUE is unique whenever it exists, it must be that $$E\left[2X_1\mid X_{(n)}\right]=E\left[2\overline X\mid X_{(n)}\right]=\left(\frac{n+1}{n}\right)X_{(n)}$$
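For anyone who wants a numerical sanity check, here is a quick Monte Carlo sketch in Python (the values of $\theta$ and $n$ below are arbitrary choices, not part of the problem): both estimators come out unbiased, and $\left(\frac{n+1}{n}\right)X_{(n)}$ has the smaller variance.

```python
import numpy as np

# Quick Monte Carlo sanity check (illustrative only): theta = 3 and n = 10
# are arbitrary choices, not part of the problem statement.
rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 200_000

X = rng.uniform(0, theta, size=(reps, n))
mm = 2 * X.mean(axis=1)                  # method-of-moments estimator 2*Xbar
umvue = (n + 1) / n * X.max(axis=1)      # (n+1)/n * X_(n)

print(mm.mean(), umvue.mean())           # both close to theta = 3 (unbiasedness)
print(mm.var(), umvue.var())             # the UMVUE has the smaller variance
```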

To find the conditional expectation somewhat intuitively, note that $X_1=X_{(n)}$ with probability $\frac1n$, as any one of the $n$ observations is equally likely to be the maximum for i.i.d. continuous variables, and $X_1<X_{(n)}$ with probability $1-\frac1n$. Moreover, given $X_{(n)}=t$, the distribution of $X_1$ conditioned on $X_1<t$ is uniform on $(0,t)$.

As shown by @spaceisdarkgreen, it follows from the law of total expectation that

\begin{align} E\left[X_1\mid X_{(n)}=t\right]&=E\left[X_1\mid X_1=t\right]\cdot\frac1n+E\left[X_1\mid X_1<t\right]\cdot\left(1-\frac1n\right) \\&=\frac{t}{n}+\frac{t}{2}\left(1-\frac1n\right)=\left(\frac{n+1}{2n}\right)t \end{align}

Note that $E[X_1\mid X_1<t]=\frac{E[X_1\mathbf1_{X_1<t}]}{P(X_1<t)}=\frac t2$ can be directly verified for any $t\in(0,\theta)$.
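If it helps, the conditional-expectation formula can also be checked by simulation, conditioning on the maximum falling in a small window around a chosen $t$ (again, the values of $\theta$, $n$ and $t$ below are arbitrary choices):

```python
import numpy as np

# Rough empirical check of E[X1 | X_(n) = t] = (n+1)/(2n) * t, conditioning on
# the maximum falling in a small window around t; theta, n and t are arbitrary.
rng = np.random.default_rng(1)
theta, n, reps, t, eps = 1.0, 5, 1_000_000, 0.8, 0.005

X = rng.uniform(0, theta, size=(reps, n))
mx = X.max(axis=1)
near_t = np.abs(mx - t) < eps            # samples whose maximum is close to t

print(X[near_t, 0].mean())               # empirical conditional mean of X1
print((n + 1) / (2 * n) * t)             # theoretical value: 0.6 * 0.8 = 0.48
```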

More rigorous answers can be found here and here for instance.


An alternative method for finding the conditional expectation is using Basu's theorem.

Since $\frac{X_1}{X_{(n)}}=\frac{X_1/\theta}{X_{(n)}/\theta}$, its distribution is free of $\theta$ (an ancillary statistic). By Basu's theorem, $\frac{X_1}{X_{(n)}}$ is independent of the complete sufficient statistic $X_{(n)}$.

Due to independence, $$E[X_1]=E\left[\frac{X_1}{X_{(n)}}\cdot X_{(n)}\right]=E\left[\frac{X_1}{X_{(n)}}\right]\cdot E[X_{(n)}]$$

Therefore,

\begin{align} E\left[X_1\mid X_{(n)}\right]&=E\left[\frac{X_1}{X_{(n)}}\cdot X_{(n)}\,\Big|\, X_{(n)}\right] \\&=X_{(n)}E\left[\frac{X_1}{X_{(n)}} \,\Big|\, X_{(n)}\right] \\&=X_{(n)}E\left[\frac{X_1}{X_{(n)}}\right] \\&=X_{(n)}\frac{E[X_1]}{E[X_{(n)}]} \end{align}

Needless to say, the same calculation holds for $E\left[\overline X\mid X_{(n)}\right]$.
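Here is a small simulation sketch of the Basu argument (with arbitrary $\theta$ and $n$): the ratio $X_1/X_{(n)}$ behaves as if independent of $X_{(n)}$, and its mean matches $E[X_1]/E[X_{(n)}]=\frac{n+1}{2n}$.

```python
import numpy as np

# Numerical illustration of the Basu argument; theta = 2 and n = 4 are
# arbitrary. The ratio X1/X_(n) should look independent of X_(n), and its
# mean should match E[X1]/E[X_(n)] = (n+1)/(2n) = 0.625.
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 4, 500_000

X = rng.uniform(0, theta, size=(reps, n))
mx = X.max(axis=1)
ratio = X[:, 0] / mx

print(np.corrcoef(ratio, mx)[0, 1])      # close to 0, consistent with independence
print(ratio.mean(), (n + 1) / (2 * n))   # both approximately 0.625
```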

StubbornAtom

We use the fact that, conditional on the maximum, the lower order statistics are distributed like the order statistics of $n-1$ i.i.d. $U(0,X_{(n)})$ variables. So, conditional on $X_{(n)}$, $$\bar X \sim \frac{X_{(n)} + \sum_{i=1}^{n-1}U_i}{n}$$ where the $U_i$ are i.i.d. $U(0,X_{(n)}).$ Then we have $$ E(\bar X\mid X_{(n)}) = \frac{X_{(n)}+\frac{n-1}{2}X_{(n)}}{n} = \frac{(n+1)}{2n}X_{(n)}.$$

(We could have also derived this by the "what else could it possibly be?" method. It needs to be unbiased, a statistic, and a dimensionally sensible function of $X_{(n)}$... there's only one game in town here.)
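If the conditioning fact feels mysterious, a short simulation can make it concrete (the values of $\theta$ and $n$ below are arbitrary): regenerating the non-maximal observations as fresh $U(0, X_{(n)})$ draws should leave the distribution of $\bar X$ unchanged.

```python
import numpy as np

# Sketch of the conditioning fact: given X_(n) = m, replacing the other n-1
# observations with fresh Uniform(0, m) draws should leave the distribution
# of Xbar unchanged. Theta = 1 and n = 6 are arbitrary choices.
rng = np.random.default_rng(3)
theta, n, reps = 1.0, 6, 300_000

X = rng.uniform(0, theta, size=(reps, n))
mx = X.max(axis=1)
xbar = X.mean(axis=1)

U = rng.uniform(0, mx[:, None], size=(reps, n - 1))   # fresh U(0, X_(n)) draws
xbar_cond = (mx + U.sum(axis=1)) / n

print(np.quantile(xbar, [0.25, 0.5, 0.75]))
print(np.quantile(xbar_cond, [0.25, 0.5, 0.75]))       # quartiles nearly identical
```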

  • Wow, I have no idea what you did. Thank you for your elegant proof, though. Hopefully someday I come back to this problem and understand what you said. – hyg17 Oct 04 '18 at 05:46
  • @hyg17 How do you know it’s elegant if you don’t understand it :). Seriously, though, if you can turn your confusion into a question, feel free to ask. The fact that the others are uniform conditional on the max comes up a lot. See the uniform example on pg 5, here http://dept.stat.lsa.umich.edu/~moulib/Conditional-distributions.pdf – spaceisdarkgreen Oct 04 '18 at 19:47