
Let $X_1,X_2,\dots, X_n$ be i.i.d. random variables with pdf $$f(x\mid \theta)=\frac{1}{2\theta}I(-\theta<x<\theta).$$

I know that $Y=\max_i|X_{i}|$ is a sufficient statistic for $\theta$, and I found an unbiased estimator $\hat\theta$ of $\theta$:
$\hat\theta = c_n(X_{(n)}-X_{(1)})$, where $X_{(n)}$ is the sample maximum, $X_{(1)}$ is the sample minimum, and $c_n = (n+1)/(2n-2)$ is the constant that makes $\hat\theta$ unbiased.
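As a quick sanity check on the constant $c_n$, here is a Monte Carlo sketch (assuming NumPy; `theta = 3.0` and `n = 10` are arbitrary test values, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 200_000

# reps independent samples of size n from U(-theta, theta)
x = rng.uniform(-theta, theta, size=(reps, n))

# theta_hat = c_n * (X_(n) - X_(1)) with c_n = (n+1)/(2n-2)
c_n = (n + 1) / (2 * n - 2)
theta_hat = c_n * (x.max(axis=1) - x.min(axis=1))

# the empirical mean should be close to theta if the estimator is unbiased
print(theta_hat.mean())
```

The printed mean should sit within Monte Carlo error of `theta`, consistent with $E\hat\theta = c_n \cdot 2\theta(n-1)/(n+1) = \theta$.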

So, by the Rao-Blackwell theorem, I should calculate the conditional expectation $E(\hat\theta \mid Y)$ to improve $\hat\theta$.
I tried to calculate this by using that
$$\max|X_i| = \max(X_{(n)},-X_{(1)})$$

However, since $X_{(n)}$ and $-X_{(1)}$ are not independent, I am having trouble computing this conditional expectation.

    Try with the simple unbiased estimator $2|X_i|$ instead. – StubbornAtom May 05 '19 at 16:22
  • Because of homework.... – Hs P May 05 '19 at 16:34
    Why not choose a simpler unbiased estimator that makes the job easier? The improved estimator by Rao-Blackwell is $2E(Y_1\mid Y_{(n)})$ where $Y_i=|X_i|\sim U(0,\theta)$. This conditional mean can be worked out in a similar manner as in this post. Since $2E(Y_1\mid Y_{(n)})$ is in fact the UMVUE of $\theta$, the final answer can also be guessed from the Lehmann-Scheffe theorem. – StubbornAtom May 06 '19 at 10:22
    See https://math.stackexchange.com/questions/261530/finding-ex-1-max-x-i for direct evaluation of $E(Y_1\mid Y_{(n)})$. – StubbornAtom May 06 '19 at 10:30
  • Thank you for your comment !! I'll try to manage it with your comment and link – Hs P May 06 '19 at 13:55
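For reference, a sketch of the route suggested in the comments (the reduction $Y_i = |X_i| \sim U(0,\theta)$ i.i.d. is from StubbornAtom's hint; the conditioning step is the standard one for the uniform maximum): given $Y_{(n)} = t$, the event $Y_1 = Y_{(n)}$ has probability $1/n$, and otherwise $Y_1$ is conditionally uniform on $(0,t)$, so
$$E(Y_1\mid Y_{(n)}=t)=\frac{1}{n}\,t+\frac{n-1}{n}\cdot\frac{t}{2}=\frac{n+1}{2n}\,t,
\qquad\text{hence}\qquad
2\,E(Y_1\mid Y_{(n)})=\frac{n+1}{n}\,\max_i|X_i|,$$
which is the candidate UMVUE the final answer can be checked against.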

0 Answers