
If $X_1, X_2, \dots, X_n$ are i.i.d. $\mathrm{Uniform}(0,\theta)$ with density $f(x;\theta)=1/\theta$ for $0 < x \le \theta$, where $\theta > 0$, calculate $t(\theta) = E(X)$ and find the best unbiased estimator.

a. Find $t(\theta) = E(X\mid\theta)$ and the best unbiased estimator.

Attempt: $E(X\mid\theta) = \theta/2$. A sufficient statistic for $\theta$ is $\max(X_1,\dots,X_n)$. Is there a more formal way to prove this?

b. Does the Cramér-Rao bound hold here?

Attempt: The Cramér-Rao bound depends on the likelihood function of $x$ given $\theta$, and since the support $x \le \theta$ depends on $\theta$, the bound doesn't apply. Is there a more formal mathematical reason for this?

lord12
  • 1,958
  • What about some $\LaTeX$ to help your folks help you? – André Caldas Oct 13 '11 at 04:24
  • 1
    Best unbiased estimator. – Emre Oct 13 '11 at 05:49
  • Hint: a) What is the definition of $E(X|\theta)$? b) 1) What is the likelihood function? 2) What are the conditions necessary for the Cramér-Rao theorem? Finally, for a BUE, what is a sufficient statistic for $\theta$? – deinst Oct 13 '11 at 12:37
  • https://math.stackexchange.com/q/2941489/321264. For part (b), it is perfectly enough to say the Cramér-Rao bound is not applicable in the first place because the support of the population distribution depends on the parameter $\theta$, violating a regularity condition. Note that Fisher information is not defined in the usual sense for 'non-regular' distributions, so one shouldn't be bothered about computing it. – StubbornAtom May 15 '20 at 16:14

1 Answer


I will resist the urge to paste a link to lmgtfy.com, but googling "Cramér-Rao uniform distribution" will turn up many solutions to this problem, as it is nearly the canonical example of the failure of the Cramér-Rao bound.

For part a)

If you do not know how to formally compute the expectation of a uniform distribution, spend the weekend reviewing basic probability.

The Lehmann-Scheffé theorem says that the BUE is a function of a complete sufficient statistic. I'll let you verify that the statistic $\max(X_1,\dots,X_n)$ is complete.

For part b)

The Cramér-Rao lower bound can hold only when you can switch the order of differentiation and integration (among other regularity conditions): $$\frac{\partial}{\partial\theta}\int T(x)f(x;\theta)\,dx = \int T(x)\frac{\partial}{\partial\theta}f(x;\theta)\,dx.$$ Verify that this interchange fails here, and you will have verified that the Cramér-Rao bound does not hold.
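If you want to see the failure concretely, here is a minimal symbolic sketch (using sympy; taking $T(x)=x$ is just an illustrative choice, not part of the problem):

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

f = 1 / theta   # Uniform(0, theta) density on its support 0 < x < theta
T = x           # illustrative choice of statistic

# Left side: integrate over the support [0, theta] first, then differentiate
lhs = sp.diff(sp.integrate(T * f, (x, 0, theta)), theta)

# Right side: differentiate the density in theta first, then integrate
rhs = sp.integrate(T * sp.diff(f, theta), (x, 0, theta))

print(lhs, rhs)  # prints 1/2 -1/2: the two sides disagree, so the interchange fails
```

The disagreement comes entirely from the moving endpoint $\theta$ of the support: by the Leibniz rule, the left side picks up an extra boundary term $T(\theta)f(\theta;\theta)$ that the right side lacks.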

Added, as I have heavy-duty procrastination syndrome

An unbiased estimator of $\theta$ is an estimator (a function $T(x_1,\dots,x_n)$) that is unbiased: $E(T(x_1,\dots,x_n)\mid\theta) = \theta$. In the case at hand, let $T$ be a function of the sufficient statistic $\max(x_1,\dots,x_n)$ (it turns out that this will be the best unbiased estimator by the Lehmann-Scheffé theorem, but let us not worry about that now).

Let $Y=\max(X_1,\dots,X_n)$ be our sufficient statistic. We know from our study of order statistics that the density of $Y$ is $f_Y(y)= ny^{n-1}/\theta^n$ for $0<y<\theta$. We can find that $$E(Y\mid\theta)=\int_0^\theta y\frac{ny^{n-1}}{\theta^n}dy=\frac{n}{n+1}\theta.$$ So $\frac{n+1}{n}Y=\frac{n+1}{n}\max(X_1,\dots,X_n)$ is an unbiased estimator of $\theta$. Let us compute the variance of our estimator: $$\operatorname{var}\left(\frac{n+1}{n}Y\right)=\left(\frac{n+1}{n}\right)^2\left[E(Y^2)-\left(\frac{n}{n+1}\theta\right)^2\right],$$ which, since $E(Y^2)=\frac{n}{n+2}\theta^2$, equals $\frac{1}{n(n+2)}\theta^2$.
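Here is a quick Monte Carlo sanity check of both the unbiasedness and the variance (a sketch with numpy; the values $\theta=3$, $n=10$, and the replication count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
theta, n, reps = 3.0, 10, 200_000

# reps independent samples of size n from Uniform(0, theta)
samples = rng.uniform(0.0, theta, size=(reps, n))

# The estimator (n+1)/n * max(X_1, ..., X_n)
estimates = (n + 1) / n * samples.max(axis=1)

print(estimates.mean())            # should be close to theta = 3.0 (unbiasedness)
print(estimates.var())             # should be close to the theoretical variance
print(theta**2 / (n * (n + 2)))    # theta^2 / (n(n+2)) = 0.075
```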

But the Fisher information of a single uniform observation, computed naively, is $$E\left[\left(\frac{\partial}{\partial\theta}\log(1/\theta)\right)^2\right]=\frac{1}{\theta^2}.$$ So if the Cramér-Rao theorem applied, the variance of any unbiased estimator would have to be at least $\theta^2/n$. But we have an estimator that has variance $\frac{\theta^2}{n(n+2)}<\theta^2/n$. This is because we cannot switch integration and partial differentiation (essentially because $\theta$ appears in the integration bounds, but I'll let you work that out).
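To see the contradiction numerically (an illustrative sketch; $\theta$ and the sample sizes here are arbitrary):

```python
theta = 1.0
for n in (1, 2, 5, 10, 100):
    would_be_bound = theta**2 / n              # Cramér-Rao bound, if regularity held
    bue_variance = theta**2 / (n * (n + 2))    # variance of (n+1)/n * max(X_i)
    print(n, bue_variance < would_be_bound)    # True for every n: the "bound" is beaten
```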

deinst
  • 5,646
  • I know that the expectation is $\theta/2$. And the best unbiased estimator of this would be $\max(X_1,\dots,X_n)/2$? – lord12 Oct 13 '11 at 21:19
  • @lord What is the expected value of $\max(x_1,\dots,x_n)$? If $n=1$ then $E(\max(x)|\theta)=E(x|\theta)=\theta/2$. This translates to $2x$ being the best unbiased estimator (when $n=1$). Remember: an estimator $T(x)$ is unbiased when $E(T(x)|\theta)=\theta$. – deinst Oct 14 '11 at 00:06
  • Basically $\frac{d}{d\theta}\int f_X(x;\theta)\,dx = \int \frac{\partial}{\partial\theta}f_X(x;\theta)\,dx$ is the regularity condition, but the left-hand side $=0$ whereas the right-hand side $=-\theta^{-1}$, indicating the regularity condition is not satisfied. – lord12 Oct 14 '11 at 02:01
  • Yes, the regularity condition fails. – deinst Oct 14 '11 at 02:15
  • 2
    Pointing out a small error in the last paragraph: Actually, the addition of information across independent observations requires a regularity condition, which fails here. Thus, the Fisher information of one uniform is $1/\theta^2$, but the Fisher information of drawing $n$ uniforms i.i.d. is NOT $n/\theta^2$. Directly computing with the joint pdf $f(x|\theta)=1/\theta^n$ gives $E\left[\left(\frac{\partial}{\partial\theta}\ln(1/\theta^n)\right)^2\right] = \frac{n^2}{\theta^2}$. – suncup224 Sep 06 '15 at 06:51
  • It is somewhat misleading to compute Fisher information when it is not defined in the first place for 'non-regular' distributions. – StubbornAtom May 15 '20 at 16:13