
Let $X = (X_1, \dots, X_n)$ be a sample from the distribution $U(0,\theta)$. Prove that $T(X) = X_{(n)}$ is a complete and sufficient statistic for $\theta$, and find the minimum-variance unbiased estimator $T^*(X)$ of a differentiable function $\tau(\theta)$.

The proof of sufficiency is easily carried out using the factorization criterion; I have done that part.
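(For completeness, the standard factorization is $$\prod_{i=1}^n\frac{1}{\theta}\,\mathbf{1}\{0\le x_i\le\theta\}=\underbrace{\frac{1}{\theta^n}\,\mathbf{1}\{x_{(n)}\le\theta\}}_{g_\theta(T(x))}\cdot\underbrace{\mathbf{1}\{x_{(1)}\ge 0\}}_{h(x)},$$ which exhibits $T(X)=X_{(n)}$ as sufficient.)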

Next we need to prove completeness. By definition, we must show that $\mathbb{P}(g(T) = 0) = 1$ follows from $$\mathbb{E}_{\theta}\,g(T(X)) = \displaystyle\int\limits_{0}^{\theta}g(x)\frac{n x^{n-1}}{\theta^n}\,dx = 0, \quad \forall \theta > 0.$$ This is easy if $g$ is continuous: the condition is equivalent to $\displaystyle\int\limits_{0}^\theta g(x)\,x^{n-1}\,dx = 0$ for all $\theta > 0$, and differentiating with respect to $\theta$ gives

$g(\theta)\theta^{n-1} = 0$, hence $g(\theta) = 0$. This works for continuous functions, but how does one prove it for all $g$ such that $\mathbb{E}_{\theta}\,g(T(X))$ exists?

Next, I need to find the minimum-variance unbiased estimator $T^*(X)$ of a differentiable function $\tau(\theta)$. This seems connected with the first question and looks like an application of the Lehmann–Scheffé theorem, but I do not know how to carry it out exactly.

Many thanks for any help!

  • https://math.stackexchange.com/questions/699997/complete-statistic-uniform-distribution?noredirect=1&lq=1 – StubbornAtom Jan 14 '20 at 16:05
  • By Lehmann–Scheffé, $f(T)$ is the UMVUE of $\tau(\theta)$ for some function $f$ if $E_{\theta}[f(T)]=\tau(\theta)$ for all $\theta$. One can solve for $f$ by differentiating both sides of this equation with respect to $\theta$. – StubbornAtom Jan 14 '20 at 20:35

1 Answer


You have $$E_{\theta}[g(T)]=\int_0^\theta g(x)\frac{nx^{n-1}}{\theta^n}\,dx,\quad\theta>0,$$ for every measurable function $g:(0,\infty)\rightarrow\mathbb{R}$ such that $g(x)x^{n-1}$ is Lebesgue integrable on $(0,\theta)$ for all $\theta>0$ (here $nx^{n-1}/\theta^n$, $0<x<\theta$, is the density of $T=X_{(n)}$, obtained by differentiating $P_\theta(T\le t)=(t/\theta)^n$). Suppose $E_\theta[g(T)]=0$ for all $\theta>0$. Multiplying by $\theta^n/n$ gives $$\int_0^\theta g(x)\,x^{n-1}\,dx=0,\quad\forall\theta>0.$$ By the Fundamental Theorem of Calculus for the Lebesgue integral, the left-hand side is differentiable almost everywhere with $$\frac{d}{d\theta}\int_0^\theta g(x)\,x^{n-1}\,dx=g(\theta)\theta^{n-1},\quad\text{a.e. }\theta>0,$$ so $g(\theta)\theta^{n-1}=0$ and therefore $g=0$ almost everywhere on $(0,\infty)$. Since the distribution of $T$ is absolutely continuous, $P_\theta(g(T)=0)=1$ for every $\theta>0$, as wanted.
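As a quick numerical illustration of the expectation formula above (a hypothetical NumPy sketch, not part of the original answer), take $g(x)=x$: the integral evaluates to $n\theta/(n+1)$, which a Monte Carlo average of simulated maxima should reproduce.

```python
import numpy as np

# Hypothetical Monte Carlo sanity check (not from the original answer):
# with g(x) = x, E_theta[g(T)] = ∫_0^theta x * n x^(n-1)/theta^n dx = n*theta/(n+1),
# so the sample mean of simulated maxima should match that value.
rng = np.random.default_rng(0)
n, theta, reps = 5, 2.0, 1_000_000

# T = max of n iid Uniform(0, theta) draws, replicated `reps` times
T = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

print(T.mean())             # Monte Carlo estimate of E[T]
print(n * theta / (n + 1))  # exact value from the integral: 5*2/6 ≈ 1.6667
```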

If you want the minimum variance unbiased estimator of $\tau(\theta)$, you first of all need an unbiased estimator of $\tau(\theta)$, say $W=W(X)$. For a general $\tau$ I do not know whether one can be found, but, for example, if $\tau(\theta)=\theta$, then you can take $W=\frac{n+1}{n}T$, as proved in the accepted answer here. Since $T$ is sufficient and complete for $\theta$, the Lehmann–Scheffé theorem says that $T^*=E[W\mid T]$ is the minimum variance unbiased estimator of $\tau(\theta)$. For instance, with $\tau(\theta)=\theta$ again, $T^*=\frac{n+1}{n}E[T\mid T]=\frac{n+1}{n}T$.
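Following StubbornAtom's comment above, here is a sketch of how the UMVUE can be obtained for a general differentiable $\tau$ (assuming the resulting statistic has finite variance). Solve $E_\theta[f(T)]=\tau(\theta)$ for $f$: $$\int_0^\theta f(x)\frac{nx^{n-1}}{\theta^n}\,dx=\tau(\theta)\iff\int_0^\theta f(x)\,nx^{n-1}\,dx=\theta^n\tau(\theta).$$ Differentiating both sides with respect to $\theta$ gives $nf(\theta)\theta^{n-1}=n\theta^{n-1}\tau(\theta)+\theta^n\tau'(\theta)$, so $$T^*=f(T)=\tau(T)+\frac{T\,\tau'(T)}{n}.$$ This is unbiased by construction and is a function of the complete sufficient statistic $T$, hence it is the UMVUE by Lehmann–Scheffé; for $\tau(\theta)=\theta$ it recovers $T^*=\frac{n+1}{n}T$.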

user39756