
Let $X_1,X_2,\dots, X_n$ be i.i.d. random variables with pdf $$f(x\mid \theta)=\frac{1}{2\theta}I(-\theta<x<\theta)$$

Find UMVUE of $(i)\dfrac{\theta}{1+\theta}$ and $(ii)\dfrac{e^{\theta}}{\theta}$.

Note that $(X_{(1)},X_{(n)})$ is a complete sufficient statistic. So I need to find an unbiased estimator of $(i)$ and $(ii)$ of the form $g(X_{(1)},X_{(n)})$; then $g$ will be the UMVUE. But I could not find such a $g$. Thanks for any help.

I tried to find $E(X_{(1)}/X_{(n)})$, but it came out a total mess.

Here $X_{(1)}=\min(X_1,X_2,\dots, X_n)$ and $X_{(n)}=\max(X_1,X_2,\dots, X_n)$.

  • A complete sufficient statistic for $\theta$ is simply $\max |X_i|$. Did you try finding the unbiased estimators of (i) and (ii)? – StubbornAtom Aug 21 '18 at 16:04
  • @StubbornAtom I wrote that I could not find an unbiased estimator. I am asking for a way (or a hint) to find one. – Stat_prob_001 Aug 21 '18 at 16:09
  • Regarding i), let $g(\theta)=\frac{\theta}{1+\theta}$, where $\theta$ is obviously positive.

    If $\theta>1$, then one could write $$g(\theta)=\left(1+\frac{1}{\theta}\right)^{-1}=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\frac{1}{\theta^3}+\cdots$$

    If $0<\theta<1$, then $$g(\theta)=\theta(1-\theta+\theta^2-\theta^3+\cdots)$$...

    – StubbornAtom Aug 21 '18 at 17:11
  • ...So if one could find unbiased estimators of $\theta^k$ or $1/\theta^k$, then by combining them one could get an unbiased estimator $T$ (say) of $g(\theta)$. By the Lehmann-Scheffe theorem, $E(T\mid \max|X_i|)$ would be the UMVUE of $g(\theta)$. Note that $X_i\sim U(-\theta,\theta)\implies|X_i|\sim U(0,\theta)$ and $\max |X_i|$ is a complete sufficient statistic for the family. This is just a thought, since ultimately an unbiased estimator of $g(\theta)$ based on $\max |X_i|$ would be enough for the final answer. – StubbornAtom Aug 21 '18 at 17:11
  • @StubbornAtom Thank you very much. However, I could not find an unbiased estimator of $1/\theta^k$ for $k\geq n$. Is it even possible to find one? – Stat_prob_001 Aug 21 '18 at 18:53
  • For the family of ALL uniform distributions on bounded intervals, the minimal sufficient statistic is the pair whose components are the maximum and minimum observed values. But the present family contains only some uniform distributions, so it has a somewhat coarser sufficient statistic: the maximum absolute value. It is coarser because you cannot recover the max and min if you know only the maximum absolute value. – Michael Hardy Aug 22 '18 at 20:22

1 Answer


You have a $U(-\theta,\theta)$ population where $\theta\in\mathbb R^+$.

The joint density of the sample $\mathbf X=(X_1,X_2,\ldots,X_n)$ is

\begin{align} f_{\theta}(\mathbf x)&=\frac{1}{(2\theta)^n}\mathbf1_{-\theta < x_1, \ldots, x_n < \theta} \\&=\frac{1}{(2\theta)^n}\mathbf1_{0<|x_1|,\ldots,|x_n|<\theta} \\&=\frac{1}{(2\theta)^n}\mathbf1_{\max_{1\le i\le n}|x_i|<\theta} \end{align}

It is clear from the factorization theorem that a sufficient statistic for $\theta$ is $$T(\mathbf X)=\max_{1\le i\le n}|X_i|$$

One could verify that $|X_i|\sim U(0,\theta)$, so that the density of $T$ is $$g_{\theta}(t)=\frac{n}{\theta^n}t^{n-1}\mathbf1_{0<t<\theta}$$
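
As a quick numerical sanity check (not needed for the argument), one can simulate $T$ and compare it with this density; here is a minimal sketch assuming Python with NumPy, where the values of $n$ and $\theta$ are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 5, 2.0, 200_000        # illustrative values only

# simulate T = max |X_i| with X_i ~ U(-theta, theta)
x = rng.uniform(-theta, theta, size=(reps, n))
T = np.abs(x).max(axis=1)

# under g_theta(t) = n t^(n-1)/theta^n on (0, theta), E(T) = n*theta/(n+1)
print(T.mean(), n * theta / (n + 1))

# the CDF of T is (t/theta)^n; compare at a few points
for t in (0.5, 1.0, 1.5):
    print((T <= t).mean(), (t / theta) ** n)
```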

That $T$ is a complete statistic for $\theta$ is well-known.

We simply have to find unbiased estimators of the parametric functions of $\theta$ based on the complete sufficient statistic. This would give us the UMVUE by the Lehmann-Scheffe theorem.

Since the support of the complete sufficient statistic depends on the parameter $\theta$, the unbiasedness condition is an integral equation whose upper limit of integration is $\theta$ itself, so unbiased estimators can be obtained directly by differentiating both sides with respect to $\theta$.

Let $h_1(T)$ and $h_2(T)$ be unbiased estimators of $\theta/(1+\theta)$ and $e^{\theta}/\theta$ respectively, based on the complete sufficient statistic $T$.

That is, for all $\theta>0$,

\begin{align} \qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_1(t)t^{n-1}\,dt&=\frac{\theta}{1+\theta} \\\implies \int_0^{\theta}h_1(t)t^{n-1}\,dt &= \frac{\theta^{n+1}}{n(1+\theta)} \end{align}

Differentiating both sides wrt $\theta$,

\begin{align} h_1(\theta)\theta^{n-1}&=\frac{\theta^n(n\theta+n+1)}{n(1+\theta)^2} \\\implies h_1(\theta) &=\frac{\theta(n\theta+n+1)}{n(1+\theta)^2} \end{align}

Hence, $$h_1(T)=\frac{T(nT+n+1)}{n(1+T)^2}$$
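
One way to double-check that this $h_1(T)$ is unbiased is a symbolic computation for a fixed sample size; a minimal sketch with SymPy, where $n=3$ is just an illustrative choice:

```python
import sympy as sp

t, theta = sp.symbols('t theta', positive=True)
n = 3                                    # any fixed sample size works the same way

h1 = t * (n * t + n + 1) / (n * (1 + t) ** 2)
density = n * t ** (n - 1) / theta ** n  # density of T on (0, theta)

expectation = sp.integrate(h1 * density, (t, 0, theta))
print(sp.simplify(expectation - theta / (1 + theta)))  # prints 0 if unbiased
```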

Similarly for the second problem, for all $\theta>0$,

\begin{align} \qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_2(t)t^{n-1}\,dt&=\frac{e^\theta}{\theta} \\\implies \int_0^{\theta}h_2(t)t^{n-1}\,dt &= \frac{\theta^{n-1} e^\theta}{n} \end{align}

Differentiating both sides wrt $\theta$ yields

\begin{align} h_2(\theta)\theta^{n-1}&=\frac{e^{\theta}\theta^{n-2}(\theta+n-1)}{n} \\\implies h_2(\theta) &=\frac{e^{\theta}(\theta+n-1)}{n\theta} \end{align}

So, $$h_2(T)=\frac{e^{T}(T+n-1)}{nT}$$
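
Again, a simulation gives a rough check that $E_\theta\,h_2(T)=e^\theta/\theta$; this sketch assumes NumPy, with arbitrary illustrative values of $n$ and $\theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta, reps = 4, 1.5, 500_000         # illustrative values only

x = rng.uniform(-theta, theta, size=(reps, n))
T = np.abs(x).max(axis=1)

h2 = np.exp(T) * (T + n - 1) / (n * T)
print(h2.mean(), np.exp(theta) / theta)  # the two numbers should be close
```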


In my initial answer I used the following, rather more roundabout, series calculation for the UMVUE. Had the support not depended on the parameter, this might have been the approach to try. I am keeping this part in the answer, although the term-by-term interchange of expectation and infinite sum below is only formal:

For $k> -n$, we have

\begin{align} E_\theta(T^k)&=\frac{n}{\theta^n}\int_0^\theta t^{k+n-1}\,dt\\[8pt] & = \frac{n\theta^k}{n+k} \end{align}

This suggests that an unbiased estimator of $\theta^k$ based on $T$ is $$\left(\frac{n+k}{n}\right)T^k$$
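
A short simulation (again just a sketch with NumPy and arbitrary illustrative values) can confirm this unbiasedness for a few exponents $k>-n$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, theta, reps = 6, 2.0, 500_000         # illustrative values only

x = rng.uniform(-theta, theta, size=(reps, n))
T = np.abs(x).max(axis=1)

for k in (-2, 1, 3):                     # any k > -n
    est = (n + k) / n * T ** k
    print(k, est.mean(), theta ** k)     # sample mean vs. theta^k
```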

For the first problem, one could write

\begin{align} \frac{\theta}{1+\theta}&= \begin{cases}\left(1+\frac{1}{\theta}\right)^{-1}=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\frac{1}{\theta^3}+\cdots&,\text{ if }\theta>1\\\\\theta(1-\theta+\theta^2-\theta^3+\cdots)&,\text{ if }0<\theta<1\end{cases} \end{align}

For $0<\theta<1$, we have

$$E_{\theta}\left[\left(\frac{n+1}{n}\right)T-\left(\frac{n+2}{n}\right)T^2+\left(\frac{n+3}{n}\right)T^3-\cdots\right]=\theta-\theta^2+\theta^3-\cdots$$

Or, $$E_{\theta}\left[\sum_{k=1}^\infty(-1)^{k+1}\left(\frac{n+k}{n}\right)T^k\right]=\frac{\theta}{1+\theta}$$

For $\theta>1$,

$$E_{\theta}\left[1-\left(\frac{n-1}{n}\right)\frac{1}{T}+\left(\frac{n-2}{n}\right)\frac{1}{T^2}-\cdots\right]=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\cdots$$

That is, $$E_{\theta}\left[\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}\right]=\frac{\theta}{1+\theta}$$

Hence, by the Lehmann-Scheffe theorem, the UMVUE of $\theta/(1+\theta)$ is

\begin{align} h_1(T)&=\begin{cases}\displaystyle\sum_{k=1}^\infty(-1)^{k+1}\left(\frac{n+k}{n}\right)T^k&,\text{ if }0<\theta<1\\\\\displaystyle\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}&,\text{ if }\theta\ge1 \end{cases} \\\\&=\frac{T(n+1+nT)}{n(T+1)^2} \end{align}

Both branches sum to the same closed form (the first series converges since $T<\theta<1$ in that case, the second wherever $T>1$), and it coincides with the $h_1(T)$ obtained above by differentiation, so the same estimator is unbiased, and hence the UMVUE, for all $\theta>0$.
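
If one wants to check the series manipulations numerically, the partial sums of both series can be compared with the closed form at arbitrary points; a small sketch with NumPy, where the values of $n$ and $T$ are just illustrative:

```python
import numpy as np

n = 5                                    # illustrative sample size


def closed(T):
    return T * (n + 1 + n * T) / (n * (T + 1) ** 2)


# 0 < T < 1: partial sum of the series in powers of T
T, ks = 0.4, np.arange(1, 200)
print(np.sum((-1.0) ** (ks + 1) * (n + ks) / n * T ** ks), closed(T))

# T > 1: partial sum of the series in powers of 1/T
T, ks = 2.5, np.arange(0, 200)
print(np.sum((n - ks) / n * (-1.0) ** ks * T ** (-ks)), closed(T))
```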

For the second problem, we can use the power series expansion of $e^\theta$ to obtain

$$E_{\theta}\left[\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!}\right]=\sum_{j=0}^\infty \frac{\theta^{j-1}}{j!}=\frac{e^{\theta}}{\theta}$$

So the UMVUE of $e^{\theta}/\theta$ is

\begin{align} h_2(T)&=\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!} \\\\&=\frac{e^T(n-1+T)}{nT} \end{align}

StubbornAtom
  • I don't understand why the only restriction is $k\neq -n$. I found that for all $k\leq -n$, $\frac{n+k}{n}T^k$ is not an unbiased estimator of $\theta^k$. (It feels obvious to me, since the sample values of these estimators are negative; maybe I am wrong.) Apart from that, it may be possible to find another unbiased estimator in these exceptional cases. If not, then I guess I need to prove that it is not possible. Am I right? – Stat_prob_001 Aug 23 '18 at 05:15
  • It is quite possible that sample values of your (unbiased) estimators are negative while the parameter to be estimated is positive. These are instances of inadmissible/absurd unbiased estimators, but they do not cease to be unbiased for this reason. I would ask you to verify whether the $h_1$ and $h_2$ I obtained are unbiased or not (by taking particular values of $n$ if you like). – StubbornAtom Aug 23 '18 at 08:39
  • ah! I understand! But the problem I am getting is somewhere else. Take $k=-n-2$. Then: $$E(T^{-n-2})=\frac{n}{\theta^n}\int_{0}^{\theta}t^{-n-2+n-1}dt=\frac{n}{\theta^n}\int_{0}^{\theta}t^{-3}dt=\frac{n}{\theta^n}\int_{0}^{\theta}\frac{1}{t^3}dt$$ I cannot integrate this. That is the problem. – Stat_prob_001 Aug 23 '18 at 10:11
  • @StubbornAtom, I think the OP meant to say that the integral $\int_0^\theta \frac{1}{t^3}\,dt$ diverges at $0$; you cannot compute it in the first place. – Shanks Aug 23 '18 at 16:11
  • Oh I see. In that case @Stat_prob_001 you are right. We have to ignore the cases $k\le -n$ for this particular calculation. Don't think it changes the final answer though. – StubbornAtom Aug 23 '18 at 16:45
  • @Stat_prob_001 Do have a look at my edit. – StubbornAtom Aug 24 '18 at 17:48
  • Thanks @StubbornAtom – Stat_prob_001 Aug 25 '18 at 04:49
  • @StubbornAtom thanks for the update. This is definitely a very useful method. But, just out of curiosity, I tried to use that method to find the UMVUE of $\theta^{-n-2}$, and it came out to be $-\frac{2}{n}T^{-n-2}$. But $E(-\frac{2}{n}T^{-n-2})$ does not exist. So I think (maybe) this method can only be used if the UMVUE of a function of $\theta$ exists for all $\theta$. So one needs to prove that the UMVUE of $\frac{\theta}{1+\theta}$ actually exists. Is there any way to establish such a thing? I mean, how does one prove that the UMVUE of $\frac{\theta}{1+\theta}$ exists? – Stat_prob_001 Aug 25 '18 at 08:12
  • @Stat_prob_001 There is a theorem that says a necessary and sufficient condition for an unbiased estimator to be the UMVUE of a given parametric function is that it must be uncorrelated with every unbiased estimator of zero. This result can be used to inspect whether the UMVUE of a parametric function exists or not. – StubbornAtom Aug 25 '18 at 08:26