
Let $x_1=1, x_2=4$ be the observed values of a random sample of size $2$ from a Poisson($\theta$) distribution, where $\theta \in (0,\infty)$. Let $T$ be the uniformly minimum variance unbiased estimate of $\tau(\theta)= \sum_{k=4}^{\infty}\frac{e^{-\theta}\theta^k}{k!}$ based on the given data. Then $T$ equals ____?

I am using $x$ instead of $k$.

$P(X\ge 4)=\sum_{x=4}^{\infty}\frac{e^{-\theta}\theta^x}{x!}$

$I(X_1)=1$ if $X_1\ge4$, and $0$ otherwise.

So, using the Rao–Blackwell theorem,

$E\left[I(X_1)\,\middle|\,T=\sum_{i}X_i\right]=1\cdot P\left(X_1\ge4\,\middle|\,X_1+X_2+\cdots+X_n=t\right)$
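As a sanity check, this conditional expectation can be approximated by simulation (a minimal NumPy sketch; the value of $\theta$ used is arbitrary, since the conditional distribution given $T$ should not depend on $\theta$):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0      # arbitrary choice; the conditional law given T should not depend on theta
t_obs = 1 + 4    # observed total x1 + x2 = 5

# Rejection sampling: draw (X1, X2) pairs, keep those with X1 + X2 = t_obs,
# and average the indicator I(X1 >= 4) over the kept pairs.
x1 = rng.poisson(theta, size=1_000_000)
x2 = rng.poisson(theta, size=1_000_000)
keep = (x1 + x2) == t_obs
print((x1[keep] >= 4).mean())  # Monte Carlo estimate of E[I(X1 >= 4) | T = 5]
```

This gives a number, but not the closed form I am after.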

How do I solve this? If I were given $X_1=1$, I would have used it together with independence, but I am not sure how to proceed here. Also, tell me if there is any other possible way to solve this problem.

Daman

2 Answers


Let $g(T)$ be an unbiased estimator of $\tau(\theta)=\sum\limits_{k=4}^{\infty}\frac{e^{-\theta}\theta^k}{k!}$ based on the complete sufficient statistic $T=X_1+X_2\sim \text{Poisson}(2\theta)$; by the Lehmann–Scheffé theorem, such a $g(T)$ is the UMVUE.

That is, $$\operatorname E_{\theta}\left[g(T)\right]=\tau(\theta)\quad,\forall\,\theta>0$$

Or, $$\sum_{t=0}^\infty g(t) \frac{e^{-2\theta}(2\theta)^t}{t!}=\sum_{k=4}^{\infty}\frac{e^{-\theta}\theta^k}{k!}\quad,\forall\,\theta$$

So for every $\theta$,

\begin{align} \sum_{t=0}^\infty g(t)\frac{2^t}{t!}\cdot\theta^t &=e^{\theta}\sum_{k=4}^{\infty}\frac{\theta^k}{k!} \\&=\sum_{\ell=0}^\infty \frac{\theta^\ell}{\ell!}\sum_{k=4}^{\infty}\frac{\theta^k}{k!} \\&=\sum_{t=0}^\infty \left(\sum_{k+\ell=t,k\ge 4,\ell\ge 0}\frac1{k! \ell!}\right)\theta^t \end{align}

Comparing coefficients of $\theta^t$,

\begin{align} g(t)&=\frac{t!}{2^t} \sum_{k+\ell=t,k\ge 4,\ell\ge 0}\frac1{k! \ell!} \\&=\frac1{2^t} \sum_{k=4}^t \frac{t!}{k!(t-k)!}\mathbf1_{\{t\ge 4\}} \\&=\begin{cases} \frac1{2^t} \sum\limits_{k=4}^t \binom{t}{k} &,\text{ if }t\ge 4 \\ 0 &,\text{ if }0\le t\le 3\end{cases} \end{align}

This shows that the UMVU estimate of $\tau(\theta)$ is $0$ whenever the observed value of $T$ is less than $4$ (which is of course not the case here). As pointed out by @Henry, $g(T)$ is probably not a useful estimator in that situation.
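A quick numerical check of unbiasedness (a minimal Python/SciPy sketch; the cutoff at $t=200$ is an arbitrary truncation of the infinite sum, chosen so the neglected tail mass is negligible):

```python
from math import comb

from scipy.stats import poisson

def g(t):
    """UMVUE evaluated at T = t: (1/2^t) * sum_{k=4}^{t} C(t, k), and 0 for t < 4."""
    if t < 4:
        return 0.0
    return sum(comb(t, k) for k in range(4, t + 1)) / 2**t

for theta in (0.5, 1.0, 2.5):
    # E_theta[g(T)] with T ~ Poisson(2*theta); the sum over t is truncated at 200
    lhs = sum(g(t) * poisson.pmf(t, 2 * theta) for t in range(200))
    rhs = poisson.sf(3, theta)  # tau(theta) = P(X >= 4) for X ~ Poisson(theta)
    print(f"theta={theta}: E[g(T)] ~ {lhs:.6f}, tau(theta) = {rhs:.6f}")
```

The two printed columns agree to within the truncation error, as the coefficient comparison above guarantees.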

StubbornAtom
  • To be honest, I didn't get Henry's part. I think I am not there yet, or I didn't understand it properly. Anyway, thanks to you. – Daman Feb 02 '21 at 12:38

Using help from StubbornAtom's comments: $n=2$, $X_1+X_2=t=5$.

Since $X_1\mid X_1+X_2=t\sim\text{Binomial}\left(t,\tfrac12\right)$,

$E\left[I(X_1)\,\middle|\,T=t\right]=P(X_1\ge4\mid X_1+X_2=t)=\frac1{2^t}\sum_{j=4}^t \binom{t}{j}=\frac1{2^5}\sum_{j=4}^5 \binom{5}{j}=\frac{1}{32}\left[\binom{5}{4}+\binom{5}{5}\right]=\frac{1}{32}(5+1)=\frac{6}{32}=\frac{3}{16}$
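The same value can be checked directly from the conditional Binomial distribution (a minimal SciPy sketch):

```python
from scipy.stats import binom

# X1 | X1 + X2 = t is Binomial(t, 1/2) when X1, X2 are i.i.d. Poisson(theta)
t = 5
print(binom.sf(3, t, 0.5))  # P(X1 >= 4 | T = 5) = 6/32 = 0.1875
```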

Daman
  • I think your answer is correct, but suppose you instead had $x_1=1, x_2=2$. Then this method would have produced an estimate of $\hat{T}=0$, which is too low to be useful, casting doubt on the value of a uniformly minimum variance unbiased estimate here. – Henry Feb 02 '21 at 09:07
  • @Henry Thanks for the extra information. Could you please tell me if there is any other way to solve this problem? – Daman Feb 02 '21 at 10:15
  • Your approach answers the UMVUE question as posed, but I was questioning the utility of the UMVUE here. A naive alternative would be to use $\hat\theta = \frac12(X_1+X_2)$ as an estimator of $\theta$ and then find $\sum_{k=4}^{\infty}\frac{e^{-\hat\theta}\hat\theta^k}{k!}$, but this would be biased upwards; with $x_1=1,x_2=4$ it would give about $0.24$ compared to your $0.19$; with $x_1=1,x_2=2$ it would give about $0.07$ compared to your answer's $0$ (a numerical check of these figures follows after the comments). – Henry Feb 02 '21 at 10:44
  • @Henry I will discuss this with you soon. – Daman Feb 02 '21 at 12:40
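A quick check of the plug-in figures in Henry's comment (a minimal SciPy sketch; `poisson.sf(3, theta)` gives $P(X\ge 4)$ for $X\sim\text{Poisson}(\theta)$):

```python
from scipy.stats import poisson

# Plug-in alternative from the comment: theta_hat = (x1 + x2) / 2,
# then estimate tau(theta_hat) = P(X >= 4) for X ~ Poisson(theta_hat).
for x1, x2 in [(1, 4), (1, 2)]:
    theta_hat = (x1 + x2) / 2
    print((x1, x2), round(poisson.sf(3, theta_hat), 4))  # ~0.2424 and ~0.0656
```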