
Consider a random sample $Y_1,\ldots,Y_n$ from the uniform distribution on the interval $[-\phi,\phi]$.

I'm wondering how I can show that the statistic $$ T(\mathbf{Y}) = ( Y_{(1)} , Y_{(n)}) $$

is a complete statistic.

Thoughts so far:

The pdf of $T$ can be written as $$ f(x,y) = n(n-1) \left(\frac{y-x}{2 \phi} \right)^{n-2} \frac{1}{4 \phi^2}, \qquad - \phi < x < y < \phi, $$ and the expectation of any measurable function $g$ of $T$ as $$E[g(T)] = \int_{- \phi}^{\phi} \int_{x}^{\phi} g(x,y) f(x,y) \,dy\,dx. $$
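Before going further, here is a quick numerical sanity check of this pdf (a throwaway sketch using numpy/scipy; the particular values of $n$ and $\phi$ are arbitrary):

```python
import numpy as np
from scipy import integrate

# Joint pdf of (Y_(1), Y_(n)) stated above; n and phi are arbitrary test values.
phi, n = 2.0, 5

def f(y, x):  # dblquad passes the inner variable (y) first
    return n * (n - 1) * ((y - x) / (2 * phi)) ** (n - 2) / (4 * phi ** 2)

# The density should integrate to 1 over the region -phi < x < y < phi.
total, _ = integrate.dblquad(f, -phi, phi, lambda x: x, lambda x: phi)
print(total)  # ~ 1.0

# Cross-check E[Y_(n) - Y_(1)] from the pdf against a Monte Carlo estimate.
r, _ = integrate.dblquad(lambda y, x: (y - x) * f(y, x),
                         -phi, phi, lambda x: x, lambda x: phi)
rng = np.random.default_rng(0)
Y = rng.uniform(-phi, phi, size=(200_000, n))
print(r, (Y.max(axis=1) - Y.min(axis=1)).mean())  # both ~ 2.67
```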

Setting $E[g(T)]$ equal to $0$, however, does not let me get very far; I can't deduce completeness from it.

I can only guess that I'm on the wrong track, but I don't know how else to attempt this.

  • By the usual order statistics formula for the joint pdf of the min and max. –  Jan 22 '16 at 01:48
  • There is a nice theorem describing how to derive a complete statistic if your random sample comes from an exponential family. Check page 288 of Casella and Berger. – Mose Wintner Jan 22 '16 at 01:48
  • @MoseWintner, the uniform distribution whose parameter determines its support is not an exponential family. Am I missing something? –  Jan 22 '16 at 01:49
  • No, my mistake. Briefly, a complete statistic is one where every function of it depends on the parameter. But in your case, since $(y-x)^{n-2}$ is the range of the data raised to a power, if $n$ is even and $g$ is nonnegative, then $g$ is $0$ almost everywhere. If $n$ is odd, pull off a power of $y-x$ and try some stuff... – Mose Wintner Jan 22 '16 at 02:00
  • That still does not help, though. What if $g$ is neither strictly positive nor strictly negative but a mix of the two? Also, you have not used $\phi$ at all in an answer like that. –  Jan 22 '16 at 02:06
  • @MoseWintner : It's not "where every function of it depends on the parameter", but rather "where the EXPECTED VALUE OF every function of it depends on the parameter". $\qquad$ – Michael Hardy Jan 22 '16 at 02:09
  • $$\int_{-\varphi}^\varphi \int_x^\varphi g(x,y)\, n(n-1) \left(\frac{y-x}{2 \varphi} \right)^{n-2} \frac{1}{4 \varphi^2} \,dy\,dx$$ $$= \int_{-1}^1 \int_u^1 g(\varphi u,\varphi v)\, n(n-1) \left( \frac{v-u} 2 \right)^{n-2} \frac 1 4 \,dv\,du$$ $$= \frac{n(n-1)} {2^{n-2} \cdot 4} \int_{-1}^1 \int_u^1 g(\varphi u,\varphi v) (v-u)^{n-2} \,dv\,du.$$ Thus $\displaystyle \int_{-1}^1 \int_u^1 g(\varphi u,\varphi v) (v-u)^{n-2} \,dv\,du$ remains equal to $0$ as $\varphi$ runs from $0$ to $\infty$. The problem is to show that that implies $g$ is almost everywhere $0$. – Michael Hardy Jan 22 '16 at 02:55
  • Hi @MichaelHardy I don't think $(Y_{(1)},Y_{(n)})$ is complete. I've posted my answer below. – Tan Jan 08 '21 at 02:32
  • $(Y_{(1)},Y_{(n)})$ is not even minimal sufficient. Keeping in mind that $|Y_i|\sim U(0,\phi)$, a minimal complete sufficient statistic is instead $\max\{-Y_{(1)},Y_{(n)}\}=\max_{1\le i\le n}|Y_i|$. – StubbornAtom Feb 26 '21 at 11:31
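
A quick numerical check of the identity in the last comment (a sketch assuming numpy; the sample size and $\phi$ are arbitrary illustrative values):

```python
import numpy as np

# Check that max{-y_(1), y_(n)} = max_i |y_i| on simulated samples, and that
# |Y_i| looks uniform on (0, phi) when Y_i ~ U(-phi, phi).
rng = np.random.default_rng(1)
phi, n = 3.0, 7
Y = rng.uniform(-phi, phi, size=(100_000, n))

lhs = np.maximum(-Y.min(axis=1), Y.max(axis=1))
rhs = np.abs(Y).max(axis=1)
print(np.allclose(lhs, rhs))  # True: the two statistics coincide

# Quartiles of |Y| should be close to phi * (0.25, 0.5, 0.75).
q = np.quantile(np.abs(Y).ravel(), [0.25, 0.5, 0.75])
print(q, phi * np.array([0.25, 0.5, 0.75]))  # both ~ [0.75, 1.5, 2.25]
```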

1 Answer


Actually, $(Y_{(1)},Y_{(n)})$ is not a complete statistic for $U(-\phi,\phi)$.

Proof:

Method 1

If $(Y_{(1)},Y_{(n)})$ were a complete statistic, then, since it is sufficient, it would also be minimal sufficient by Bahadur's theorem.

We know that a minimal sufficient statistic for $U(-\phi,\phi)$ is $\max\{-Y_{(1)},Y_{(n)}\}$. Since any two minimal sufficient statistics stand in one-to-one correspondence, there would have to be a one-to-one map between $\max\{-Y_{(1)},Y_{(n)}\}$ and $(Y_{(1)},Y_{(n)})$ if the latter were minimal sufficient. However, this is impossible, since two different values of $(y_{(1)},y_{(n)})$ can have the same $\max\{-y_{(1)},y_{(n)}\}$: e.g. for $(y_{(1)},y_{(n)})=(-1,0)$ and for $(y_{(1)},y_{(n)})=(0,1)$, $\max\{-y_{(1)},y_{(n)}\}$ equals $1$ in both cases.
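
The failure of injectivity fits in a few lines of code (a throwaway sketch; $Q$ is the helper function suggested in the comments below):

```python
# Q(x, y) = max{-x, y}: the minimal sufficient statistic evaluated at
# (y_(1), y_(n)); two distinct inputs give the same output.
def Q(x, y):
    return max(-x, y)

print(Q(-1, 0), Q(0, 1))  # both print 1, so the map is not one-to-one
```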

Method 2

We can write the p.d.f. of $Y_{(n)}$ as $$f_{Y_{(n)}}(y_n)=\frac{n}{(2\phi)^n}(y_n+\phi)^{n-1},\quad y_n\in[-\phi,\phi],$$ and the p.d.f. of $Y_{(1)}$ as $$f_{Y_{(1)}}(y_1)=\frac{n}{(2\phi)^n}(\phi-y_1)^{n-1},\quad y_1\in[-\phi,\phi].$$ Thus $$E_{\phi}[Y_{(n)}]=\frac{\phi(n-1)}{n+1}\text{ and }E_{\phi}[Y_{(1)}]=\frac{\phi(1-n)}{n+1}.$$ This means that if we let $g(x,z)=\frac{n+1}{n-1}z-\frac{n+1}{1-n}x$ (which is just a constant multiple of $x+z$), we get $$\mathbb{E}_{\phi}[g(Y_{(1)},Y_{(n)})]=0\ \ \forall\phi>0\text{ but } g\not\equiv0,$$ which means $(Y_{(1)},Y_{(n)})$ is not complete.
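
A Monte Carlo illustration of this counterexample (a sketch assuming numpy; $n$ and the grid of $\phi$ values are arbitrary):

```python
import numpy as np

# Check that E_phi[g(Y_(1), Y_(n))] ~ 0 for several phi, even though g is
# not identically zero; g is the function from Method 2 above.
rng = np.random.default_rng(2)
n = 4

def g(x, z):
    return (n + 1) / (n - 1) * z - (n + 1) / (1 - n) * x

for phi in (0.5, 1.0, 5.0):
    Y = rng.uniform(-phi, phi, size=(500_000, n))
    vals = g(Y.min(axis=1), Y.max(axis=1))
    print(phi, vals.mean())  # each mean ~ 0, but vals itself is not ~ 0
```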

Tan
  • I really like this post, although I did find it a bit confusing when you said that $\max(0,1)$ and $\max(-1,0)$ are both $1$. Can you define a function $Q(x,y)=\max\{-x,y\}$ and instead say that $Q(0,1)$ and $Q(-1,0)$ are both $1$? – Matthew H. Jan 08 '21 at 03:56
  • Thanks for pointing this out. I've updated my answer, @MatthewPilling, and your $Q$ function is what I meant. – Tan Jan 08 '21 at 04:34