
Show that for a random sample of size $n$ from the distribution $f(x)=e^{-(x-\theta)},\ x>\theta$, the statistic $2n[X_{(1)}-\theta]$ has the $\chi^2_{2}$ distribution, and that $2\sum_{i=2}^{n}[X_{(i)}-X_{(1)}]$ has the $\chi^2_{2n-2}$ distribution and is independent of the first statistic. Here, $X_{(i)}$ denotes the $i$th order statistic.

My approach:

I did the following series of transformations: $(X_1,X_2,\ldots,X_n) \rightarrow (Y_1,Y_2,\ldots,Y_n) \rightarrow (Y_{(1)},Y_{(2)},\ldots,Y_{(n)}) \rightarrow (U_1,U_2,\ldots,U_n)$

where $Y_i=X_i-\theta$, $U_1=2nY_{(1)}$ and $U_{i}=2(Y_{(i)}-Y_{(1)})$ for $i=2,3,\ldots,n$.

So, first, the joint pdf of $X_1,X_2,\ldots,X_n$ is given by

$f(x_1,x_2,...x_n)=e^{-\sum_{i=1}^{n}(x_i-\theta)} I_{x_i > \theta}$

Again, you can see that $f(y_1,y_2,\ldots,y_n)=e^{-\sum y_i} I_{y_i>0}$. The joint pdf of the order statistics is then $f_{1,2,\ldots,n}(y_1,\ldots,y_n)=n!\,e^{-\sum y_i} I_{y_1<y_2<\cdots<y_n}$. Transforming to $U$, the Jacobian of the transformation comes out to be $\frac{1}{n2^n}$, and thus $f(u_1,u_2,\ldots,u_n)=\frac{(n-1)!}{2^n}e^{-\sum u_i/2}$ on the region $u_1>0$, $0<u_2<\cdots<u_n$. From here I can deduce $U_1 \sim \chi^2_{2}$, but I cannot deduce anything about the remaining statistic. Help!
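For reference, here is the Jacobian computation written out, so any mistake in it should be visible:

$$\frac{\partial(u_1,u_2,\ldots,u_n)}{\partial(y_{(1)},y_{(2)},\ldots,y_{(n)})}=
\begin{pmatrix}
2n & 0 & \cdots & 0\\
-2 & 2 & & \\
\vdots & & \ddots & \\
-2 & 0 & \cdots & 2
\end{pmatrix},
\qquad
|\det|=2n\cdot2^{n-1}=n2^n,$$

so $\bigl|\det\partial(y_{(1)},\ldots,y_{(n)})/\partial(u_1,\ldots,u_n)\bigr|=\frac{1}{n2^n}$, and $\sum_i y_{(i)}=ny_{(1)}+\sum_{i\ge2}\bigl(y_{(i)}-y_{(1)}\bigr)=\frac12\sum_i u_i$.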

  • You know the PDF of $(U_2,\ldots,U_n)$ and you are after the PDF of $V=U_2+\cdots+U_n$, right? There are entirely standard techniques to do that, did you try one? – Did May 03 '18 at 06:25
  • But that is not working. Did I make any mistakes in the above calculations? Because a factor of $(n-1)!$ is coming in and I don't know how to handle it. –  May 03 '18 at 06:31
  • Sorry but what is not working? Once again, if your computations are correct, you know that the PDF of $(U_1,U_2,\ldots,U_n)$ is $$f(u_1,u_2,\ldots,u_n)=2^{-n}(n-1)!\exp(-\tfrac12(u_1+u_2+\cdots+u_n))\mathbf 1_{u_1>0}\mathbf 1_{0<u_2<u_3<\cdots<u_n}$$ and you are after the PDF of $$V=U_2+\cdots+U_n$$ Please describe precisely that which is stopping you. – Did May 03 '18 at 06:57
  • Your joint pdf of $(X_1,\cdots,X_n)$ is not correct. – StubbornAtom May 03 '18 at 07:06
  • I am stuck because $\sum_{i=2}^{n} u_i$ is supposed to follow chi-squared $2n-2$ which is not coming by my calculations –  May 03 '18 at 07:23
  • "which is not coming by my calculations" Again: show these. – Did May 03 '18 at 09:59
  • Ok, let $(u_2,u_3,\ldots,u_n) \rightarrow (v_2,v_3,\ldots,v_n)$ where $v_2=\sum_{i=2}^{n} u_i$, $v_3=u_3,\ldots,v_n=u_n$. The Jacobian is $1$, so we get the joint pdf as $\frac{(n-1)!}{2^{n-1}}e^{-v_2/2}$. What to do next? This is not close to a chi-square pdf. –  May 03 '18 at 10:01
  • Orton's still in WWE? If so, retiring soon? Or what? – BCLC May 03 '18 at 11:33
  • Lol, yeah he is still in WWE . Can you kill my doubts?! –  May 03 '18 at 11:40

1 Answer


I think an easier-to-follow (and simpler) proof uses a different change of variables.

We have the joint density of the order statistics $(U_1=X_{(1)},\cdots,U_n=X_{(n)})$

$$f_{\mathbf U}(u_1,\cdots,u_n)=n!\exp\left[-\sum_{i=1}^nu_i+n\theta\right]\mathbf1_{\theta<u_1<u_2<\cdots<u_n}$$

Now transform $(U_1,\cdots,U_n)\to(Y_1,\cdots,Y_n)$ with $Y_i=(n-i+1)(U_i-U_{i-1})$ for $i=1,2,\ldots,n$, taking $U_0=\theta$.

It follows that $\sum_{i=1}^n u_i=\sum_{i=1}^n y_i+n\theta$, and the Jacobian determinant $\left|\det\dfrac{\partial(y_1,\ldots,y_n)}{\partial(u_1,\ldots,u_n)}\right|$ comes out as $n!$, which cancels the $n!$ in $f_{\mathbf U}$.
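Writing out these two steps, in case they are not obvious: the sum telescopes,

$$\sum_{i=1}^n y_i=\sum_{i=1}^n(n-i+1)(u_i-u_{i-1})=\sum_{i=1}^n u_i-n\theta,$$

and the matrix $\partial(y_1,\ldots,y_n)/\partial(u_1,\ldots,u_n)$ is lower triangular with diagonal entries $n,n-1,\ldots,1$, so its determinant is $n!$.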

So you get the joint density of $(Y_1,\cdots,Y_n)$

$$f_{\mathbf Y}(y_1,\cdots,y_n)=\exp\left[-\sum_{i=1}^ny_i\right]\mathbf1_{y_1,\cdots,y_n>0}$$

Not surprisingly, the (normalized) spacings of successive order statistics from an exponential sample come out as independent. In fact, the $Y_i$'s are i.i.d. exponential with mean $1$ for $i=1,2,\ldots,n$.

This implies $2Y_i\stackrel{\text{i.i.d.}}{\sim}\chi^2_2$ for $i=1,2,\ldots,n$.
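To see this directly (it is just a one-line change of variables): if $Y\sim\operatorname{Exp}(1)$ with density $e^{-y}$ on $y>0$, then $T=2Y$ has density

$$f_T(t)=\tfrac12 e^{-t/2},\qquad t>0,$$

which is exactly the $\chi^2_2$ density.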

So we have two independent variables, $2Y_1$ and $\sum_{i=2}^n2Y_i$. Both have chi-square distributions: the former with $2$ degrees of freedom and the latter with $2n-2$ degrees of freedom.

It now only remains to observe that $2Y_1=2n(X_{(1)}-\theta)$ and $2\sum_{i=2}^nY_i=2\sum_{i=2}^n(X_{(i)}-X_{(1)})$.
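If you want a quick numerical sanity check of both claims, here is a minimal Monte Carlo sketch (not part of the proof; it assumes NumPy and SciPy are available, and the choices $n=5$, $\theta=2$ and the variable names are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, theta, reps = 5, 2.0, 200_000

# Sample from f(x) = exp(-(x - theta)), x > theta, i.e. X = theta + Exp(1)
x = theta + rng.exponential(scale=1.0, size=(reps, n))
x.sort(axis=1)  # each row now holds the order statistics X_(1) <= ... <= X_(n)

t1 = 2 * n * (x[:, 0] - theta)                  # claimed chi^2_2
t2 = 2 * np.sum(x[:, 1:] - x[:, [0]], axis=1)   # claimed chi^2_{2n-2}

# Kolmogorov-Smirnov tests against the claimed chi-square laws
print(stats.kstest(t1, "chi2", args=(2,)))
print(stats.kstest(t2, "chi2", args=(2 * n - 2,)))

# A necessary condition for independence: sample correlation close to 0
print(np.corrcoef(t1, t2)[0, 1])
```

For large `reps`, both KS statistics should be tiny and the correlation should be close to zero.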

StubbornAtom
  • Independence of $\sum_{i=1}^n(X_i-X_{(1)})=\sum_{i=1}^n(X_{(i)}-X_{(1)})$ and $X_{(1)}$ can also be argued using Basu's theorem: https://math.stackexchange.com/q/3206984/321264. – StubbornAtom Jun 20 '20 at 06:24
  • Great answer. One question, it seems that $Y_i\sim \text{Gamma}(1,1)=2\text{Gamma}(2/2,1/2)$, thus $Y_i/2\overset{i.i.d}{\sim}\chi_2^2$, not $2Y_i$. – Tan Jan 13 '21 at 19:03
  • @Tan $Y_i\sim \text{Exp}(1)\implies 2Y_i\sim \text{Exp}(\text{mean }2)\equiv\chi^2_2 $. – StubbornAtom Jan 13 '21 at 19:32