Let $X_1, \ldots, X_n$ be iid with common pdf $$ f(x;\theta,\gamma) = \frac{\theta x^{\theta-1}}{\gamma^{\theta}}1(0<x<\gamma);\,\, \theta, \gamma >0$$ and let $T_1, \ldots, T_n$ be the corresponding order statistics. Further let $\displaystyle (S,T) = \left(\prod_{i=1}^{n-1}T_i,\,T_n\right)$. Show that $(S,T)$ is sufficient for this model.
I have two questions:
- Solving this:
the joint pdf of the order statistics is \begin{align} & f(\vec{t};\theta,\gamma) \\[8pt] = {} & n!\frac{\theta t_1^{\theta-1}}{\gamma^{\theta}}1(0<t_1<\gamma)\cdots \frac{\theta t_n^{\theta-1}}{\gamma^{\theta}}1(0<t_n<\gamma)\,1(t_1<\cdots<t_n) \\[8pt] = {} & n!\frac{\theta^n(t_1\cdots t_n)^\theta}{\gamma^{n\theta}(t_1\cdots t_n)}1(0<t_1<\cdots<t_n<\gamma) \\[8pt] = {} & n!\frac{\theta^n(s\cdot t)^\theta}{\gamma^{n\theta}(s\cdot t)}1(0<t_1<\cdots<t_n)\,1(0<t<\gamma) \\[8pt] = {} & \underbrace{\frac{n!}{s\cdot t}\,1(0<t_1<\cdots<t_n)}_{h(\vec{t})}\cdot \underbrace{\frac{\theta^n(s\cdot t)^\theta}{\gamma^{n\theta}}1(0<t<\gamma)}_{g(s,t;\,\theta,\gamma)} \end{align} Note that the indicator has to be split: the ordering part $1(0<t_1<\cdots<t_n)$ is free of the parameters, so it belongs in $h$, while $1(t_n<\gamma)=1(0<t<\gamma)$ depends only on $(t,\gamma)$ and so belongs in $g$.
This shows the sufficiency by the Factorization theorem, correct?
- My second question has to do with conditional probability, specifically when working with a continuous distribution, so I will use the model above. By definition, $(S,T)$ is sufficient if the conditional distribution of the data given $(S,T)$ does not depend on the parameters. I always have difficulty computing a conditional distribution in these cases. Namely, I know that $\displaystyle f(\vec{t} \mid s,t ) = \frac{f_{\vec{T},(S,T)}(\vec{t},(s,t))}{f_{(S,T)}(s,t)} $
But can someone show me how to actually derive this in this case? I know that $f_{(S,T)}(s,t)$ will follow from the transformation-of-random-variables method. The issue I run into is trying to compute what $f_{\vec{T},(S,T)}(\vec{t},(s,t))$ would be.
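As a sanity check on the algebra in the first question, here is a minimal numerical sketch (the parameter values `theta`, `gamma`, `n` are my own arbitrary choices): it evaluates the joint order-statistic density directly and via the proposed $h\cdot g$ factorization at a random point of the support, with the $\gamma$-dependent indicator $1(0<t<\gamma)$ kept inside $g$.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
theta, gamma, n = 2.5, 4.0, 6  # hypothetical parameter values

# A random ordered point in the support (0 < t_1 < ... < t_n < gamma),
# drawn by inverse-CDF sampling since F(x) = (x/gamma)**theta.
t_vec = np.sort(gamma * rng.uniform(size=n) ** (1 / theta))
s, t = t_vec[:-1].prod(), t_vec[-1]  # the statistic (S, T)

# Direct joint pdf of the order statistics:
# n! * theta^n * (t_1 ... t_n)^(theta - 1) / gamma^(n*theta)
joint = factorial(n) * theta**n * np.prod(t_vec) ** (theta - 1) / gamma ** (n * theta)

# Factorization: h depends only on the data; g only on (s, t) and the parameters.
h = factorial(n) / (s * t)                               # indicator 1(0 < t_1 < ... < t_n) is 1 here
g = theta**n * (s * t) ** theta / gamma ** (n * theta)   # indicator 1(0 < t < gamma) is 1 here

print(joint, h * g)
```

On any point of the support the two evaluations agree, which is exactly what the Factorization Theorem requires.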
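For the transformation step in the second question, one piece that is easy to sanity-check by simulation is the marginal of $T=T_n$: the maximum of $n$ iid draws has CDF $F(t)^n=(t/\gamma)^{n\theta}$ on $(0,\gamma)$. A quick Monte Carlo sketch (again with arbitrary parameter values of my own) comparing the empirical CDF of $T$ with this closed form:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, gamma, n, reps = 2.0, 3.0, 5, 100_000  # hypothetical values

# Simulate T = max(X_1, ..., X_n) over many replications,
# using inverse-CDF sampling X = gamma * U**(1/theta).
x = gamma * rng.uniform(size=(reps, n)) ** (1 / theta)
T = x.max(axis=1)

# P(T <= t) = (t/gamma)^(n*theta); compare at a few points.
for t0 in (1.0, 2.0, 2.5):
    print(t0, (T <= t0).mean(), (t0 / gamma) ** (n * theta))
```

The same approach can be used to spot-check whatever joint density for $(S,T)$ the change-of-variables computation produces.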