Say we have $U_1, \dots, U_n$ i.i.d. random variables uniform on $[0,1]$ and $Y_1, \dots, Y_{n+1}$ i.i.d. random variables with $Y_i \sim \text{Exp}(1)$. I know that the vector of order statistics $(U_{(1)}, \dots, U_{(n)})$ is equal in distribution to $\left(\frac{Y_1}{\sum_{i=1}^{n+1} Y_i}, \frac{Y_1+Y_2}{\sum_{i=1}^{n+1} Y_i}, \dots, \frac{Y_1+\dots+Y_n}{\sum_{i=1}^{n+1} Y_i}\right)$. I can prove the result using a change of variables, Jacobians, etc., but this is rather tedious. Is there a more elegant way of deriving this statement? Maybe something with Poisson processes?
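Not a proof, but the identity is easy to sanity-check numerically. A minimal Monte Carlo sketch (standard library only, with an illustrative choice of $n = 5$) comparing the mean of $U_{(1)}$ with the mean of $Y_1/\sum_{i=1}^{n+1} Y_i$; both should be close to $1/(n+1)$:

```python
import random

random.seed(0)

n = 5
trials = 20000

# Compare the smallest order statistic U_(1) of n uniforms with
# Y_1 / (Y_1 + ... + Y_{n+1}) for i.i.d. Exp(1) variables Y_i.
# If the distributional identity holds, both have mean 1/(n+1).
order_stat_samples = []
exp_ratio_samples = []
for _ in range(trials):
    u = sorted(random.random() for _ in range(n))
    order_stat_samples.append(u[0])

    y = [random.expovariate(1.0) for _ in range(n + 1)]
    exp_ratio_samples.append(y[0] / sum(y))

mean_u = sum(order_stat_samples) / trials
mean_r = sum(exp_ratio_samples) / trials
print(f"{mean_u:.3f} {mean_r:.3f}")  # both close to 1/6
```

The same comparison can of course be run for any coordinate of the two vectors, not just the first.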
1 Answer
You are right that you can think of this problem in terms of Poisson processes. In a Poisson process of rate $1$, the time of the first arrival after $t = 0$ and the subsequent inter-arrival times are independent $\text{Exp}(1)$ random variables, so $Y_1$, $Y_1 + Y_2$, $\ldots$, $Y_1 + Y_2 + \cdots + Y_{n+1}$ can be taken to be the times of the first, second, $\ldots$, $(n+1)$-th arrivals after $t = 0$ in the process. The random variables $\frac{Y_1}{\sum_{i=1}^{n+1} Y_i}, \frac{Y_1+Y_2}{\sum_{i=1}^{n+1} Y_i}, \dots, \frac{Y_1+\dots+Y_n}{\sum_{i=1}^{n+1} Y_i}$ that you are looking at are the first $n$ arrival times "normalized" so that the $(n+1)$-th arrival lands at time $1$.
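This construction of normalized arrival times can be sketched in a few lines (a hypothetical single draw, using only the standard library):

```python
import itertools
import random

random.seed(1)
n = 4

# Inter-arrival times of a rate-1 Poisson process are i.i.d. Exp(1);
# the k-th arrival time is the partial sum Y_1 + ... + Y_k.
y = [random.expovariate(1.0) for _ in range(n + 1)]
arrivals = list(itertools.accumulate(y))  # S_1, ..., S_{n+1}

# Normalize the first n arrival times by the (n+1)-th arrival time,
# so the (n+1)-th arrival lands at time 1.
normalized = [s / arrivals[-1] for s in arrivals[:n]]
print(normalized)  # n increasing points in (0, 1)
```

By the claimed identity, `normalized` is a draw from the joint distribution of the order statistics of $n$ i.i.d. $\text{Uniform}(0,1)$ variables.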
For $0 < t_1 < t_2 < \dots < t_n < 1$, the conditional probability that there is one arrival in each interval $(t_i, t_i + \Delta t_i)$ and none in the remaining time of total length $1 - \sum_i \Delta t_i$, given that there are $n$ arrivals in $(0, 1)$, is approximately $$ \begin{align*} \frac{\exp\left(-\left(1 - \sum_i \Delta t_i\right)\right) \prod_{i=1}^n \exp(-\Delta t_i)\,\Delta t_i/1!}{\exp(-1)\frac{1^n}{n!}} &= n!\, \Delta t_1\Delta t_2 \cdots \Delta t_n\\ &= f_{U_{(1)}, \dots, U_{(n)}}(t_1, t_2, \ldots , t_n)\,\Delta t_1\Delta t_2 \cdots \Delta t_n. \end{align*} $$ Letting the $\Delta t_i \to 0$, it follows that conditional on there being $n$ arrivals in $(0,1)$, the arrival times have the joint density of the order statistics $(U_{(1)}, \dots, U_{(n)})$, which is the claimed equality in distribution.
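For completeness, the $n!$ appearing above is exactly the joint density of the uniform order statistics, which can be read off by a symmetry argument:

```latex
% Joint density of the order statistics of n i.i.d. Uniform(0,1) variables:
% each ordered point (t_1, ..., t_n) with 0 < t_1 < ... < t_n < 1 arises
% from exactly n! orderings of (U_1, ..., U_n), each with joint density 1, so
f_{U_{(1)}, \dots, U_{(n)}}(t_1, \dots, t_n)
    = n! \prod_{i=1}^{n} \mathbf{1}_{[0,1]}(t_i)
    = n!
    \qquad \text{for } 0 < t_1 < \cdots < t_n < 1 .
```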