Let $X_1$ and $X_2$ be independent, $\text{Exp}(a)$-distributed random variables. Show that $X_{(1)}$ and $X_{(2)}-X_{(1)}$ are independent, and determine their distributions.
Although this looks like a duplicate (old question), I have a different question: I'm mainly concerned with the bounds of integration.
I found the joint to be: $$f_{U, V}(u,v)=2a^2e^{-a(2u+v)}, \quad \text{ with } \quad U=X_{(1)} , V= X_{(2)} - X_{(1)} $$
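As a sanity check on this joint density (assuming the support is $u \ge 0,\ v \ge 0$, which is exactly the part I'm unsure about), it does integrate to $1$:

$$\int_0^\infty \int_0^\infty 2a^2 e^{-a(2u+v)}\,dv\,du = \int_0^\infty 2a^2 e^{-2au}\cdot\frac{1}{a}\,du = \int_0^\infty 2a e^{-2au}\,du = 1.$$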
I think the bounds are found as follows: writing $X_{(1)}=U$, I automatically get $0\le U<\infty$.
But for $V$, I do the analogous thing with $X_{(2)}=V+U$: from $0\le V+U<\infty$ I conclude $-V\le U<\infty$.
Sorry for how basic this question is; I suspect this is why I'm getting:
$X_{(2)} - X_{(1)}\sim \text{Exp}(a)$, which is true, but also $f_U(u)=2ae^{-au}$ instead of the expected $f_U(u)=\frac{a}{2}e^{-\frac{a}{2}u}$.
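Not part of my derivation, but here is a quick Monte Carlo sanity check I ran (just a sketch; I'm assuming the rate parameterization, i.e. $\text{Exp}(a)$ has density $ae^{-ax}$, which is what `random.expovariate(a)` uses):

```python
import random

random.seed(42)
a = 2.0          # rate parameter (assumption: Exp(a) means "rate a")
n = 200_000

mins, diffs = [], []
for _ in range(n):
    x1 = random.expovariate(a)
    x2 = random.expovariate(a)
    mins.append(min(x1, x2))    # U = X_(1)
    diffs.append(abs(x1 - x2))  # V = X_(2) - X_(1)

mean_u = sum(mins) / n   # should be near 1/(2a) if U ~ Exp(2a) (rate convention)
mean_v = sum(diffs) / n  # should be near 1/a if V ~ Exp(a)
# Sample covariance; a value near 0 is consistent with independence.
cov_uv = sum(ui * vi for ui, vi in zip(mins, diffs)) / n - mean_u * mean_v
print(mean_u, mean_v, cov_uv)
```

Under the rate convention, the simulated means match $U\sim\text{Exp}(2a)$ and $V\sim\text{Exp}(a)$, so at least part of my confusion seems to be about bounds/parameterization rather than the distributions themselves.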
Questions:
How do I find the bounds of integration, and is my way of doing it correct? Furthermore, how can one work with these problems without having to do the transformations with Jacobians?