Let $X_1,X_2,\ldots,X_n$ be a random sample from an exponential distribution with mean $1$.
Then the joint probability density of the order statistics $X_{(1)},X_{(2)},\ldots,X_{(n)}$ is
$$f_{X_{(1)},X_{(2)},\ldots,X_{(n)}}(x_1,x_2,\ldots,x_n)= n!\, e^{-\sum_{i=1}^{n}x_i},\quad 0\leq x_1\leq x_2\leq \cdots \leq x_n < \infty.$$
Consider the transformation
$$Y_1=nX_{(1)}, Y_2=(n-1)(X_{(2)}-X_{(1)}), Y_3=(n-2)(X_{(3)}-X_{(2)}),\ldots,Y_n= X_{(n)}-X_{(n-1)}$$
$$\Rightarrow X_{(1)}=\frac{Y_1}{n}, X_{(2)}=\frac{Y_1}{n}+\frac{Y_2}{n-1},\ldots, X_{(n)}=\frac{Y_1}{n}+\frac{Y_2}{n-1}+\frac{Y_3}{n-2}+\cdots+Y_n$$
The Jacobian of this transformation is $\frac{1}{n!}$, since the matrix $\left(\partial x_i/\partial y_j\right)$ is lower triangular with diagonal entries $\frac{1}{n},\frac{1}{n-1},\ldots,1$.
So the joint probability density function of $Y_1,Y_2,\ldots,Y_n$ is given by
$$f_{Y_1,Y_2,\ldots,Y_n}(y_1,y_2,\ldots,y_n)= e^{-\sum_{i=1}^n y_i},\quad 0\leq y_1,y_2,\ldots,y_n < \infty.$$
By the factorization theorem, $Y_1,Y_2,\ldots,Y_n$ are independently and identically distributed as exponential variates with mean $1$; in particular,
$$Y_i=(n-i+1)\left(X_{(i)}-X_{(i-1)}\right) \stackrel{\text{iid}}{\sim} \operatorname{Exp}(1),\quad i=2,3,\ldots,n.$$
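A quick numerical sanity check (not part of the proof, using NumPy with an arbitrary choice of $n=5$): the normalized spacings of a sorted $\operatorname{Exp}(1)$ sample should each behave like fresh $\operatorname{Exp}(1)$ draws, so every column below should have sample mean and variance close to $1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# each row is one sorted Exp(1) sample of size n
x = np.sort(rng.exponential(1.0, size=(trials, n)), axis=1)

# gaps X_(i) - X_(i-1), with the convention X_(0) = 0
gaps = np.diff(x, axis=1, prepend=0.0)

# Y_i = (n - i + 1) * (X_(i) - X_(i-1)); column i (0-based) gets factor n - i
y = gaps * (n - np.arange(n))

print(y.mean(axis=0))  # each entry close to 1, the Exp(1) mean
print(y.var(axis=0))   # each entry close to 1, the Exp(1) variance
```

With $200{,}000$ trials the Monte Carlo error in each column is on the order of $1/\sqrt{200000}\approx 0.002$, so the agreement is quite tight.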
Hence, by telescoping, $\sum_{i=2}^{n} Y_i= \sum_{i=1}^n X_{(i)} - nX_{(1)} = \sum_{i=1}^n\left(X_i-X_{(1)}\right)$ is a sum of $(n-1)$ independent $\operatorname{Exp}(1)$ variates, so $\sum_{i=1}^n\left(X_i-X_{(1)}\right)\sim \operatorname{Gamma}(n-1,1)$.
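The conclusion can also be checked by simulation (a sketch, with an arbitrary choice of $n=6$): $S=\sum_{i=1}^n\left(X_i-X_{(1)}\right)$ should match a $\operatorname{Gamma}(n-1,1)$ variate, whose mean and variance are both $n-1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 6, 200_000

# raw (unsorted) Exp(1) samples, one row per trial
x = rng.exponential(1.0, size=(trials, n))

# S = sum_i (X_i - X_(1)) = sum_i X_i - n * min_i X_i
s = x.sum(axis=1) - n * x.min(axis=1)

print(s.mean())  # close to n - 1 = 5, the Gamma(5, 1) mean
print(s.var())   # close to n - 1 = 5, the Gamma(5, 1) variance
```

One could go further and compare the empirical distribution of `s` against `scipy.stats.gamma(n - 1)` with a Kolmogorov–Smirnov test, but matching the first two moments already makes the point.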
Ref: *Order Statistics and Inference: Estimation Methods* by Balakrishnan and Cohen.
https://www.amazon.com/Order-Statistics-Inference-Estimation-Methods/dp/149330738X