Let's say that we have a sample $X = (X_1,\dots,X_n)$ from the distribution given by (or rather "derived from", or maybe "defined by"? How should I phrase it correctly in English? I would be grateful for advice in the comments):
$$f(x) = \frac{1}{\sigma}\exp\left\{-\frac{x-m}{\sigma} \right \}\mathbf{1}_{(m,\infty)}(x)$$
Since the observations are independent, the density of the whole sample is the product $\prod_{i=1}^{n} f(X_i)$, which gives
$$f(X) = \frac{1}{\sigma^n}\exp \left \{{\frac{mn}{\sigma}}\right\}\exp\left\{-\frac{n \overline{X}}{\sigma} \right \}\mathbf{1}_{(m,\infty)}(X_{1:n}),$$
where $X_{1:n} = X_{(1)} = \min\{X_1,\dots,X_n\}$ (I am not sure how this is usually denoted in the English literature).
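To make the application of the Factorization Theorem explicit, one way to group the factors (the grouping is my own; the natural choice here seems to be $h \equiv 1$) is
$$f(X) = \underbrace{\frac{1}{\sigma^n}\exp\left\{\frac{mn}{\sigma}\right\}\exp\left\{-\frac{n \overline{X}}{\sigma}\right\}\mathbf{1}_{(m,\infty)}(X_{1:n})}_{g_\theta\left(X_{1:n},\,\overline{X}\right)} \cdot \underbrace{1}_{h(X)},$$
so the density depends on the data only through the pair $(X_{1:n},\overline{X})$.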
From this we can see (by the Factorization Theorem) that we need both $X_{1:n}$ and, for example, $\overline{X}$ to form a sufficient statistic for the parameter $\theta=(m,\sigma)$. However, how do we write it down?
Is it $T(X) = (X_{1:n},\overline{X})$, because $X_{1:n}$ would be the sufficient statistic for $m$ if $\sigma$ were known, so it goes first to match the order of $\theta=(m,\sigma)$? Or does the order not matter, so that we are free to write $T(X) = (\overline{X},X_{1:n})$? Or is there a reason to prefer the second form?
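Either way, here is a small numerical sketch (my own, in Python with NumPy; the function names are just illustrative) that I used to convince myself that the likelihood can be evaluated from the pair alone, so the order in which the two components are written is only a labelling choice:

```python
import numpy as np

def sufficient_stat(x):
    """T(X) = (X_{1:n}, X-bar); the tuple order is only a labelling choice."""
    x = np.asarray(x, dtype=float)
    return x.min(), x.mean()

def log_likelihood(t, n, m, sigma):
    """Log of f(X) = sigma^{-n} exp{nm/sigma} exp{-n*xbar/sigma} 1{x_min > m},
    written as a function of the data only through T(X) = (x_min, xbar)."""
    x_min, x_bar = t
    if x_min <= m:
        return -np.inf  # the indicator 1_(m, inf)(X_{1:n}) is zero
    return -n * np.log(sigma) + n * m / sigma - n * x_bar / sigma

# quick check on simulated data (m = 1 and sigma = 2 are arbitrary choices)
rng = np.random.default_rng(0)
x = 1.0 + rng.exponential(scale=2.0, size=50)
print(log_likelihood(sufficient_stat(x), n=len(x), m=1.0, sigma=2.0))
```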
Extra question:
If we consider the ratio $\frac{f(X)}{f(Y)}$ for two samples $X$ and $Y$, we quickly see that $T(X)$ is a minimal sufficient statistic. Can we also somehow "quickly" tell whether it is a complete statistic, as we can for certain types of exponential families (where the Lehmann–Scheffé theorem applies)?
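For the record, the computation I have in mind is
$$\frac{f(X)}{f(Y)} = \exp\left\{-\frac{n\left(\overline{X}-\overline{Y}\right)}{\sigma}\right\}\frac{\mathbf{1}_{(m,\infty)}(X_{1:n})}{\mathbf{1}_{(m,\infty)}(Y_{1:n})},$$
which is free of $\theta=(m,\sigma)$ exactly when $\overline{X}=\overline{Y}$ and $X_{1:n}=Y_{1:n}$, i.e. when $T(X)=T(Y)$.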
Are there some nice "tricks" for showing that completeness fails for distributions like this one, i.e. when the support of the density depends on the parameter?
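To be precise, the property I would like to verify or disprove is the usual definition of completeness:
$$\mathbb{E}_\theta\!\left[g\!\left(T(X)\right)\right] = 0 \ \text{ for all } \theta=(m,\sigma) \quad\Longrightarrow\quad g\!\left(T(X)\right)=0 \ \text{ a.s. for every } \theta.$$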