I see how the MLE of the upper bound $b$ of a uniform distribution $U[0,b]$ is obtained when $n$ draws are observed.
Now I wonder the following: what is the maximum likelihood estimator of $b$ when the $n$ observations are instead draws of the minimum of two independent values from $U[0,b]$?
I would start by specifying the distribution of the minimum, writing $F(x) = x/b$ for the CDF of $U[0,b]$:
$$ F_{\min}(x;b ) = \begin{cases} 0 & \text{if } x < 0 \\ 1-(1-F(x))^2 & \text{if } 0 \le x \le b \\ 1 & \text{if } x > b \end{cases}$$
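To make the next step explicit, substituting $F(x) = x/b$ into the middle branch gives
$$ 1 - \left(1 - \frac{x}{b}\right)^2 = \frac{2x}{b} - \frac{x^2}{b^2} = \frac{x(2b - x)}{b^2}, \qquad 0 \le x \le b. $$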
from which we can derive the pdf
$$ f_{\min}(x;b) = \begin{cases} 0 & \text{if } x < 0 \\ \frac{2(b-x)}{b^2} & \text{if } 0 \le x \le b \\ 0 & \text{if } x > b \end{cases}$$
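Indeed, differentiating the expanded middle branch above with respect to $x$ recovers exactly this density:
$$ \frac{d}{dx}\left( \frac{2x}{b} - \frac{x^2}{b^2} \right) = \frac{2}{b} - \frac{2x}{b^2} = \frac{2(b-x)}{b^2}. $$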
Thus the log-likelihood function is:
$$ L(b) = \sum_{i=1}^n \log f_{\min}(x_i; b) $$
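Written out explicitly (valid only when all observations satisfy $x_i \le b$), this is
$$ L(b) = n \log 2 + \sum_{i=1}^n \log(b - x_i) - 2n \log b, \qquad b \ge \max_i x_i. $$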
As in the standard case, we need $\hat b \ge \max_i x_i$; moreover, at an interior maximum it has to hold that
$$ \frac{\partial L(b)}{\partial b} = \sum_{i=1}^n - \frac{2(b-x_i)}{b^3} = 0 $$
which is solved by $b = \frac{2 \sum_{i=1}^n x_i}{n} $.
Therefore I would obtain:
$$ \hat b = \max\left( \max_i x_i,\ \frac{2 \sum_{i=1}^n x_i}{n} \right)$$
Is this correct?
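For reference, here is a small Python sketch I would use to compare this candidate against a brute-force grid maximization of the log-likelihood; the true $b$, sample size, seed, grid resolution and helper names below are arbitrary choices of mine, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)      # arbitrary seed
b_true, n = 5.0, 1_000              # arbitrary true upper bound and sample size

# each observation is the minimum of two independent U[0, b_true] draws
x = rng.uniform(0.0, b_true, size=(n, 2)).min(axis=1)

def log_likelihood(b, x):
    """Log-likelihood under f_min(x; b) = 2(b - x) / b^2; -inf when b <= max(x)."""
    if b <= x.max():
        return -np.inf
    return np.sum(np.log(2.0) + np.log(b - x) - 2.0 * np.log(b))

# candidate estimator from the derivation above
b_candidate = max(x.max(), 2.0 * x.mean())

# brute-force grid search over admissible values of b (upper end is a heuristic)
grid = np.linspace(x.max() + 1e-9, 3.0 * x.max(), 5_000)
ll = np.array([log_likelihood(b, x) for b in grid])
b_grid = grid[np.argmax(ll)]

print(f"candidate estimator : {b_candidate:.4f}")
print(f"grid-search maximum : {b_grid:.4f}")
```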