Prove that if $a, b \in \mathbb{R}$ are positive, then $a \leq b$ implies $\sqrt a \leq \sqrt b$.
I have seen proofs of this problem in a couple of places. They use the property $0 \leq b-a = (\sqrt b + \sqrt a)(\sqrt b - \sqrt a)$ and then divide by $(\sqrt a + \sqrt b)$ to get the result. But I don't think this is a valid proof, because even though $a, b$ are positive, the square roots of positive numbers could be negative, so $(\sqrt a + \sqrt b)$ could be negative as well. In that case, dividing by $(\sqrt a + \sqrt b)$ would flip the inequality and instead give $\sqrt a \geq \sqrt b$. Am I missing something here?
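To spell out the argument I am referring to (using $(\sqrt x)^2 = x$), it goes roughly like this:
$$0 \leq b - a = (\sqrt b)^2 - (\sqrt a)^2 = (\sqrt b - \sqrt a)(\sqrt b + \sqrt a),$$
and then dividing both sides by $\sqrt a + \sqrt b$ is supposed to give $0 \leq \sqrt b - \sqrt a$, i.e. $\sqrt a \leq \sqrt b$. My doubt is specifically about whether that last division step is justified.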