
Prove that if $a, b \in \mathbb{R}$ are positive, then $a \leq b$ implies $\sqrt a \leq \sqrt b$.

I have seen proofs of this problem in a couple of places. People use the identity $0 \leq b-a = (\sqrt b + \sqrt a)(\sqrt b - \sqrt a)$ and divide by $(\sqrt a + \sqrt b)$ to get the result. But I don't think this is a valid proof: even though $a,b$ are positive, a square root of a positive number could be negative, so $(\sqrt a + \sqrt b)$ could be negative as well. Dividing by a negative $(\sqrt a + \sqrt b)$ would reverse the inequality and give $\sqrt a \geq \sqrt b$. Am I missing something here?

  • The square root of a positive number is positive! – Fred Oct 13 '21 at 14:07
  • By convention, the notation $\sqrt{x}$ usually denotes the nonnegative square root. – Joe Oct 13 '21 at 14:08
  • https://math.stackexchange.com/questions/26363/square-roots-positive-and-negative – Martin R Oct 13 '21 at 14:16
  • $\frac{d}{dx}(\sqrt{x}) = \frac{1}{2\sqrt{x}}$, which is positive for $x>0$, so the square root function is increasing on $(0,\infty)$. Increasing functions preserve inequalities, while decreasing ones reverse them. – Square Oct 13 '21 at 14:20

1 Answer


The key point is that, by convention, $\sqrt{x}$ denotes the nonnegative square root. By hypothesis $a, b > 0$; in the strict case $a < b$ we have $b-a > 0$, and $b-a=(\sqrt{b}-\sqrt{a})(\sqrt{b}+\sqrt{a}) > 0$. Since the square root of a positive number is positive, $\sqrt{b}+\sqrt{a} > 0$ (in particular nonzero), and dividing by this positive quantity gives $\sqrt{b}-\sqrt{a} > 0$, which completes the proof.

Notice that if $a=b$ (in particular $a=b=0$) then $\sqrt{a}=\sqrt{b}$, so the non-strict inequality holds trivially.
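Assembling the pieces above, the full argument can be written out as a short LaTeX proof (assuming, as the comments note, that $\sqrt{\cdot}$ denotes the nonnegative square root):

```latex
\begin{proof}
If $a = b$, then $\sqrt{a} = \sqrt{b}$ and the claim is immediate.
Otherwise $a < b$, so $b - a > 0$, and we may factor
\[
  0 < b - a = \bigl(\sqrt{b} - \sqrt{a}\bigr)\bigl(\sqrt{b} + \sqrt{a}\bigr).
\]
Since $a, b > 0$ and $\sqrt{\cdot}$ denotes the nonnegative root, we have
$\sqrt{a} > 0$ and $\sqrt{b} > 0$, hence $\sqrt{b} + \sqrt{a} > 0$.
Dividing both sides of the displayed inequality by the \emph{positive}
number $\sqrt{b} + \sqrt{a}$ preserves the inequality, giving
$\sqrt{b} - \sqrt{a} > 0$, i.e.\ $\sqrt{a} < \sqrt{b}$.
\end{proof}
```

The case split matters: the factorization argument needs $\sqrt{b} + \sqrt{a} \neq 0$, which is exactly what positivity of the two roots guarantees.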