Let's define $x=\sqrt{ab}$, where $a=-1$ and $b=-1$.
Does $x=1$, as $-1\cdot-1=1\implies\sqrt{-1\cdot-1}=\sqrt{1}=1$?
Or maybe $x=-1$, as $\sqrt{a^2}=a\implies\sqrt{-1\cdot-1}=\sqrt{(-1)^2}=-1$?
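For concreteness, the two readings can be reproduced numerically with Python's `cmath`, which always returns the principal square root (choosing the principal branch is an assumption of convention on my part, not a resolution of the question):

```python
import cmath

# Reading 1: multiply inside the radical first, then take the
# principal square root: sqrt((-1)*(-1)) = sqrt(1) = 1.
former = cmath.sqrt((-1) * (-1))

# Reading 2: take the principal root of each factor first, then
# multiply: sqrt(-1)*sqrt(-1) = 1j * 1j = -1.
latter = cmath.sqrt(-1) * cmath.sqrt(-1)

print(former, latter)  # the two orders of operations disagree
```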
I assume that at least one of the following is true:
- The former solution is true, and the fallacy is in assuming $\sqrt{a^2}=a$ for a negative base $a$. This is hard to accept, since it amounts to saying that $(a^2)^{\frac12}\not=a^{2\cdot\frac12}=a$ for negative $a$, even though the sign of $a$ seemingly has no bearing on the arithmetic performed with the exponents.
- The former solution is true, and the fallacy is in assuming the positive root is meant in the latter solution; in fact, $\sqrt{-1\cdot-1}=-\sqrt{(-1)^2}=-(-1)=1$. But this implies that, depending on how the radicand is factored, one is forced to take the positive root for one factoring and the negative root for another to get the same answer, which doesn't seem right.
- The latter solution is true, and the fallacy is in assuming the positive root is meant in the former solution; in fact, $\sqrt{-1\cdot-1}=-\sqrt1=-1$. But, beyond inheriting the problem of the previous case, this also means that $\sqrt{ab}=\sqrt{a}\sqrt{b}$ holds even for negative $a,b$.
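As a sanity check, the identities appearing in the cases above can each be tested under the same principal-branch convention of Python's `cmath` (the variable names are mine):

```python
import cmath

a = b = -1

# First case: does sqrt(a^2) = a survive for negative a?
root_of_square = cmath.sqrt(a**2)  # sqrt(1) = 1, not a = -1

# Third case: does sqrt(ab) = sqrt(a)*sqrt(b) survive for negative a, b?
product_first = cmath.sqrt(a * b)            # sqrt(1) = 1
roots_first = cmath.sqrt(a) * cmath.sqrt(b)  # 1j * 1j = -1

# Under the principal branch, both identities fail for negative inputs.
print(root_of_square, product_first, roots_first)
```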
Which of these lines of reasoning is incorrect? Or are all of them correct, and there's an additional resolution that I'm not seeing?