
If $a^2+b^2= \left[a+b+(2ab)^{1/2}\right]\cdot\left[a+b-(2ab)^{1/2}\right]$, why do people say it can't be factorized? In fact, it is said that $a^n+b^n$ cannot be factored for any $n$ that is a multiple of $2$. Maybe I am lost in the definition of factorization.
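For what it's worth, the identity in the question does check out symbolically; a minimal sketch using SymPy (assuming SymPy is available; the variable names are just illustrative):

```python
from sympy import symbols, sqrt, expand

# a, b declared positive so that sqrt(2*a*b) is a well-defined real quantity
a, b = symbols("a b", positive=True)

# The proposed "factorization" from the question
product = (a + b + sqrt(2*a*b)) * (a + b - sqrt(2*a*b))

print(expand(product))  # a**2 + b**2
```

This is just the difference-of-squares pattern $(a+b)^2 - (\sqrt{2ab})^2 = a^2+2ab+b^2-2ab$, so the identity holds; the question is whether it deserves the name "factorization".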

Frank
JIVP
  • This is because the element $\sqrt{2ab}$ is not a polynomial: it contains a non-integral power of both $a$ and $b$. Factorization is the expression of a polynomial as a product of polynomials of smaller degree (if the polynomial is not irreducible). – Sarvesh Ravichandran Iyer Sep 08 '16 at 02:52
  • In broader terms, this is true when $ab>0$, but this isn't considered a factorization; it is just a way to rewrite things. – omega-stable Sep 08 '16 at 02:56
  • In broader terms, whether $ab > 0$ or $< 0$ is irrelevant. It is a factorization in the ring $\mathbb Q[a,b,\sqrt{2ab}]$. It is not a factorization in any ring that doesn't contain $\sqrt{2ab}$. – Robert Israel Sep 08 '16 at 02:59
  • Interestingly, to turn this into a factorization you could aim to make $\sqrt{2ab}$ a monomial. A common way of doing this is to replace $a$ with $a^2$, and $b$ with $2b^2$. Then you obtain the factorization $a^4+4b^4 = (a^2 + b^2 + 2ab)(a^2 + b^2 - 2ab)$. – Peter Huxford Sep 08 '16 at 03:39
  • Related: https://math.stackexchange.com/questions/1445089, https://math.stackexchange.com/questions/1265201, https://math.stackexchange.com/questions/695266 – Watson Feb 03 '18 at 20:36
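The distinction the comments draw can be seen with a computer algebra system: over $\mathbb{Q}$, $a^2+b^2$ is irreducible as a polynomial, while Peter Huxford's substitution $a \mapsto a^2$, $b \mapsto 2b^2$ turns $\sqrt{2ab}$ into the monomial $2ab$ and yields a genuine polynomial factorization. A sketch with SymPy (assuming SymPy is available):

```python
from sympy import symbols, factor

a, b = symbols("a b")

# Over the rationals, a**2 + b**2 is irreducible: factor() returns it unchanged
print(factor(a**2 + b**2))

# After substituting a -> a**2 and b -> 2*b**2, sqrt(2*a*b) becomes the
# monomial 2*a*b, and a**4 + 4*b**4 factors into honest polynomials
# (the Sophie Germain identity from the comment above)
print(factor(a**4 + 4*b**4))
```

The second `factor` call returns $(a^2 - 2ab + 2b^2)(a^2 + 2ab + 2b^2)$, which is exactly the factorization in Peter Huxford's comment with the two factors rearranged.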

0 Answers