It is an exercise from the lecture that I am unable to prove.
Given that $\gcd(a,b)=1$, prove that $\gcd(a+b,a^2-ab+b^2)=1$ or $3$. Also, when will it equal $1$?
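A quick brute-force check (a minimal Python sketch, not part of the exercise; the search range is an arbitrary choice) supports the claim and hints at the answer to the second part:

```python
from math import gcd

# Collect gcd(a+b, a^2 - a*b + b^2) over coprime pairs in a small range.
values = set()
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            values.add(gcd(a + b, a*a - a*b + b*b))

print(values)  # prints {1, 3} on this range
```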
HINT:
Let a prime $p$ divide both $a+b$ and $a^2-ab+b^2$
$\implies p$ divides $(a+b)^2-(a^2-ab+b^2)=3ab$
If $p|a,$ then as $p|(a+b),$ $p$ must divide $(a+b)-a=b\implies p|(a,b)$
But as $(a,b)=1,$ $p$ cannot divide $a$
Similarly, $p$ cannot divide $b$
$\implies p|3$
$\implies (a+b,a^2-ab+b^2)|3,$ since the gcd itself divides $3ab$ and is coprime to both $a$ and $b$
and $(a+b,a^2-ab+b^2)=3\iff 3|(a+b)\iff 3|(a^2-ab+b^2),$ the last equivalence because $a^2-ab+b^2=(a+b)^2-3ab$; so the gcd equals $1$ exactly when $3\nmid(a+b)$
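A quick empirical check of this characterization (a Python sketch; the bound $100$ is an arbitrary assumption):

```python
from math import gcd

# For coprime a, b: the gcd should be 3 exactly when 3 | (a + b), else 1.
for a in range(1, 100):
    for b in range(1, 100):
        if gcd(a, b) != 1:
            continue
        g = gcd(a + b, a*a - a*b + b*b)
        expected = 3 if (a + b) % 3 == 0 else 1
        assert g == expected, (a, b, g)

print("characterization holds on the tested range")
```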
Let $d|a+b \quad(1)$ and $d|a^2-ab+b^2 \quad(2)$
Since $a^2-ab+b^2=(a+b)^2-3ab$, $(2)\implies d|(a+b)^2 - 3ab \qquad \qquad(3)$
And as $d|a+b$, $(1)$ gives $d|(a+b)^2$, so $(3)\implies d|3ab$.
On the other hand, as $d|a+b$, any prime dividing both $d$ and $a$ would also divide $(a+b)-a=b$, contradicting $\gcd(a,b)=1$; hence $\gcd(d,a)=1$, and likewise $\gcd(d,b)=1$.
So $d|3ab$ with $\gcd(d,ab)=1$ forces $d|3$, and it's clear that $d|3 \Longleftrightarrow (d=1 \vee d=3)$.
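The identity behind step $(3)$ can also be confirmed symbolically; a minimal sketch using sympy (the use of sympy is my assumption, any CAS would do):

```python
import sympy as sp

a, b = sp.symbols('a b')

# Check the identity used in step (3): a^2 - a*b + b^2 = (a + b)^2 - 3*a*b
difference = (a**2 - a*b + b**2) - ((a + b)**2 - 3*a*b)
print(sp.expand(difference) == 0)  # True
```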