0

Let $a$ and $b$ be relatively prime integers. Show that $\gcd(a^2+b^2,a+b)=1$ or $2$.

Proof: $s|a^2+b^2$ and $s|a+b$ implies $s|a^2+b^2$ and $s|(a+b)^2=a^2+b^2+2ab$ implies $s|(a+b)^2-(a^2+b^2)=2ab$ implies $s|2a$ and $s|2b$ implies $s|\gcd(2a,2b)=2\gcd(a,b)=2\cdot 1$

Hence $\gcd(a^2+b^2,a+b)=1$ or $2$.
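As a sanity check (not part of the proof), a short Python brute-force run over small coprime pairs, using the standard library's `math.gcd`, confirms the statement on a sample range:

```python
# Brute-force check (illustration only, not a proof): for every coprime
# pair (a, b) in a small range, gcd(a^2 + b^2, a + b) should be 1 or 2.
from math import gcd

for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(a * a + b * b, a + b) in (1, 2)
print("checked all coprime pairs with 1 <= a, b < 200")
```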

Git Gud
  • 31,356
  • 1
    In $s\mid 2ab\implies s\mid 2a\land s\mid 2b$, consider $a=3,b=1$ and $s=4$. – Git Gud May 26 '14 at 21:01
  • @GitGud Summer is upon us and my brain has already shut off. – Gamma Function May 26 '14 at 21:58
  • @GammaFunction I'll join you in 20 days. – Git Gud May 26 '14 at 22:01
  • Remember that you can also use this strategy: after you find out that $s\mid 2ab$, you can prove that $s\mid 2$. Suppose that $s\mid ab$. Then either $s\mid a$ or $s\mid b$. But since $s\mid a+b$, we would have $\begin{cases}s\mid a\implies s\mid b\\ s\mid b\implies s\mid a\end{cases}$, but $\gcd(a,b)=1$, so that is a contradiction; hence $s\mid 2$. That is in case you haven't figured that out on your own so far. – user26486 May 27 '14 at 00:04

2 Answers

2

Hint $\,\ {\rm mod}\ a\!+\!b\!:\,\ a\equiv -b\,\Rightarrow\, \color{#0a0}{a^2\!+\!b^2\equiv 2b^2}\,$ so, by the Euclidean Algorithm $\rm\color{#c00}{(EA)}$

$\quad (a\!+\!b,\,\color{#0a0}{a^2\!+\!b^2})\overset{\rm\color{#c00}{EA}} = (a\!+\!b,\,\color{#0a0}{2b^2}) = (a\!+\!b,\,2)\ \ {\rm by}\ \ \Bigg\{ \begin{eqnarray} &&(a\!+\!b,b)\overset{\rm\color{#c00}{EA}}= (a,b)=1\\ \Rightarrow &&(a\!+\!b,b^2) = 1\end{eqnarray}\ \,$ by Euclid.

Here $\rm\color{#c00}{EA}$ means $\ (m,n)\, =\, (m,n')\ $ if $\ n'\!\equiv n\pmod m\,\ $ [= descent step of Euclidean Algorithm].

Remark $\ $ More generally one may prove $\ (a\!+\!b,\ a^2\!+\!b^2)\, =\,(a\!+\!b,\,2(a,b)^2)\ $ for all $\,a,b\in\Bbb Z.$
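A quick numerical check of the remark's generalization on a sample range of integers, including negatives and zero (a Python illustration, not a proof; `math.gcd` returns the nonnegative gcd and handles negative arguments):

```python
# Check the identity gcd(a+b, a^2+b^2) == gcd(a+b, 2*gcd(a,b)^2)
# on a sample range of integers (illustration only).
from math import gcd

for a in range(-50, 51):
    for b in range(-50, 51):
        assert gcd(a + b, a * a + b * b) == gcd(a + b, 2 * gcd(a, b) ** 2)
print("identity holds on the sample range")
```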

Bill Dubuque
  • 272,048
  • Or, by applying the Euclidean Algorithm in the manner you showed here, which gives $(a+b, a^2 + b^2) = (a + b, 2)$ – Anant May 27 '14 at 17:24
  • @Anant Yes, that was the intended answer, but somehow I mistakenly pasted an answer from another question I was working on. Time to clean out the gunk under my keyboard keys, which apparently nullified the copy (Ctrl+C) preceding the paste. Good thing you noticed that. Now fixed, with a link to a generalization. – Bill Dubuque May 27 '14 at 18:31
  • I see absolutely no problem! It was an instructive answer. – Anant May 27 '14 at 19:35
0

We use a result that can be proved fairly easily and has, in fact, already been posted here on MSE. But let's assume we haven't seen that post and try to rediscover it...

Claim: If $\gcd(a,b) = 1$, then $\gcd(a-b,a+b) = 1$ or $2$. This you can prove.

If $d = \gcd(a-b,a+b)$ and $d > 1$, you need to prove $d = 2$. If $d$ is odd, then $d|(a+b) + (a-b) = 2a$ and $d|(a+b) - (a-b) = 2b$; since $d$ is odd, this gives $d|a$ and $d|b$, hence $d|\gcd(a,b) = 1$. So $d = 1$, a contradiction. Thus $d$ is even; write $d = 2d'$. Then $2d'|2a$ and $2d'|2b$, implying $d'|a$ and $d'|b$, so $d'|\gcd(a,b) = 1$. Thus $d' = 1$, which gives $d = 2d' = 2\cdot 1 = 2$.
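The claim is easy to spot-check numerically before (or after) proving it; a small Python sweep over coprime pairs (illustration only):

```python
# Check the claim (illustration, not a proof): if gcd(a, b) == 1,
# then gcd(a - b, a + b) is 1 or 2 on a sample range.
from math import gcd

for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == 1:
            assert gcd(a - b, a + b) in (1, 2)
print("claim holds for all coprime pairs with 1 <= a, b < 200")
```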

Using this claim, we split the problem into $2$ cases:

Case 1: $\gcd(a-b,a+b) = 1$. Then we need to prove $\gcd(a+b,a^2 + b^2) = 1$.

Let $d = \gcd(a+b,a^2+b^2)$. Then $d|a+b$, so $d|(a+b)^2$. But $d|a^2 + b^2$, so $d|(a+b)^2 - (a^2+b^2) = 2ab = (a^2+b^2) - (a-b)^2$, hence $d|(a-b)^2$. Thus $d|\gcd((a-b)^2,(a+b)^2) = 1^2 = 1$, so $d = 1$.

Case 2: $\gcd(a-b,a+b) = 2$. Then we prove $d = \gcd(a+b,a^2 + b^2) = 2$.

Write $a-b = 2m$ and $a+b = 2n$, where $m$ and $n$ are integers with $\gcd(m,n) = 1$. Then, by the argument of case 1, $d|\gcd(4m^2,4n^2) = 4\gcd(m^2,n^2) = 4\cdot 1^2 = 4$. So if:

a) $d = 1$: then case 1 gives $1 = \gcd(a+b,(a-b)^2) = \gcd(2n, (2m)^2) = 2\gcd(n,2m^2) \geq 2$. Contradiction.

b) $d = 4$: then $4|a+b$ and $4|a^2 + b^2$. Since squares are $0$ or $1 \pmod 4$, this forces both $a$ and $b$ to be even: $a = 2k$ and $b = 2s$. Then $a - b = 2(k - s)$, $a + b = 2(k + s)$, and $2 = \gcd(a-b,a+b) = \gcd(2(k-s), 2(k+s)) = 2\gcd(k-s,k+s)$, so $\gcd(k-s,k+s) = 1$. Also:

$4 = \gcd(a+b,a^2+b^2) = \gcd(2(k+s),4k^2 + 4s^2) = 2\gcd(k+s,2(k^2+s^2))$. Thus:

$2 = \gcd(k+s,2(k^2+s^2))$.

But applying case 1 to $k$ and $s$, since $\gcd(k-s,k+s) = 1$ we get $\gcd(k+s, k^2+s^2) = 1$. On the other hand, $2|k+s$ means $k+s$ is even, so $k^2 + s^2 = (k+s)^2 - 2ks$ is also even. Write $k+s = 2p$ and $k^2+s^2 = 2q$. Then $1 = \gcd(k+s,k^2+s^2) = \gcd(2p,2q) = 2\gcd(p,q) \geq 2$. Contradiction again.

So: $d = 2$, completing the proof.
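As an aside, for coprime $a, b$ the two cases are governed entirely by the parity of $a+b$: both gcds come out $1$ when $a+b$ is odd and $2$ when $a+b$ is even. A short Python sweep confirms this on a sample range (illustration only, not part of the proof):

```python
# For coprime a, b the case split above says gcd(a+b, a^2+b^2) agrees
# with gcd(a-b, a+b): both are 1 when a+b is odd, 2 when a+b is even.
from math import gcd

for a in range(1, 150):
    for b in range(1, 150):
        if gcd(a, b) == 1:
            d = gcd(a + b, a * a + b * b)
            assert d == gcd(a - b, a + b)
            assert d == (2 if (a + b) % 2 == 0 else 1)
print("case split verified on the sample range")
```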

DeepSea
  • 77,651