
Show that if $\gcd(x, y) = 1$ then $\gcd(x - y, x + y)$ is either $1$ or $2$

I think the question is asking me to show that if $\gcd(x,y)=1$ then $\gcd(x-y,x+y)$ is $1$ or $2$.

So, for the first bit: suppose $ax+by=d$ where $d=\gcd(a,b)$. By definition there are integers $a'$ and $b'$ such that $a=a'd$ and $b=b'd$, so $a'dx+b'dy=d$. Dividing through by $d$, then $a'x+b'y=1$.

Let $e=\gcd(x,y)$. As before, there are integers $x'$ and $y'$ such that $x=ex'$ and $y=ey'$. Substituting these into the previous equation, we get $a'ex'+b'ey'=1$, or $e(a'x'+b'y')=1$. Since $a'x'+b'y'$ is an integer, this implies that $e=1$ or $e=-1$: these are the only divisors of $1$. But $e$ is a greatest common divisor and hence by definition positive, so $e=1$.
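As a sanity check (not part of the proof), the lemma above — that Bézout coefficients $x, y$ with $ax+by=\gcd(a,b)$ are themselves coprime — can be tested numerically. The helper name `ext_gcd` below is my own; it is a standard extended-Euclid sketch, not something from the question.

```python
import math
from random import randint, seed

def ext_gcd(a, b):
    """Extended Euclid: returns (d, x, y) with a*x + b*y == d == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    d, x, y = ext_gcd(b, a % b)
    return d, y, x - (a // b) * y

seed(0)
for _ in range(1000):
    a, b = randint(1, 10**6), randint(1, 10**6)
    d, x, y = ext_gcd(a, b)
    assert a * x + b * y == d == math.gcd(a, b)
    # the lemma: the coefficients x, y in a*x + b*y = gcd(a, b) are coprime
    assert math.gcd(x, y) == 1
```

(`math.gcd` accepts negative arguments and returns the non-negative gcd, which matters here since a Bézout coefficient can be negative.)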

1 Answer


Let $u=x+y,\ v=x-y.$ Suppose $(u, v) = a.$ Then $2x = u+v$ and $2y = u-v,$ so $a$ divides both $2x$ and $2y,$ and hence divides their gcd. But since $(2x, 2y) = 2(x, y) = 2,$ we get $a \mid 2,$ so $a = 1$ or $a = 2.$
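A quick empirical check of the full claim, plus a refinement the proof suggests but the answer doesn't state: the gcd is $2$ exactly when $x$ and $y$ are both odd (then $x-y$ and $x+y$ are both even), and $1$ otherwise. This is a verification sketch, not part of the proof.

```python
import math
from itertools import combinations

results = set()
for x, y in combinations(range(1, 200), 2):
    if math.gcd(x, y) == 1:
        g = math.gcd(x - y, x + y)  # math.gcd handles the negative x - y
        results.add(g)
        # refinement: g == 2 exactly when x and y are both odd
        assert (g == 2) == (x % 2 == 1 and y % 2 == 1)

# both cases occur, and no other value ever appears
assert results == {1, 2}
```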

Igor Rivin