
Problem:

Prove that if $\gcd(a, b) = 1$, then $\gcd(a - b, a + b)$ is either $1$ or $2$.

From Bezout's Theorem, I see that $am + bn = 1$ when $a$ and $b$ are relatively prime. However, I could not find a way to link this idea to $a - b$ and $a + b$. I did realize that in order to have $\gcd(a, b) = 1$, $a$ and $b$ must not both be even. I played around with some examples, such as $(13, 17)$, and saw that the statement does seem to be true. Any ideas?
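For instance, with the pair $(13, 17)$ mentioned above,
$$\gcd(17 - 13,\ 17 + 13) = \gcd(4, 30) = 2,$$
which is consistent with the claim.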

roxrook

3 Answers


The gcd of $x$ and $y$ divides any linear combination of $x$ and $y$. And any number that divides $r$ and $s$ divides the gcd of $r$ and $s$.

If you add $a+b$ and $a-b$, you get <blank>, so $\mathrm{gcd}(a+b,a-b)$ divides <blank>.

If you subtract $a-b$ from $a+b$, you get <blankity>, so $\mathrm{gcd}(a+b,a-b)$ divides <blankity>.

So $\mathrm{gcd}(a+b,a-b)$ divides $\mathrm{gcd}($<blank>,<blankity>$) = $<blankety-blank>.

(For good measure, assuming the result is true, you'll want to come up with examples where you get $1$ and examples where you get $2$, just to convince yourself that the statement you are trying to prove is the best you can do.)
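For instance, both values do occur for coprime pairs:
$$\gcd(3 - 1,\ 3 + 1) = \gcd(2, 4) = 2, \qquad \gcd(2 - 1,\ 2 + 1) = \gcd(1, 3) = 1.$$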

Arturo Magidin
  • @Arturo Magidin: Haha, I got it. Much simpler than I thought. Thanks a lot! – roxrook Feb 01 '11 at 04:33
  • By the way, I can't start from gcd(a + b, a - b), right? Since that would be the converse error: "if A then B" does not guarantee "if B then A". Should I prove it by contradiction instead? – roxrook Feb 01 '11 at 04:36
  • @Chan: I'm not sure what you mean... You cannot begin by assuming that gcd(a+b,a-b) is equal to 1 or to 2, but you do not need to make any assumption. The hints above give you information about what gcd(a+b,a-b) must divide, and you have other information (remember what you are told about $a$ and $b$). All of that together should be sufficient (PEV was more explicit, and you seemed to think it was a great hint). – Arturo Magidin Feb 01 '11 at 04:46
  • @Arturo Magidin: All I tried to say is that, when proving "if A then B", it would be wrong to assume B is true and then infer that A is true. – roxrook Feb 01 '11 at 04:46
  • Yes. You should never affirm the consequent. If you assume B is true and infer A is true, then you prove $B\rightarrow A$, which may be something interesting, but is not equivalent to $A\rightarrow B$. – Arturo Magidin Feb 01 '11 at 04:47
  • Thanks again for the confirmation. – roxrook Feb 01 '11 at 04:49
  • See here for a generalization using determinants and here using norms. – Bill Dubuque Jun 13 '11 at 16:46
  • What does that mean? – Darío A. Gutiérrez Oct 13 '16 at 18:08

Note that $d|(a-b)$ and $d|(a+b)$ where $d = \gcd(a-b, a+b)$. So $d$ divides the sum and difference (i.e. $2a$ and $2b$).
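Spelling out the remaining step: since $\gcd(a, b) = 1$,
$$d \mid \gcd(2a, 2b) = 2\gcd(a, b) = 2,$$
so $d$ is either $1$ or $2$.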


Hint $\ (a - b) + (a + b)\,i \,=\, (1 + i)(a + b\,i)\ $ yields a slick proof using Gaussian integers. This reveals the arithmetical essence of the matter and, hence, suggests obvious generalizations.
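Indeed, expanding the right-hand side and using $i^2 = -1$,
$$(1 + i)(a + b\,i) = a + b\,i + a\,i + b\,i^2 = (a - b) + (a + b)\,i.$$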

Bill Dubuque