
I wish to prove the claim:

If $\gcd(a,b) =1$ then $\gcd(2a +b, a + b) = 1$.

I have so far:

By Bézout's lemma, there are integers $x,y$ such that $ax+by = 1$.

Let $d = \gcd(2a+b, a+b)$; then $d\mid(2a+b)$ and $d\mid(a+b)$.

I am not sure what to do after this. I have to use the GCD Characterization Theorem to prove this.

2 Answers


Set $\delta:=\gcd(2a+b,\,a+b)$. Then $\delta\mid(2a+b)$ and $\delta\mid(a+b)$, so $\delta$ divides every integer linear combination of them; hence
$$\delta \mid (2a+b)-(a+b)= a, \qquad \delta\mid 2(a+b)-(2a+b)= b.$$
Thus $\delta \mid \gcd(a,b)= 1$, i.e. $\delta=1$, which was to be demonstrated.

I used just these two properties:

  1. If $d=\gcd(x,y)$, then $d\mid\alpha x + \beta y$ for all $\alpha,\beta\in\mathbb{Z}$.
  2. If $d\mid r$ and $d\mid t$, then $d\mid\gcd(r,t)$, for all $r,t\in\mathbb{Z}$.
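The conclusion can also be sanity-checked numerically. This is a brute-force check over small coprime pairs (a verification, not a proof), using Python's `math.gcd`:

```python
from math import gcd
from itertools import product

# Whenever gcd(a, b) == 1, the claim says gcd(2a+b, a+b) == 1 as well.
for a, b in product(range(1, 50), repeat=2):
    if gcd(a, b) == 1:
        assert gcd(2 * a + b, a + b) == 1
```

No assertion fires for any coprime pair in the range, consistent with the proof above.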

There is an explicit Bézout identity in this case. Note that

$$(2a+b)-(a+b)=a, \qquad 2(a+b)-(2a+b)=b.$$

So if $ax+by=1$, by Bézout on $\gcd(a,b)=1$, substituting these expressions for $a$ and $b$ gives

$$(x-y)(2a+b)+(2y-x)(a+b)=ax+by=1,$$

an integer combination of $2a+b$ and $a+b$ equal to $1$, hence $\gcd(2a+b,a+b)=1$.
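An explicit combination of $2a+b$ and $a+b$ summing to $1$ can be built from a Bézout identity $ax+by=1$: the coefficients work out to $x-y$ and $2y-x$. Here is a quick check on one coprime pair; the extended-Euclid helper `ext_gcd` is my own name, not part of either answer:

```python
def ext_gcd(a, b):
    # Returns (g, x, y) with a*x + b*y == g == gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

a, b = 8, 5                      # an example coprime pair
g, x, y = ext_gcd(a, b)
assert g == 1 and a * x + b * y == 1

# Coefficients for 2a+b and a+b derived from x, y:
u, v = x - y, 2 * y - x
assert u * (2 * a + b) + v * (a + b) == 1
```

This mirrors the algebra: $(x-y)(2a+b)+(2y-x)(a+b)$ expands back to $ax+by$.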

JMP