Proof verification: $\gcd(a,b)=1$ implies $\gcd(a+b,a-b)\leq 2$.
I am asked to show that $\gcd(a,b)=1$ implies $\gcd(a+b,a-b)\leq 2$, and I am not sure if my solution is correct.
Suppose $\gcd(a,b)=1$. Then there are integers $x$ and $y$ such that $ax+by=1$. We claim there are integers $s$ and $t$ such that $(a+b)s+(a-b)t=2$. Rearranging this equation gives $a(s+t)+b(s-t)=2$.
Now, set $s=x+y$, and $t=x-y$, and we have that $a(x+y+x-y)+b(x+y-x+y) = 2ax + 2by = 2(ax+by)= 2.$
Since $\gcd(a+b,a-b)$ divides every integer linear combination of $a+b$ and $a-b$, it follows that $\gcd(a+b,a-b)\mid 2$, and so $\gcd(a+b,a-b)\leq 2$.
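As a quick numeric sanity check of the claim (not part of the proof itself), the following sketch enumerates coprime pairs and confirms that $\gcd(a+b,a-b)$ is always $1$ or $2$; the helper name `check` and the search bound are my own choices:

```python
from math import gcd

def check(limit=50):
    """For all coprime pairs (a, b) with 1 <= a, b < limit,
    verify that gcd(a+b, a-b) divides 2."""
    for a in range(1, limit):
        for b in range(1, limit):
            if gcd(a, b) == 1:
                # abs() guards the a <= b case; gcd(n, 0) = n handles a == b.
                g = gcd(a + b, abs(a - b))
                assert g in (1, 2), (a, b, g)
    return True
```

Both values do occur: $\gcd(1+2,\,1-2)=1$, while $\gcd(1+1,\,1-1)=\gcd(2,0)=2$, so the bound $2$ is sharp.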
The part I am a little fuzzy on is whether setting $s=x+y$ and $t=x-y$ legitimately leads to a general solution. The idea was to obtain a particular solution to the equation, showing that for any $a,b$ we can construct an equation of the form $(a+b)s+(a-b)t=2$, but I am not sure if this is a move I am able to make here; it felt a bit like cheating to me.
solution-verification
[For the question to be on topic, you must specify precisely which step in the proof you question, and why. This site is not meant to be an open-ended proof-checking machine.] – Bill Dubuque Mar 05 '23 at 20:11