12

Is it true that if $\gcd(a,b,c)=1$ then there exists $x,y\in\mathbb{Z}$ such that $\gcd(a+xc,b+yc)=1$?

I came upon this while trying to prove that the natural homomorphism $r_m:\operatorname{SL}_n(\mathbb{Z})\to\operatorname{SL}_n(\mathbb{Z}/m\mathbb{Z})$ is surjective. For $n=2$, given $A\in\operatorname{SL}_2(\mathbb{Z}/m\mathbb{Z})$, I was trying to reduce the problem to finding $B\in M_2(\mathbb{Z})$ such that $r_m(B)=A$ and $\gcd(b_{11},b_{12})=1$.

  • Does not $x=y=0$ work? – David P May 02 '19 at 05:53
  • To the above, let $a=3,b=6,c=2$. Then $\gcd(a,b,c)=1$, but $\gcd(a,b)=3$. Hence, $x=y=0$ won't work. – LuuBluum May 02 '19 at 05:55
  • Since $\gcd(a,b,c) = 1$, by Bezout's identity, there exists $r,s,t \in \mathbb{Z}$ such that $ra + sb + tc = 1$. If we can find $x, y \in \mathbb{Z}$ such that $rx + sy = t$, then $1 = ra + sb + (rx + sy)c = r(a + xc) + s(b + yc)$. Since $1$ divides $a + xc$ and $b + yc$, and any other divisor of $a + xc$ and $b + yc$ would have to divide $r(a + xc) + s(b + yc)$, we conclude $\gcd(a + xc, b + yc) = 1$, as desired. The trouble is, I don't think we can guarantee that $x$ and $y$ actually exist. – Charles Hudgins May 02 '19 at 06:20
  • I should note that the converse definitely is true, though. If $\gcd(a + xc, b + yc) = 1$ for some $x,y \in \mathbb{Z}$, then, by Bezout's identity, there exist $r,s \in \mathbb{Z}$ such that $r(a + xc) + s(b + yc) = 1$. Hence $ra + sb + (rx + sy)c = 1$. Since $1$ divides $a,b,c$ and any other divisor of $a,b,c$ divides $ra + sb + (rx + sy)c = 1$, we conclude that $\gcd(a,b,c) = 1$. – Charles Hudgins May 02 '19 at 06:23
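The counterexample in the comments, and the existence claim itself, are easy to sanity-check numerically. A minimal brute-force sketch (the function name is mine, and this is a check, not a proof):

```python
from math import gcd

def find_xy(a, b, c, bound=20):
    """Brute-force search for x, y with gcd(a + x*c, b + y*c) == 1.
    Returns the first pair found within [-bound, bound], else None."""
    for x in range(-bound, bound + 1):
        for y in range(-bound, bound + 1):
            if gcd(a + x * c, b + y * c) == 1:
                return x, y
    return None

# The comment's counterexample: x = y = 0 fails for (3, 6, 2)...
assert gcd(3, 6) == 3
# ...but some other shift works, e.g. gcd(3 + 1*2, 6) = gcd(5, 6) = 1.
assert find_xy(3, 6, 2) is not None
```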

2 Answers

4

If the highest common factor of $a$ and $c$ is $d$, so that $a=pd$ and $c=qd$ with $p$ and $q$ co-prime, then $a+xc=d(p+xq)$.

We know that $d$ is co-prime to $b$ (a common prime factor of $d$ and $b$ would divide $a$, $b$ and $c$). Since $\gcd(p,q)=1$, Dirichlet's theorem on primes in arithmetic progressions tells us that $p+xq$ is prime for infinitely many $x$ (assuming $c\neq 0$, so $q\neq 0$; if $c=0$ then $\gcd(a,b)=\gcd(a,b,c)=1$ already). But $b$ has only finitely many prime factors, so we can choose $x$ making $p+xq$ a prime that does not divide $b$.

So in fact we can do this with $y=0$.

Whether Dirichlet's theorem is necessary for this, I don't know off the top of my head. It feels like there ought to be something simpler. But this at least answers the question.
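In practice the $y=0$ construction is just a scan over $x$ until $\gcd(a+xc,b)=1$; the Dirichlet argument above is what guarantees the scan terminates. A small sketch (function name is mine, not from the answer):

```python
from math import gcd

def shift_x(a, b, c):
    """Find x with gcd(a + x*c, b) == 1, taking y = 0.
    Termination for c != 0 follows from the Dirichlet argument:
    with d = gcd(a, c), a + x*c = d*(p + x*q) where gcd(p, q) = 1,
    and p + x*q is a prime not dividing b for some x."""
    if c == 0:
        return 0  # gcd(a, b) = gcd(a, b, c) = 1 already
    x = 0
    while gcd(a + x * c, b) != 1:
        x += 1
    return x

# Example: a = 3, b = 6, c = 2 gives x = 1, since gcd(5, 6) = 1.
assert shift_x(3, 6, 2) == 1
```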

Mark Bennet
  • 100,194
0

We assume $c\neq 0$; if $c=0$ then $\gcd(a,b)=\gcd(a,b,c)=1$ and $x=y=0$ works.

Let $d=\gcd(a,b)$ and write $a=md$, $b=nd$, so that $\gcd(m,n)=1$. By Bézout's identity there exist integers $u,v$ with $um+vn=1$, and in particular $\gcd(u,v)=1$ and $$ umd + vnd = d $$ Therefore $$ \begin{align*} (umd + uvc) + (vnd - uvc) &= d\\ u(a+vc) + v(b-uc) &= d \end{align*} $$ First assume $u,v$ are both non-zero.

We claim that $D=\gcd(a+vc,b-uc)=1$.

Suppose otherwise, and let $p$ be a prime dividing $D$. Then $p$ divides $u(a+vc)+v(b-uc)=d$, hence $p$ divides $a$ and $b$.

Since $p$ also divides $a+vc$ and $b-uc$, it follows that $p$ divides $vc$ and $uc$. By assumption $\gcd(a,b,c)=1$, so $p$, which already divides $a$ and $b$, cannot also divide $c$. This means that $p$ divides $u$ and $v$, contradicting $\gcd(u,v)=1$.

Therefore $D$ must be equal to 1.


If $u=0$, then $v=n=1$, so $a=md$ and $b=d$. We may instead take $u=-1$, $v=1+m$, which are nonzero and still satisfy $$ ua +vb = -md + (1+m)d = d $$ The case $v=0$ is similar.
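This argument is fully constructive: it yields $x=v$, $y=-u$. A sketch in code, using the extended Euclidean algorithm (helper names are mine, and I assume $a,b>0$ and $c\neq 0$ as in the answer):

```python
from math import gcd

def ext_gcd(a, b):
    """Extended Euclid: returns (g, u, v) with u*a + v*b == g == gcd(a, b)."""
    if b == 0:
        return (a, 1, 0)
    g, u, v = ext_gcd(b, a % b)
    return (g, v, u - (a // b) * v)

def coprime_shift(a, b, c):
    """Given a, b > 0 with gcd(a, b, c) == 1 and c != 0, return (x, y)
    with gcd(a + x*c, b + y*c) == 1, following the construction above."""
    d = gcd(a, b)
    m, n = a // d, b // d        # gcd(m, n) == 1
    _, u, v = ext_gcd(m, n)      # u*m + v*n == 1, hence u*a + v*b == d
    if u == 0:                   # then v == n == 1; replace by the
        u, v = -1, 1 + m         # nonzero pair with u*a + v*b == d
    elif v == 0:                 # symmetric case: u == m == 1
        u, v = 1 + n, -1
    # u*(a + v*c) + v*(b - u*c) == d, and the proof shows the gcd is 1
    return v, -u

# Example from the comments: a = 3, b = 6, c = 2.
x, y = coprime_shift(3, 6, 2)
assert gcd(3 + x * 2, 6 + y * 2) == 1
```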

Yong Hao Ng
  • 4,795
  • 19
  • 26