
We have a $2\times2$ matrix with $\mathbb{Z}$ entries, $$M =\begin{bmatrix}i&j\\k&l\end{bmatrix}$$ with $\det(M) = 1$. If $(c\; d) = (a\; b)M$, how do we show that $\gcd(a,b) = \gcd(c,d)$?


Multiplying out, we get $ai + bk = c$ and $aj + bl = d$. Assuming $a,b,c,d$ are nonzero integers, $\gcd(a,b)$ divides any integer combination of $a$ and $b$, so $\gcd(a,b)\mid c$ and $\gcd(a,b)\mid d$, hence $\gcd(a,b)\mid\gcd(c,d)$.

Since $\det(M) = 1$, we have $il - kj = 1$, so by Bézout $\gcd(i,k) = \gcd(j,l) = 1$.
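As a quick sanity check with made-up numbers (my own example, not from the problem): take $$M = \begin{bmatrix}2&1\\1&1\end{bmatrix},\qquad (a\; b) = (4\; 6),$$ which has $\det(M) = 1$; then $(c\; d) = (a\; b)M = (4\cdot 2 + 6\cdot 1\;\; 4\cdot 1 + 6\cdot 1) = (14\; 10)$, and indeed $\gcd(4,6) = 2 = \gcd(14,10)$.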

That's all I have, really; any help would be appreciated. Thanks.


2 Answers


The key point here is that if the determinant is $1$ then $$\begin{bmatrix}i&j\\k&l\end{bmatrix}^{-1}=\begin{bmatrix}l&-j\\-k&i\end{bmatrix}.$$ You've already shown that $\gcd(a,b)\mid\gcd(c,d)$; this fact allows you to run the same argument backwards.
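Spelled out (a sketch of running the argument backwards, using the inverse above): since $M^{-1}$ also has integer entries and $$(a\; b) = (c\; d)\,M^{-1} = (cl - dk\;\; -cj + di),$$ the same linear-combination reasoning gives $\gcd(c,d)\mid a$ and $\gcd(c,d)\mid b$, hence $\gcd(c,d)\mid\gcd(a,b)$. Together with the first direction, $\gcd(a,b) = \gcd(c,d)$.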


$(c,d)=(a,b)M$ implies $c,d \in \mathbb Z a + \mathbb Z b = \mathbb Z \gcd(a,b)$.

Therefore, $\gcd(c,d) \in \mathbb Z c + \mathbb Z d \subseteq \mathbb Z \gcd(a,b)$ and so $\mathbb Z \gcd(c,d) \subseteq \mathbb Z \gcd(a,b)$.

Repeat the argument with $M^{-1}$ (which has integer entries) to conclude the reverse inclusion.
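Concretely (filling in that last step): $(a,b) = (c,d)M^{-1}$ gives $a,b \in \mathbb Z c + \mathbb Z d = \mathbb Z \gcd(c,d)$, so $\mathbb Z \gcd(a,b) \subseteq \mathbb Z \gcd(c,d)$. The two inclusions force $\mathbb Z \gcd(a,b) = \mathbb Z \gcd(c,d)$, and since both gcds are nonnegative, $\gcd(a,b) = \gcd(c,d)$.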
