I found myself needing to prove this statement while attempting to prove that every common divisor divides the GCD. However, I just cannot think of a way to prove it! The logical framework I'm working in does not yet have the Bézout identity or the Euclidean algorithm available; it only assumes the following:
- $a$ divides $b$ if there exists an integer $n$ such that $b = an.$
- For every integer $m$ and every $n \in \mathbb{N} \setminus \{0\}$, there exist unique integers $q$ and $r$ with $m = qn + r$ and $0 \le r < n.$
- $\gcd(a,b)$ is defined to be the largest common divisor of $a$ and $b$ (a small worked example follows this list).
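
For concreteness, here is a small numeric instance of the second and third assumptions (the numbers are my own illustration, not part of the problem): dividing $m = 23$ by $n = 5$ gives the unique pair $q = 4$, $r = 3$, since

$$23 = 4 \cdot 5 + 3, \qquad 0 \le 3 < 5,$$

and for $a = 12$, $b = 18$ the common divisors are $1, 2, 3, 6$, so $\gcd(12, 18) = 6$; indeed each of $1, 2, 3, 6$ divides $6$, which is the kind of fact I want to prove in general.
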
Any tips? I'm very stuck...