I'm just wondering the reason behind being able to do this:
$$\gcd(a,c) \cdot \gcd(b,c) = \gcd(ab,bc,ac,cc)$$
Is there an elementary proof to the reasoning behind this?
It is an immediate consequence of expanding using the GCD distributive and associative $\rm\color{#c00}{laws}$, analogous to how we expand $\,(a\!+\!c)(b\!+\!c)$ using the same laws for integer arithmetic, i.e.
$$\begin{align}\color{#0a0}{(a,c)}(b,c) &\,=(\color{#0a0}{(a,c)}b,\ \ \color{#0a0}{(a,c)}c)\ \ \ \text{via Distributive Law}\\[.1em] &\,= ((ab,cb),(ac,cc))\ \ \text{via Distributive Law}\\[.1em] {\rm so}\,\ \ (a,\,\ c)(b,\,\ c)&\,= (\,ab,\ bc,\ \ \,ac,cc)\ \ \ \ \text{via Associative Law}\\[.2em] \text{It's a gcd analog of } \ (a\!+\!c)(b\!+\!c) &\,= \ \,ab\!+\!bc\!+\!ac\!+\!cc,\ \text{since both satisfy said }\rm\color{#c00}{laws}\\[.3em] \text{It is exactly that }\:\!\rm (A\!+\!C)(B\!+\!C)&\rm \,=AB\!+\!BC\!+\!AC\!+\!CC\ \text{ as ideals: } A=(a)\rm\ etc. \end{align}\quad$$
In summary, the GCD operation behaves just like addition in integer arithmetic: it is associative and commutative, and multiplication distributes over it, so we can perform GCD arithmetic analogously to integer arithmetic. To better facilitate the analogy it may help to denote the gcd operation by an infix addition symbol, e.g. $\oplus$ as in this answer, so the OP identity becomes
$$(a\oplus c)(b \oplus c)\, =\, ab \oplus bc \oplus ac \oplus cc\qquad\ \ \ \ $$
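As a quick sanity check (not a proof), the identity can be tested numerically; the helper name below is just illustrative:

```python
from math import gcd
from random import randint

def check_identity(a, b, c):
    """Check gcd(a,c)*gcd(b,c) == gcd(ab, bc, ac, c^2) for one triple."""
    lhs = gcd(a, c) * gcd(b, c)
    # nested gcd calls keep this compatible with Python < 3.9
    rhs = gcd(gcd(a * b, b * c), gcd(a * c, c * c))
    return lhs == rhs

# spot-check on random triples
assert all(check_identity(randint(1, 10**6), randint(1, 10**6), randint(1, 10**6))
           for _ in range(1000))
```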
Other proofs using Bezout are not only more complex but also less general since they don't apply in more general gcd domains where Bezout fails, e.g. polynomial rings like $\,\Bbb Z[x]\,$ and $\,\Bbb Q[x,y]$.
Ditto for proofs using (unique) prime factorization, since there are gcd domains that are not UFDs, e.g. the ring of all algebraic integers is a Bezout domain, hence a gcd domain, but it has no irreducibles (so no primes), since e.g. $\,\alpha = \sqrt \alpha \sqrt \alpha$.
A nice exercise using the above is the proof of the "Freshman's Dream" GCD Binomial Theorem
$$\begin{align} (a\oplus b)^n &=\ a^n\oplus\, b^n\\[.3em] {\rm i.e}\ \ \ \gcd(a,b)^n &= \gcd(a^n,b^n)\end{align}\qquad\qquad\qquad\ $$
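The exercise is likewise easy to spot-check numerically (again a sanity check under assumed small inputs, not a substitute for the proof):

```python
from math import gcd

def gcd_power_identity(a, b, n):
    """Check the GCD 'Freshman's Dream': gcd(a,b)**n == gcd(a**n, b**n)."""
    return gcd(a, b) ** n == gcd(a ** n, b ** n)
```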
For $p$ a prime, and $n$ an integer, write $n_p$ for the power of $p$ dividing $n$, that is, for the nonnegative integer $\nu$ such that $p^{\nu}$ divides $n$ but $p^{\nu+1}$ doesn't.
The power of $p$ dividing $\gcd(a,c)$ is $\min(a_p,c_p)$; similarly for $\gcd(b,c)$; so the power of $p$ dividing $\gcd(a,c)\gcd(b,c)$ is $\min(a_p,c_p)+\min(b_p,c_p)$, which is $\min(a_p+b_p,a_p+c_p,b_p+c_p,2c_p)$, which is $\min((ab)_p,(bc)_p,(ac)_p,(c^2)_p)$, QED.
I’ll leave it to you to prove that the left side divides the right side.
To prove the other way, solve $$ax+cy=\gcd(a,c)$$ and $$bw+cz=\gcd(b,c)$$ for integers $x,y,w,z.$
Multiplying these together gives:
$$ab(xw)+ac(xz)+bc(wy)+cc(yz) =\gcd(a,c)\cdot \gcd(b,c)$$
Use this to prove the right side is a divisor of the left side.
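The Bezout step can be checked concretely; the `egcd` helper below is a standard iterative extended Euclidean algorithm (the function names are illustrative):

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    x0, x1, y0, y1 = 1, 0, 0, 1
    while b:
        q, a, b = a // b, b, a % b
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0

def bezout_product(a, b, c):
    """Multiply ax + cy = gcd(a,c) by bw + cz = gcd(b,c) and verify that
    ab(xw) + ac(xz) + bc(wy) + cc(yz) == gcd(a,c) * gcd(b,c)."""
    g1, x, y = egcd(a, c)
    g2, w, z = egcd(b, c)
    combo = a*b*(x*w) + a*c*(x*z) + b*c*(w*y) + c*c*(y*z)
    return combo == g1 * g2
```

Since the right-hand gcd divides each of $ab, ac, bc, cc$, it divides this combination, which is the divisibility the answer asks you to conclude.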
More generally you can prove:
$$\gcd(a,c)\cdot \gcd(b,d)=\gcd(ab,ad,bc,cd)$$
We can also use the simpler lemma:
Lemma: If $m,n,p$ are non-zero integers then $$\gcd(pm,pn)=p\gcd(m,n)$$ Proof: Solve $mx+ny=\gcd(m,n)$; multiplying by $p$ gives $(pm)x+(pn)y=p\gcd(m,n),$ therefore the left side divides the right side.
On the other hand, $\gcd(m,n)\mid m$ means $p\gcd(m,n)\mid pm.$ Likewise, $p\gcd(m,n)\mid pn,$ so the right side divides the left side.
Applying this several times gives:
$$\begin{align}\gcd(a,c)\cdot\gcd(b,d)&=\gcd(a\gcd(b,d),c\gcd(b,d))\\ &=\gcd\left(\gcd(ab,ad),\gcd(cb,cd)\right)\\ &=\gcd(ab,ad,cb,cd) \end{align}$$
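A quick numeric check of the general four-term identity (a sanity check with an illustrative function name, not a proof):

```python
from math import gcd

def general_identity(a, b, c, d):
    """Check gcd(a,c)*gcd(b,d) == gcd(ab, ad, cb, cd)."""
    lhs = gcd(a, c) * gcd(b, d)
    rhs = gcd(gcd(a * b, a * d), gcd(c * b, c * d))
    return lhs == rhs
```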
Just an example showing that the identity may fail in other rings. Start with the favorite counterexample ring $\mathbb{Z}[\sqrt{-5}]$. The elements $1\pm \sqrt{-5}$, $2$ have no common factor ($\ne \pm 1$), so we have $$\operatorname{gcd}( 1\pm \sqrt{-5},2) = 1$$ but $$\operatorname{gcd}( (1+\sqrt{-5})(1-\sqrt{-5}), (1+ \sqrt{-5})2,(1- \sqrt{-5})2, 2\cdot 2 ) \ne 1$$ since $2$ divides every entry.
Here the usual notion of gcd does not behave well.