In the integers, it follows almost immediately from the division theorem and the fact that $a \mid x,y \implies a \mid ux + vy$ for any $u, v \in \mathbb{Z}$ that the least common multiple of $a$ and $b$ divides any other common multiple.
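For concreteness, the argument I have in mind is the standard one (a sketch, writing $\ell = \operatorname{lcm}(a,b)$ and letting $m$ be an arbitrary common multiple):

```latex
Let $\ell = \operatorname{lcm}(a,b)$ and let $m$ be any common multiple of $a$ and $b$.
By the division theorem, $m = q\ell + r$ with $0 \le r < \ell$.
Since $a$ and $b$ divide both $m$ and $\ell$, they divide
$r = 1 \cdot m + (-q)\ell$ as well. So $r$ is a common multiple of
$a$ and $b$ that is smaller than $\ell$; minimality of $\ell$ forces
$r = 0$, i.e.\ $\ell \mid m$.
```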
In contrast, proving $e \mid a,b \implies e \mid \gcd(a,b)$ seems to be harder. In Elementary Number Theory, Jones & Jones do not attempt to prove this fact until after establishing Bézout's identity. This Wikipedia page has a proof that avoids Bézout's identity, but it looks convoluted to my eyes.
I tried my hand at it, and what I got seems no cleaner:
Proposition: If $e \mid a,b$, then $e \mid \gcd(a,b)$.
Proof: Let $d = \gcd(a,b)$ and suppose $e \nmid d$. By the division theorem there are $q$ and $c$ with $d = qe + c$ and $0 < c < e$ (the remainder is nonzero precisely because $e \nmid d$).
We have $a = k_1 d$ and $b = k_2 d$, so substituting gives $a = k_1 (qe + c)$ and $b = k_2 (qe + c)$. Since $e \mid a$, we get $e \mid k_1 c = a - k_1 q e$, and likewise $e \mid k_2 c$. Each $k_i c$ is trivially a multiple of $c$, so $k_1 c$ and $k_2 c$ are common multiples of $c$ and $e$.
Now let $l = \operatorname{lcm}(e, c)$. By the fact above, $l$ divides both $k_1 c$ and $k_2 c$. Since $c \mid l$, write $l = \phi c$ for some $\phi$; then $\phi c \mid k_i c$ gives $\phi \mid k_1, k_2$, and hence $d \phi \mid a, b$.
But we must have $\phi > 1$: otherwise $l = c$, implying $e \mid c$, which is impossible since $0 < c < e$. So $d \phi$ is a common divisor of $a$ and $b$ greater than $d$, a contradiction. $\Box$
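Not a proof, of course, but the proposition is easy to sanity-check by brute force over small integers (a throwaway script; the function name and range bound are my own choices):

```python
from math import gcd

def common_divisors_divide_gcd(limit=30):
    """Check e | a and e | b  =>  e | gcd(a, b) for all 1 <= a, b, e < limit."""
    for a in range(1, limit):
        for b in range(1, limit):
            d = gcd(a, b)
            for e in range(1, limit):
                if a % e == 0 and b % e == 0:
                    # e is a common divisor; the claim is that e divides d
                    assert d % e == 0, (a, b, e)
    return True
```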
Question: Is there a cleaner proof I'm missing, or is this seemingly elementary proposition just not very easy to prove without using Bezout's identity?