My question takes a little bit of preamble: it concerns a well-known and solved problem, but I am looking for a solution with a particularly nice property.
$\newcommand{\matrix}[4]{\left( \begin{array}{cc} #1 & #2 \\ #3 & #4 \end{array} \right)} \DeclareMathOperator{\lcm}{lcm}$In using Gaussian elimination to put a matrix into Smith normal form over $\mathbb{Z}$ (or, more generally, some PID), the last step is to make sure that successive diagonal entries divide each other. Solving this reduces to the following problem (with all matrices over $\mathbb{Z}$):
- given a diagonal matrix $M = \matrix{a}{0}{0}{b}$, find invertible matrices $L$, $R$ such that $$ M = L \matrix{\gcd(a,b)}{0}{0}{\lcm(a,b)} R $$
This is of course well-known and not hard to do. For instance, using the extended Euclidean algorithm to find $(x,y)$ such that $ax + by = d = \gcd(a,b)$, one can define
$$ L = \matrix{a/d}{-y}{b/d}{x}, \quad R = \matrix{1-yb/d}{yb/d}{-1}{1}$$ or alternatively $$ L = \matrix{a/d}{-1}{1-xa/d}{x}, \quad R = \matrix{1-yb/d}{b/d}{-y}{1}.$$
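(As a sanity check of my own, not part of the question: both pairs of matrices can be verified numerically. The helper names `ext_gcd`, `mul`, and `check` below are mine.)

```python
from math import gcd

def ext_gcd(a, b):
    # extended Euclidean algorithm: returns (d, x, y) with a*x + b*y = d = gcd(a, b)
    if b == 0:
        return (a, 1, 0)
    d, x, y = ext_gcd(b, a % b)
    return (d, y, x - (a // b) * y)

def mul(A, B):
    # product of two 2x2 integer matrices
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def check(a, b):
    d, x, y = ext_gcd(a, b)
    lcm = a * b // d
    D = [[d, 0], [0, lcm]]          # diag(gcd, lcm)
    M = [[a, 0], [0, b]]            # diag(a, b)
    # first pair from the post
    L1 = [[a // d, -y], [b // d, x]]
    R1 = [[1 - y * b // d, y * b // d], [-1, 1]]
    # second pair from the post
    L2 = [[a // d, -1], [1 - x * a // d, x]]
    R2 = [[1 - y * b // d, b // d], [-y, 1]]
    assert mul(mul(L1, D), R1) == M
    assert mul(mul(L2, D), R2) == M

for a in range(1, 20):
    for b in range(1, 20):
        check(a, b)
print("ok")  # prints "ok": both identities hold for all tested (a, b)
```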
However, in the special case where $a$ divides $b$, we know that $M$ is already as desired and so we could simply take $L = R = I$. My question is: can we find a general algebraic solution like the ones above (i.e. algebraic definitions of $L$, $R$ in terms of the integers $(x,y,a/d,b/d)$), but with the additional property that when $a$ divides $b$ (and so $x=1$, $y=0$, $a/d=1$), the solution yields $L = R = I$? Roughly: can we find an algebraic solution which only does something non-trivial if it needs to?
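(To make the obstruction concrete, a quick check of my own, assuming the Bézout step returns $x=1$, $y=0$ when $a \mid b$: substituting these into the first pair of matrices above does not yield the identity.)

```python
def special_case(a, b):
    # when a | b we may take d = a, x = 1, y = 0; plug into the first pair above
    assert b % a == 0
    d, x, y = a, 1, 0
    L = [[a // d, -y], [b // d, x]]
    R = [[1 - y * b // d, y * b // d], [-1, 1]]
    return L, R

L, R = special_case(2, 6)
I = [[1, 0], [0, 1]]
print(L == I, R == I)  # → False False: L = [[1,0],[3,1]], R = [[1,0],[-1,1]]
```

So even in the trivial case the formulas perform non-trivial row and column operations, which is exactly what the question asks to avoid.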