I'm more familiar with the saying as "a stopped clock is right twice a day", which is obviously correct for a standard 12-hour analog clock (with a possible exception involving leap seconds).
As for your question, what I would consider to be the key insight is that, if you have two clocks running at (different) constant speeds, $\tau$ and $\omega$, then the difference between them also changes at a constant rate, $|\tau-\omega|$. It follows immediately that, if the first clock is correct and working properly, then $\tau=1$, and the other clock's deviation from the correct time $t$ is $|1-\omega|t$ (measuring $t$ from a moment at which the two clocks agree).
Since clocks are periodic, if the broken clock is correct at $t=0$, it will also show the correct time at any $t$ such that $|1-\omega|t=(12\text{ hours})\times k$, i.e. $t=\frac{(12\text{ hours})\times k}{|1-\omega|}$ (for $k\in\mathbb{Z}$). The period of that, $T$, is clearly given by $T=\frac{12\text{ hours}}{|1-\omega|}$, and, since $\omega$ can be made as close as you choose to $1$, $T$ can be made as large as you wish.
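To see what that formula gives in practice, here is a minimal sketch in Python (the helper name `period_hours` and the sample speeds are purely illustrative assumptions, not anything from the question):

```python
def period_hours(omega):
    """Hours between correct readings of a 12-hour clock running at
    constant speed omega, from T = 12 hours / |1 - omega|."""
    return 12 / abs(1 - omega)

# Illustrative speeds (assumed figures):
print(period_hours(1 + 60 / 86400))  # gains 1 min/day: 17280 h = 720 days
print(period_hours(1 - 1 / 86400))   # loses 1 s/day: 1036800 h, about 118 years
print(period_hours(1 + 1e-12))       # nearly exact: 1.2e13 h; T blows up as omega -> 1
```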
That also makes intuitive sense. If a clock loses or gains time very slowly (say, a second per year), then it also takes a very long time for that error to become noticeable, and a much, much longer time before that error accumulates to 12 hours.
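Plugging the second-per-year figure into the same formula bears that out (a quick check, assuming a Julian year of 365.25 days):

```python
seconds_per_year = 365.25 * 86400       # Julian year, ~3.156e7 s
drift_rate = 1 / seconds_per_year       # |1 - omega| for a 1 s/year drift
T_seconds = 12 * 3600 / drift_rate      # T = 12 hours / |1 - omega|
print(T_seconds / seconds_per_year)     # 43200.0, i.e. about 43,200 years
```

The number isn't a coincidence: 12 hours is 43,200 seconds, and the error grows by one of those seconds each year.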
Meanwhile, a clock that's running at a high enough speed (forward or backward) will display every time multiple times per second, including the correct time. So, at least in theory, for any short, finite time interval $\delta t$, you can describe a clock that runs continuously at a fixed speed, is correct at some instant during the interval $[t,t+\delta t)$ for any time $t$, and is completely useless in practice.
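To put a number on "high enough": for the offset between the displayed and the true time to sweep through a full 12-hour cycle within every interval of length $\delta t$, the drift rate $|1-\omega|$ must be at least $\frac{12\text{ hours}}{\delta t}$. A small sketch of that (the name `min_fast_speed` is hypothetical):

```python
def min_fast_speed(delta_t_seconds):
    """Smallest constant speed omega > 1 at which a 12-hour clock is sure to
    show the correct time at least once in every interval of length delta_t:
    its offset from the true time drifts at |1 - omega| seconds per second,
    so it must cover the full 43,200 s cycle within delta_t."""
    return 1 + 12 * 3600 / delta_t_seconds

print(min_fast_speed(1.0))   # 43201.0: a full 12-hour sweep (and a bit) each second
print(min_fast_speed(0.5))   # 86401.0: twice that for half-second intervals
```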