In one of my undergrad theory or algorithms classes, I remember a professor sharing a quip that went something like
In practice, $\log(\log(N))$ is at most 9.
...the idea being that even though the function $f(N) = \log(\log(N))$ technically grows without bound, it's tiny for any value of $N$ that's remotely plausible in nearly any real-world problem, so an $O(\log(\log(N)))$ algorithm is effectively constant-time. For example, $\log_2(\log_2(\text{number of particles in the universe})) \approx 8$. (It's even smaller in base $e$ or base 10.)
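For concreteness, taking the commonly cited estimate of roughly $10^{80}$ particles:
$$\log_2(10^{80}) = 80\log_2(10) \approx 265.75, \qquad \log_2(265.75) \approx 8.05.$$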
Does anyone know who first (or most famously) said this?
Googling this with various parts in quotes turns up plenty of lecture notes and tutorials about big-O notation and how slowly $\log(N)$ and $\log(\log(N))$ grow. So far, though, the earliest example I've found is this lecture by Richard Borcherds from 2021, which is several years after I remember hearing the quip (and which doesn't credit anyone else as the source of the remark).