Let $A$ and $B$ be two real $n\times n$ matrices. Using Hadamard's inequality, it is not hard to show that $$ \left|\det A - \det B \right| \leq \|A-B\|_{2} \frac{\|A\|_{2}^n -\|B\|_{2}^n}{\|A\|_2 -\|B\|_2}, $$ where $\|A\|_2=\sqrt{\sum_{i,j}a_{ij}^2}$ is the Frobenius norm. From this I can derive a sup-norm bound, for example $$ \left|\det A - \det B \right| \leq n^{n+1} \|A-B\|_{\infty} \max (\|A\|_{\infty}^{n-1},\|B\|_{\infty}^{n-1}), $$ where $\|A\|_\infty=\sup_{i,j}|a_{ij}|$.
The constant $n^{n+1}$ is not the best possible: is there a reference (or a proof) for a better (or the best) one? I show below that one can obtain $n^2(n-1)^{n-1}$, but that isn't much better. I tried $10^5$ random matrices in Maple and obtained a maximal constant (much) smaller than one: this is not a proof, but it suggests there is room for improvement nevertheless.
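For anyone wanting to reproduce the experiment without Maple, here is a minimal NumPy sketch; the dimension, entry distribution, and number of trials are my own choices, not necessarily the original setup:

```python
import numpy as np

# Estimate the largest observed constant c in
#   |det A - det B| <= c * ||A-B||_inf * max(||A||_inf, ||B||_inf)^(n-1)
# over random matrices with uniform entries in [-1, 1].
rng = np.random.default_rng(0)
n, trials = 4, 10_000
worst = 0.0
for _ in range(trials):
    A = rng.uniform(-1.0, 1.0, (n, n))
    B = rng.uniform(-1.0, 1.0, (n, n))
    denom = np.abs(A - B).max() * max(np.abs(A).max(), np.abs(B).max()) ** (n - 1)
    worst = max(worst, abs(np.linalg.det(A) - np.linalg.det(B)) / denom)
print(worst)  # empirically far below n**(n+1) = 1024 for n = 4
```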
Just for completeness (and in case someone sees a factor I missed): to get the first bound, write $A=[A_1,\ldots,A_n]$ in terms of its column vectors; a telescoping expansion gives \begin{eqnarray*} \det A &=& \det (A_1 -B_1,A_2,\ldots,A_n) + \det (B_1,A_2,\ldots,A_n) \\ &=& \sum_{j=1}^n \det (B_1,\ldots, B_{j-1}, A_j -B_j,A_{j+1},\ldots,A_n) \\ && + \det B. \end{eqnarray*} Thus, by Hadamard's inequality, \begin{eqnarray*} \det A -\det B &\leq& \sum_{j=1}^n \|A_j-B_j\|_{2} \prod_{i=1}^{j-1} \|B_i\|_{2}\prod_{i=j+1}^{n} \|A_i\|_{2} \\ &\leq& \|A-B\|_{2} \sum_{j=1}^n \|B\|^{j-1}_{2}\|A\|^{n-j}_{2} \\ &=& \|A-B\|_{2} \frac{\|A\|_{2}^n -\|B\|_{2}^n}{\|A\|_2 -\|B\|_2}. \end{eqnarray*}
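A quick numerical sanity check of this bound (my own, not part of the argument), written with the telescoping sum so as to sidestep division when $\|A\|_2=\|B\|_2$:

```python
import numpy as np

# Verify |det A - det B| <= ||A-B||_F * sum_j ||B||_F^(j-1) ||A||_F^(n-j)
# on random Gaussian matrices; record the worst ratio observed.
rng = np.random.default_rng(1)
n = 5
worst_ratio = 0.0
for _ in range(1000):
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    a, b = np.linalg.norm(A), np.linalg.norm(B)  # Frobenius norms
    bound = np.linalg.norm(A - B) * sum(b ** (j - 1) * a ** (n - j)
                                        for j in range(1, n + 1))
    worst_ratio = max(worst_ratio,
                      abs(np.linalg.det(A) - np.linalg.det(B)) / bound)
print(worst_ratio)  # never exceeds 1 on these samples
```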
The second bound just combines $x^n -y^n\leq n \max(|x|^{n-1},|y|^{n-1}) |x-y|$ with $\|A\|_2 \leq n\|A\|_\infty$ (applied to $A$, $B$, and $A-B$).
Another approach is via calculus: write $\det B - \det A = f(1)-f(0)$, with $f(t)=\det(A + t(B-A))$.
By the mean value theorem, $|f(1)-f(0)|\leq \max_{t\in[0,1]} |f^\prime (t)|$.
By Jacobi's formula for the derivative of a determinant, $$f^\prime(t) = {\rm trace}\left({\rm Cofm}(A +t(B-A))^{T}(B-A)\right),$$ where ${\rm Cofm}$ denotes the matrix of cofactors.
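A quick finite-difference check of this derivative formula (note the transpose of the cofactor matrix, i.e. the adjugate). Computing the cofactor matrix as $\det(M)\,(M^{-1})^T$ is my shortcut, valid for invertible $M$:

```python
import numpy as np

# Check Jacobi's formula for f(t) = det(A + t(B-A)):
#   f'(t) = trace(Cofm(M)^T (B-A)),  with  M = A + t(B-A),
# against a central finite difference.
rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
t, h = 0.3, 1e-6
M = A + t * (B - A)
cofm = np.linalg.det(M) * np.linalg.inv(M).T  # matrix of cofactors
analytic = np.trace(cofm.T @ (B - A))
numeric = (np.linalg.det(A + (t + h) * (B - A))
           - np.linalg.det(A + (t - h) * (B - A))) / (2 * h)
print(analytic, numeric)  # these agree to several digits
```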
Then it should deliver something better, if there is a nice way to bound it. The simplest thing is to use Cauchy–Schwarz for the trace inner product, $$ |f^\prime(t)|\leq \|B-A\|_{2} \|{\rm Cofm}(A +t(B-A))\|_{2}, $$ and then, for lack of a better idea, $$ \|{\rm Cofm}(A +t(B-A))\|_{2}\leq n \max_{i,j} |{\rm Cof}_{i,j}(A +t(B-A))|, $$ and, brutally, $|{\rm Cof}_{i,j}(A +t(B-A))|\leq \left((n-1) \max(\|A\|_\infty,\|B\|_\infty)\right)^{n-1}$. This gives a slightly better constant, namely $$ n^2(n-1)^{n-1} <n^{n+1}, $$ but it is still a very rough way to bound a determinant: the last estimate is never sharp, since to attain it all entries of the minor would have to be equal, in which case the cofactor vanishes. The two constants are of the same order in $n$, and I suspect this order is wrong.
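To illustrate how loose the last step is, here is a small experiment of my own comparing $\max_{i,j}|{\rm Cof}_{i,j}(M)|$ with the brute bound $((n-1)\|M\|_\infty)^{n-1}$ for random matrices:

```python
import numpy as np

# Ratio of the largest cofactor to the brute bound ((n-1)*||M||_inf)^(n-1);
# Cofm(M) = det(M) * inv(M)^T for invertible M.
rng = np.random.default_rng(3)
n = 5
worst = 0.0
for _ in range(1000):
    M = rng.uniform(-1.0, 1.0, (n, n))
    cofm = np.linalg.det(M) * np.linalg.inv(M).T  # matrix of cofactors
    worst = max(worst,
                np.abs(cofm).max() / ((n - 1) * np.abs(M).max()) ** (n - 1))
print(worst)  # far below 1, confirming how rough this step is
```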