There's an answer to your question, sort of, below. But first: please don't do this! It's simply wrong, mathematically (unless you do it just right, as in the "answer" below - but doing that seems, if anything, more complicated than simply talking about limits before derivatives). We both know that for most of the students it's not going to matter much what you do - on the other hand, for the students who are later going to learn some calculus from a mathematically correct perspective, you're giving them wrong notions that they're going to have to unlearn. Like "division by zero is not allowed, except when you're doing calculus".
Calculus was nonsense mathematically for a long time; coming up with a mathematically valid version was one of the great accomplishments of Cauchy, Weierstrass et al. Ignoring that and reverting to the bogus version seems very sad.
(Also, not that this is the important issue, but it amounts to feeding the trolls. There are crackpots and web sites out there that assert that calculus is simply wrong. Typically the reason given is an explanation of why setting $h=0$ is invalid. Of course that's not a refutation of actual calculus, because in actual calculus we do not set $h=0$. But...)
My point, in the unlikely event that it's not clear: Say $f(x)=x^2$. If $h\ne0$ then $$\frac{f(x+h)-f(x)}h=2x+h.\tag{i}$$But now setting $h=0$ and concluding $f'(x)=2x$ is wrong, because (i) is only true for $h\ne0$.
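(To spell out the algebra behind (i), since this is exactly where the division happens: for $h\ne0$, $$\frac{(x+h)^2-x^2}h=\frac{2xh+h^2}h=2x+h,$$and the cancellation of $h$ in the last step is legitimate only because $h\ne0$.)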
You wouldn't even have to mention the word "limit" to give an unobjectionable version of the argument. Say (i) holds for $h\ne0$, and deduce that if $h$ is very small then the left side of (i) is very close to $2x$.
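To make "very close" quantitative (just a sketch of what one could say in class, still for $f(x)=x^2$): for $h\ne0$, $$\left|\frac{f(x+h)-f(x)}h-2x\right|=|h|,$$so the difference quotient differs from $2x$ by exactly $|h|$, which is as small as you like once $h$ is small. That is the limit argument in disguise, of course, but it never divides by zero.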
If someone asks why you don't just say "set $h=0$" give him or her a gold star and explain why it's not quite correct to put it that way.
Now, about the question of which functions the procedure works for: whether it works is really not a property of the function so much as a property of the formula used to represent the function. In an abstract sense it works for any differentiable function. Here's an easy lemma:
Lemma. The function $f$ is differentiable at $x$ if and only if there exists a function $g(h)$, defined near the origin and continuous at the origin, such that $(f(x+h)-f(x))/h=g(h)$ for small $h\ne0$; in this case $f'(x)=g(0)$.
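The proof is a one-liner in each direction, using nothing beyond the definition of the derivative (a sketch): if $f$ is differentiable at $x$, set $$g(h)=\begin{cases}\dfrac{f(x+h)-f(x)}h,&h\ne0,\\f'(x),&h=0;\end{cases}$$continuity of $g$ at the origin is then exactly the statement that $\lim_{h\to0}(f(x+h)-f(x))/h=f'(x)$. Conversely, if such a continuous $g$ exists, the difference quotient tends to $g(0)$ as $h\to0$, so $f$ is differentiable at $x$ with $f'(x)=g(0)$.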
So: the function $g(h)$ with $f'(x)=g(0)$ always exists; the question is whether you can find a formula for $g(h)$, $h\ne0$, such that the formula defines a function continuous at the origin.
Not that I'm saying you should do things in class from that point of view - surely that introduces more baggage than just talking about limits. My point is just that in some sense "it always works". For example if $f(x)=\sin(x)$ and $x=0$ then $$g(h)=\begin{cases}\frac{\sin(h)}h,&(h\ne0),\\1,&(h=0).\end{cases}$$Of course that doesn't actually help in finding the derivative, because there's the question of why setting $g(0)=1$ makes $g$ continuous at the origin.
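(For completeness, and as an illustration that you end up doing a limit argument anyway: the standard way to see that $g(0)=1$ is the right value is the squeeze $$\cos h\le\frac{\sin h}h\le1\qquad\text{for }0<|h|<\tfrac\pi2,$$which follows from the usual unit-circle comparison $\sin h\le h\le\tan h$; letting $h\to0$ forces $\sin(h)/h\to1$. None of that is supplied by "just set $h=0$".)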
But the function $g$ such that $f'(0)=g(0)$ is out there - the reason this doesn't "really work" is that we can't write down a simple formula for $g(h)$ that's valid both for $h\ne0$ and for $h=0$.