In some sense, what mathematicians have decided to call "the real numbers" is the biggest bunch of numbers that you can do arithmetic with (including dividing by nonzero numbers) in a way that works nicely with inequalities (so non-real complex numbers like $i$ aren't allowed), and that doesn't contain anything "infinitely big" (absolute value greater than every positive integer).
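In more official language, that last condition is usually called the Archimedean property: for every real number $x$ there is a positive integer $n$ with
$$|x| < n.$$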
If $h$ is "infinitely small" (absolute value less than $1/n$ for every positive integer $n$) but not zero, then $1/h$ would be infinitely big, and that isn't allowed for a real number. Most calculus books stick to the real numbers, and so you can't consider an "infinitely small" $h$ in a book like Spivak's Calculus.
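To spell out why: if $0 < |h| < 1/n$ for every positive integer $n$, then taking reciprocals flips the inequalities, so
$$\left|\frac{1}{h}\right| = \frac{1}{|h|} > n \quad\text{for every positive integer } n,$$
which is exactly what "infinitely big" means.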
But even if you go to a nonstandard book like Elementary Calculus: An Infinitesimal Approach, and allow/define nonreal numbers that can be "infinitely small", you still can't just say the derivative is "the value of the quotient when $h$ is infinitesimally small", because that value depends on which infinitesimal $h$ you pick. For example, if $f(x)=x^2$, then the difference quotient simplifies to $2x+h$, which has different values for different values of $h$.
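Writing that example out:
$$\frac{f(x+h)-f(x)}{h} = \frac{(x+h)^2 - x^2}{h} = \frac{2xh+h^2}{h} = 2x+h,$$
so different infinitesimal values of $h$ give (infinitesimally) different values of the quotient. Roughly speaking, the fix in a book like Keisler's is to take the standard part, i.e. discard the infinitesimal: $f'(x)=\operatorname{st}(2x+h)=2x$ for every nonzero infinitesimal $h$, which plays the role that taking a limit plays in the standard treatment.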
So some special care must be taken with limits/derivatives whether or not you stick to the real numbers. Since the vast majority of calculus/analysis texts stick to the real numbers (and for good reasons), I recommend you first learn from something like Spivak's Calculus, and then from a more advanced treatment of analysis that, say, constructs the real numbers. After that, you'll be better prepared to read about the unexciting reality of rephrasing the same arguments in terms of nonreal numbers that are "infinitely small", etc.