In my classes, the derivative is usually defined as "the limit of the incremental ratio" (the difference quotient). But I found another way to define the derivative in an old book, Gardner & Thompson's "Calculus Made Easy". For example, suppose we have $f(x) = x^2$ and we want to calculate the derivative:
$$ f(x) = x^2 $$
So our y is
$$ y = x^2 $$
Now we consider the increments $dy$ and $dx$ of $y$ and $x$, so we can compute:
$$ y+dy = (x+dx)^2 $$
$$ y+dy = x^2+2x\,dx+(dx)^2 $$
I can drop $(dx)^2$ because it is a second-order small quantity, negligible compared with the other terms at our order of magnitude. For instance, if $dx = 0.001$, then $(dx)^2 = 0.000001$, a thousand times smaller still.
The result is
$$ y+dy = x^2+2x\,dx $$
I subtract the original equation $y = x^2$:
$$ y+dy-y = x^2+2x\,dx-x^2 $$
$$ dy = 2x\,dx $$
$$ \frac{dy}{dx} = 2x $$
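As a quick numerical check (my own sketch, not from the book), here is a small Python snippet comparing the difference quotient of $f(x) = x^2$ with $2x$. The error in the quotient turns out to be exactly $dx$, which is the trace of the discarded $(dx)^2$ term:

```python
# A quick numerical sketch (my own check, not from the book):
# compare the difference quotient of f(x) = x^2 with 2x.
def f(x):
    return x ** 2

x = 3.0
for dx in (0.1, 0.01, 0.001, 0.0001):
    # (f(x+dx) - f(x)) / dx = (2x*dx + dx^2) / dx = 2x + dx
    quotient = (f(x + dx) - f(x)) / dx
    print(f"dx = {dx:<7} dy/dx = {quotient:.6f}  error = {quotient - 2 * x:.2e}")
```

The error shrinks in step with $dx$ itself, which is exactly the behaviour the book exploits when it throws $(dx)^2$ away.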
The derivative equals $2x$, and I computed it without using any limits. So my question is: is the derivative really a limit? What about the orders of magnitude? Is a limit just a representation of a value relative to our order of magnitude?
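For comparison, here is the same computation written with the limit definition from my class; the step "discard $(dx)^2$" seems to play the same role as letting $h \to 0$ after dividing:
$$ f'(x) = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h} = \lim_{h \to 0} \frac{2xh + h^2}{h} = \lim_{h \to 0}\,(2x + h) = 2x $$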