According to user137731 in the thread linked below, we define "little oh" as follows:
Definition: A function $f$ is called little oh of $g$ as $x\to a$, denoted $f\in o(g)$ as $x\to a$, if
$$\lim_{x\to a}\frac {f(x)}{g(x)}=0$$
Intuitively this means that $f(x)\to 0$ as $x\to a$ "faster" than $g$ does.
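For a concrete instance (my own example, not from the linked thread): take $f(x) = x^2$ and $g(x) = x$ with $a = 0$. Then
$$\lim_{x\to 0}\frac{x^2}{x}=\lim_{x\to 0}x=0,$$
so $x^2\in o(x)$ as $x\to 0$: both functions vanish at $0$, but $x^2$ shrinks far faster (at $x = 0.01$ the ratio is already down to $0.01$).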
Personally, I find the intuition behind "little oh" more difficult than the straightforward definition of it, so I need intuition for the intuition given above. Below is my attempt at understanding it. Does it make sense?
By definition, $\forall \epsilon > 0 \ \exists \delta > 0$ such that $0 < |x - a| < \delta \implies \left|\frac{f(x)}{g(x)}\right| < \epsilon$, meaning that for $x \in N_\delta(a)$ we can make $\left|\frac{f(x)}{g(x)}\right|$ as small as we like, effectively $\left|\frac{f(x)}{g(x)}\right| = 0$. But that can only happen if $f(x) = 0$ and $g(x) \neq 0$. Is this the sense in which $f(x)$ goes to $0$ faster than $g(x)$?
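To see how the quantifiers play out in the same toy example (again my own sketch): with $f(x) = x^2$, $g(x) = x$, $a = 0$, given any $\epsilon > 0$ we may take $\delta = \epsilon$, since
$$0 < |x| < \delta \implies \left|\frac{x^2}{x}\right| = |x| < \epsilon.$$
Here the ratio drops below every $\epsilon$ on a small enough punctured neighborhood, yet it never actually equals $0$, so I am not sure my reading above is right.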
Linked thread: How is the derivative truly, literally the "best linear approximation" near a point?
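(If I follow that thread correctly, little oh is exactly the language that makes "best linear approximation" precise: $f$ is differentiable at $a$ with derivative $f'(a)$ precisely when
$$f(a+h) = f(a) + f'(a)\,h + o(h)\quad\text{as } h\to 0,$$
i.e. the error of the linear approximation is little oh of $h$.)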