
According to user137731 in the thread linked below, we define "little oh" as follows:

Definition: A function $f$ is called little oh of $g$ as $x\to a$, denoted $f\in o(g)$ as $x\to a$, if

$$\lim_{x\to a}\frac {f(x)}{g(x)}=0$$

Intuitively this means that $f(x)\to 0$ as $x\to a$ "faster" than $g$ does.


Personally, I find the intuition behind "little oh" more difficult than the straightforward definition of it. So I need intuition for the intuition given above. Below is my attempt at understanding it. Does it make sense?

By definition, $\forall \epsilon > 0 \ \exists \delta \ |x - a| < \delta \implies \left|\frac{f(x)}{g(x)}\right| < \epsilon $ meaning for $x \in N_\delta(a)$, we have $\left|\frac{f(x)}{g(x)}\right| = 0$. But that only happens if $f(x) = 0, \ g(x) \neq 0.$ Is it the sense in which $f(x)$ is faster than $g(x)$ around $0$?


How is the derivative truly, literally the "best linear approximation" near a point?

  • You have confused the quantifiers; "for every $\epsilon$ there exists a neighborhood where $|f/g|<\epsilon$" doesn't mean there is a single neighborhood where $|f/g|=0$. – Ian Oct 04 '21 at 21:15
  • It may help to look at an example in some detail: try to understand what it means when we say $x^2=o(x)$ as $x \to 0$. – Ian Oct 04 '21 at 21:15
  • $f(x)$ doesn't necessarily ever reach $0$. What is $N_\delta(a)$? Note that $\delta$ can depend on $\epsilon$. It seems like you've accidentally swapped the quantifiers. – Karl Oct 04 '21 at 21:17
  • The better intuition is that $f$ is significantly smaller than $g$ around $a$, since we don’t need to have $f(x)\to0$ at all. For example $x\in o(x^2)$ for $x\to \infty$ (i.e. $a=\infty$). – Milten Oct 04 '21 at 21:17
  • @Milten, so $x \in o(x^2)$ for $x \to \infty$ means $x < x^2$ for $x$ around $\infty$ and $x^2 \in o(x)$ for $x \to 0$ means $x^2 < x$ for $x \approx 0?$ – user974374 Oct 04 '21 at 23:01
  • @user974374 yes! Of course, $o()$ is stronger than just $<$, but that’s the intuition. – Milten Oct 05 '21 at 05:44

2 Answers


Not exactly. The definition given is equivalent to the following:

$$f(x)=\omega(x)g(x)$$

with $\omega(x) \to 0$ as $x \to a$ and therefore

$$\lim_{x\to a}\frac {f(x)}{g(x)}=\lim_{x\to a}\frac {\omega(x)g(x)}{g(x)}=\lim_{x\to a}\omega(x)=0$$

For example, we have that

  • $x^2=o(x)$ as $x \to 0$ since $x^2 = x \cdot x$
  • $x=o(1)$ as $x \to 0$ since $x = x \cdot 1$
  • $x\log x=o(\log x)$ as $x \to 0^+$ since $x\log x = x \cdot \log x$
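In each of these the factor playing the role of $\omega$ is $x$ itself, which tends to $0$. Spelling the first one out against the definition:

$$\lim_{x\to 0}\frac{x^2}{x}=\lim_{x\to 0}x=0,$$

so $x^2\in o(x)$ as $x\to 0$.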
user

The quoted "intuition" is completely wrong. Take f(x) = x^2 |sin x| and g(x) = x^3 |sin x|. Both swing wilder and wilder but often get close to 0, as x -> infinity. f(x) -> 0 is most definitely not true. f(x) -> anything isn't true either.

Yet the ratio $f(x)/g(x) = 1/x$ becomes smaller and smaller, so $f(x) = o(g(x))$.
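Written out, for the $x$ where $\sin x\neq 0$ (so the quotient is defined),

$$\frac{f(x)}{g(x)}=\frac{x^2\,|\sin x|}{x^3\,|\sin x|}=\frac{1}{x}\to 0\quad\text{as }x\to\infty,$$

so $f\in o(g)$ as $x\to\infty$, even though $f$ itself tends to no limit at all.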

gnasher729