
I have been waging war with Landau notation throughout my entire math studies. We have used it in different variations for many traditional purposes, including algorithm runtimes.

But I still cannot come to terms with this notation, and cannot answer questions like "Is this function a big O of that function?", "Is it true that being a small o of a function implies being a big O of it?", "What is the complexity of this particular algorithm?", etc.

Every time I meet such questions I end up rereading the definitions on Wikipedia and the scarce descriptions in the books I've read, and I have no idea how to use these definitions to produce the answers I have to give. Also, in all the exercises where I have to find such O's, or prove that one function is an o of another, I just do not know where to start, or whether my solutions are correct.

Do you know, or could you recommend, some specific material that would let me get clear on these O's once and for all? I would need all the theory necessary for the notation, accompanied by exercises. After completing the exercises I want to be sure that my solutions are correct and that I have understood the usage.

Thank you.


Edit for Jose Brox:

Well, the confusion comes from not knowing how to use the definitions, and could be the subject of a separate question.

Example 1: find functions $g$ and $h$ such that $f(x) = o(g(x))$ and $f(x) = O(h(x))$ for $x \to -\infty$, where $f(x) = \frac{x^3 + x + 12}{x-4}$.

Step 1: question. When should I write $\in$, and when $=$?

Step 2: Why do we write $=$ if we mean $\in$?

Step 3: What should I write, and when?

Step 4: definitions.

  1. If $f = O(g)$ is defined as $\limsup_{x \to a} \left\vert \frac{f(x)}{g(x)}\right\vert < \infty$, then I have to find a function $h$, such that $\displaystyle \limsup_{x \to - \infty}\left\vert \frac{x^3 + x + 12}{(x-4) \cdot h(x)}\right\vert <\infty. $

Step 4.1.1 Why an absolute value?

Step 4.1.2 The limit should exist and be finite. Why $\limsup$? What is the difference from $\lim$? Why not $\liminf$?

Step 4.1.3 Solution, take 1. I have a function $fh(x) = \left\vert \frac{x^3 + x + 12}{(x-4) \cdot h(x)}\right\vert$ and have to find a sequence $(x_n)$ such that $fh(x_n) \to y < \infty$ as $x_n \to -\infty$. I am not sure I am right even up to this point. We used the limit superior for sequences, and I am not sure how to use it here. I have to find some $h$ such that the $\limsup$ of the absolute value of some fraction, a notion we used for sequences, is finite. BrainOverHeatException, Compiler Error, System.exit(1).

Step 4.1.4 Solution, take 2. Let's keep it simple: pick a random $h$ and see what happens. $h(x) = x^3$ looks good. Solution found. Have I learnt something? Nope, pure luck. What shall I do if the solution is not as obvious? No idea.
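One thing I could at least do is sanity-check a candidate $h$ numerically before trying to prove anything (not a proof, of course, and the sample points are arbitrary):

```python
# Numerically inspect |f(x) / h(x)| for candidate h's as x -> -infinity.
# A bounded ratio is evidence (not proof) that f = O(h); a ratio tending
# to 0 is evidence that f = o(h).
def f(x):
    return (x**3 + x + 12) / (x - 4)

candidates = {
    "h(x) = x^2": lambda x: x**2,
    "h(x) = x^3": lambda x: x**3,
}

for name, h in candidates.items():
    ratios = [abs(f(x) / h(x)) for x in (-1e2, -1e4, -1e6)]
    print(name, ratios)
```

For $h(x) = x^2$ the ratios stay near $1$; for $h(x) = x^3$ they shrink toward $0$.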

Step 4.1.5 Conclusion. I have no idea how this works or what the definition means. Ask me whether there is a function that bounds $f$ more tightly, and I will stare at $\limsup_{x \to a} \left\vert \frac{f(x)}{g(x)}\right\vert < \infty$ and not know what to say.

  2. If $f = O(g)$ means $\exists C>0 \ \exists x_0 > 0 \ \forall x> x_0: \vert f(x)\vert \le C \cdot \vert g(x)\vert$, then I have to find some $h$ such that $\exists C>0 \ \exists x_0 > 0 \ \forall x> x_0: \left\vert\frac{x^3 + x + 12}{x-4}\right\vert \le C \cdot \vert h(x)\vert.$

Step 4.2.1 I have to find an $h$ such that there exists a $C$. What?!

LogicOutOfBoundException, Compiler Error, System.exit(1).

Step 4.2.2 Ok, let's play the old game: for a random $h$, check whether there is such a $C$. Hm, I thought I had to find $h$, not pick it at random. Ok, but then how can I tell whether there is a $C$ if I do not know $h$?

terminated

Step 4.2.3 I have no idea whether there is a $C$ or not. Even if there is, I have no idea whether there is an $x_0$ depending on $C$. I have no definite method for answering such existence questions. Maybe there is justice in the world, but it depends on the $h$ that I have to find...

Without describing further attempts I can just say: I have no idea how to use this series of quantifiers to solve my exercise, and I see no systematic way to combine them.

In many cases definitions really define something. You read them, and you understand what we are talking about: is this a normed space? Well, let's check the conditions. If it satisfies all the conditions, it is a normed space.

In the case of Landau notation, definitions usually start with "We say that $f$ is a big O of $g$...". So we are really saying one thing but meaning something completely different.

Returning to the question you've asked: what is my confusion about these definitions? I do not understand them, I do not understand what they define, and I do not understand how to use them even for this particular exercise I've written here. At my level, the two definitions seem to be two different approaches to defining something: some people use quantifiers, some use limits. But I do not understand how they do that or what they mean.

Example 2 was actually meant to be an example where I have to determine the complexity of some algorithm. That would be (in steps): 1) estimating the number of steps, 2) approximating it with some "O". I have no idea how to find such an O either. I would usually go back to the definitions and play the analogous game from step 4 onwards...
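To be concrete about step 1, here is the kind of step-count I mean (a toy example of my own: a double loop making $n(n-1)/2$ comparisons, which I would then want to approximate by some "O", presumably $O(n^2)$):

```python
# Count the comparisons made by a naive all-pairs check on n items.
# The inner loop runs (n-1) + (n-2) + ... + 1 = n(n-1)/2 times.
def count_comparisons(n):
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1  # one comparison of item i with item j
    return count

for n in (10, 100, 1000):
    print(n, count_comparisons(n), n * (n - 1) // 2)
```

The printed count matches the closed form $n(n-1)/2$; it is the jump from that exact count to "$O(n^2)$" that I do not know how to justify.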

All in all, the "O"-question becomes for me a non-NP problem, where I just waste time searching for a solution. There are many interesting things to do in math. Instead, I continue my war against some notation invented by somebody else a long time ago.

...

  • The problem you are facing has only one solution. Practice. – 5xum Jan 22 '18 at 13:28
  • @5xum The OP seems to be asking precisely that: what is a good source to get the basics, several exercises, and guided solutions to them? – Jose Brox Jan 22 '18 at 13:31
  • Practice, yes, but one should know how to practice. Assume I want to become an actor. If I practice a crooked walk one hour a day, I will be a professional crooked walker in a year, but that has nothing to do with a beautiful walk on the stage. One should know what to practice and in what manner. @5xum – jupiter_jazz Jan 22 '18 at 13:33
  • I think you will find this question of mine helpful. It’s a big-$O$ cheat sheet. – gen-ℤ ready to perish Jan 22 '18 at 13:46
  • @ChaseRyanTaylor Well, thank you. A similar table is available on Wikipedia. But, as mentioned in the comments to your question, some of the notations are correct and some may not be: the limit doesn't always exist. Moreover, the annotations to the table on Wikipedia distinguish between number theory and complexity theory and introduce different definitions using quantifiers, which confuses me personally a bit more. Looking at all this together, I really do not know how to learn this; it seems that I have to use two different notations for $f = \frac{\sin x}{x}$ and for the complexity of a greedy algorithm. – jupiter_jazz Jan 22 '18 at 14:02
  • @Kirill Can you elaborate on your confusion regarding the different definitions? If there is some specific source of confusion, perhaps someone here may be able to resolve it. – Jose Brox Jan 22 '18 at 14:06
  • @JoseBrox I've added a big edit, proportional to my confusion. The whole confusion is a mix of misunderstandings, and the reason for the question I asked originally. – jupiter_jazz Jan 22 '18 at 15:35

2 Answers


A.J. Hildebrand started some (free) lecture notes on the subject, but wrote only two chapters. I think the second one, "Asymptotic notations", may be close to what you are looking for. You can access it here:

https://faculty.math.illinois.edu/~hildebr/595ama/

I'd also have a look at the first one, just for completeness.

Jose Brox

I'm tackling your specific doubts in a separate answer. I have the feeling that you have two distinct sources of confusion:

1) You don't understand the main underlying idea behind the use of these notations.

2) You are not acquainted with limits that well (what do they mean, how to compute them).

First I will address 1): Asymptotic calculus and notations are not meant to complicate matters, but to simplify them! (You should indeed read the first chapter of Hildebrand's notes!). We use them when we feel we have in our hands some "intractable" function, which grows awkwardly and which we cannot control well. Then we "replace" it with some other function, better behaved, easier to understand, and related to the original one by some bound or growth relationship (as good as we can find); then we use the corresponding notation to indicate what that relation exactly is.

This lack of control happens more often than not when the source of the function is combinatorial, number-theoretic, or computational, and then the function we choose to bound it is "analytic". For example, knowing, for every $n$, exactly how many prime numbers there are below $n$ is hard, but we can prove (using some nontrivial mathematics) that this number, called $\pi(n)$, is "more or less" $n/\log(n)$ (here $\log$ stands for the natural logarithm). But what exactly do we mean by "more or less"? In this case, what we can prove is that the relative error of approximating $\pi(n)$ by $n/\log(n)$ gets smaller and smaller as $n$ grows, to the point of being $0$ in the limit; that is, $$\lim_{n\rightarrow\infty}\frac{\pi(n)-n/\log(n)}{\pi(n)}=0.$$

As you can imagine, this situation in which the relative error of two functions $f,g$ goes to $0$ as $n$ grows is pretty usual and quite useful, so we have a notation for it; we say $f\sim g$ (read: $f$ is asymptotically equivalent to $g$). We can prove that $f\sim g$ if and only if $$\lim_{n\rightarrow\infty}\frac{f}g=1$$ ($g$ should not vanish infinitely often, etc.). Another formulation of this definition is that for every small quantity $e>0$ there is $N$ big enough such that if $n$ is greater than $N$ then $$(1-e)g(n)\leq f(n)\leq (1+e)g(n).$$ For example, there is an $N$ such that $$0.9(n/\log n)\leq\pi(n)\leq1.1(n/\log n)$$ for all $n\geq N.$
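You can watch this ratio crawl toward $1$ numerically (a small sketch of mine, not part of any proof; the sieve helper is my own):

```python
# Compare pi(n) with n/log(n): their ratio tends to 1, but slowly.
import math

def primes_below(n):
    """Sieve of Eratosthenes: number of primes strictly below n."""
    sieve = [True] * n
    sieve[0] = sieve[1] = False
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = [False] * len(sieve[p*p::p])
    return sum(sieve)

for n in (10**3, 10**4, 10**5, 10**6):
    pi_n = primes_below(n)
    print(n, pi_n, pi_n / (n / math.log(n)))
```

Even at $n=10^6$ the ratio is still around $1.08$, which is why the $e$-formulation above only promises the bounds from some (possibly large) $N$ onwards.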

2) As you can see, a good understanding of limits is unavoidable in this subject. Let us tackle your first example: we have $f(x)$, a quotient of a third-degree polynomial by a first-degree one (so it "kind of" behaves like a second-degree one). Let us find $h(x)$ such that $f=O(h)$ when $x$ tends to $-\infty$. What this means is that there exists a fixed positive constant $C$ such that $|f(x)|\leq C|h(x)|$ for all negative values of $x$ large enough in absolute value (greater than some $|x_0|$); for example, for $x\rightarrow\infty$ it is true that $1=O(x)$ (pick $C=1$ and $x_0=1$, for example) but also $1=O(x^2)$ (same constants), $1=O(x+1)$ ($C=1$, $x_0=0$), and even $1=O(\log x)$ ($C=1$, $x_0=e$). Be sure that you really understand these examples.
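The witness constants above can be checked mechanically on sample points (a sketch, not a proof; the grid of test points is arbitrary):

```python
# Check the witnesses from the examples above on a grid of sample points:
#   1 = O(x)      with C=1, x0=1
#   1 = O(x^2)    with C=1, x0=1
#   1 = O(x+1)    with C=1, x0=0
#   1 = O(log x)  with C=1, x0=e
import math

cases = [
    (lambda x: x,           1.0, 1.0),
    (lambda x: x**2,        1.0, 1.0),
    (lambda x: x + 1,       1.0, 0.0),
    (lambda x: math.log(x), 1.0, math.e),
]

for g, C, x0 in cases:
    # the defining inequality |1| <= C*|g(x)| for sampled x > x0
    assert all(1 <= C * abs(g(x0 + k)) for k in range(1, 1000))
print("all witness constants hold on the sampled points")
```

Of course a finite sample proves nothing by itself, but writing the check forces you to plug concrete $C$ and $x_0$ into the quantifier definition, which is exactly the skill the exercise wants.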

First of all, it is obvious that $f=O(f)$ with $C=1$ and any $x_0$, although that may not satisfy you (but in this case, $f$ is not that badly behaved!). We can get something simpler: if $x<0$, then the denominator satisfies $|x-4|>4$, so $|f(x)|<|x^3+x+12|$, hence $f=O(x^3+x+12)$ (with $C=1$, $x_0=0$, for example). We can simplify further: from some $x_0$ onwards we have $|x^3+x+12|\leq|12x^3+36x^2+36x+12|=12|(x+1)^3|$, so $f=O((x+1)^3)$ (for example with $C=12$, $|x_0|$ big enough).
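A quick numerical spot-check of these two bounds (a sketch; the sample grid and the cutoff $x\le-2$ are my own choices):

```python
# Spot-check, for sufficiently negative x:
#   |f(x)| <= |x^3 + x + 12|         (since |x - 4| > 4 > 1 for x < 0)
#   |x^3 + x + 12| <= 12*|x + 1|^3   (holds from around x <= -2 onwards)
def f(x):
    return (x**3 + x + 12) / (x - 4)

for x in [-2.0 - 0.5 * k for k in range(200)]:  # x = -2, -2.5, ..., -101.5
    p = abs(x**3 + x + 12)
    assert abs(f(x)) <= p
    assert p <= 12 * abs(x + 1)**3
print("both bounds hold on the sampled points")
```

Note that the second bound fails near $x=-1$ (where $(x+1)^3=0$), which is exactly why the definition only demands the inequality for $|x|$ beyond some $|x_0|$.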

It is different if we want to get the "best simple" bound we can. Then, as $x$ tends to $-\infty$, $f(x)$ behaves more and more like $x^2$, so we should be able to prove $f=O(x^2)$, and indeed we can: as an informal proof, plot the graphs of $f(x)$ and $x^2$ with your favourite program and see that $x^2$ dominates $f(x)$ from some point leftwards (so we can pick $C=1$).
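In the same informal spirit as the plot, one can tabulate the ratio numerically (a sketch, not a proof):

```python
# The ratio |f(x) / x^2| stays below 1 and approaches 1 as x -> -infinity:
# the numerical fingerprint of f = O(x^2) with C = 1.
def f(x):
    return (x**3 + x + 12) / (x - 4)

for x in (-10.0, -1e2, -1e4, -1e6):
    print(x, abs(f(x) / x**2))
```

Seeing the ratio settle just under $1$ is what suggests $C=1$ as a witness; turning that observation into a proof is a routine inequality once you believe it.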

Now we want $g$ such that $f=o(g)$ when $x\rightarrow -\infty$. What this means is that $$\lim_{x\rightarrow -\infty} \frac{f(x)}{g(x)}=0.$$ This is easy enough: since $f(x)$ behaves as $x^2$, pick something that grows sufficiently faster, for example $g(x)=x^3$. Therefore $f(x)=o(x^3)$. Let us prove it:

$$\lim_{x\rightarrow -\infty} \frac{(x^3+x+12)/(x-4)}{x^3} = \lim \frac{x^3+x+12}{x^3(x-4)}=\lim\frac{1+1/x^2+12/x^3}{x-4}=0,$$ since the numerator tends to $1$ while the denominator tends to $-\infty$.
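Numerically, the ratio indeed shrinks toward $0$ (an informal check of the limit just computed, on a few sample points of my choosing):

```python
# Watch f(x) / x^3 tend to 0 as x -> -infinity: the numerical face of f = o(x^3).
def f(x):
    return (x**3 + x + 12) / (x - 4)

for x in (-1e1, -1e2, -1e3, -1e4):
    print(x, f(x) / x**3)
```

Each tenfold step in $|x|$ shrinks the ratio by roughly a factor of ten, matching the $\frac{1}{x-4}$ behaviour in the computation above.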

About the use of quantifiers: limits are defined in terms of them, so it is not strange that you can pick a definition involving a limit, play with it, and get an equivalent definition based on inequalities and absolute values.

About the use of limsup and lim: see O-Notation: Limsup vs. Lim. Intuitively, $f=O(g)$ means that $f$ grows slower than $g$, or comparatively equal to $g$, from a point onwards (take into account that $x=O(x^2)$, but also $2x^2=O(x^2)$!), while $f=o(g)$ means that $g$ grows certainly faster than $f$, from a point onwards.
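A concrete case where $\lim$ fails but $\limsup$ works (my own illustrative example, not from the linked question): take $f(x)=x\,(2+\sin x)$. Then $f(x)/x = 2+\sin x$ oscillates between $1$ and $3$, so $\lim f/x$ does not exist, yet $\limsup |f(x)/x| = 3 < \infty$, and indeed $f=O(x)$:

```python
# f(x) = x*(2 + sin x) satisfies f = O(x) even though lim f(x)/x does not exist:
# the ratio |f(x)/x| oscillates between 1 and 3, so limsup = 3 < infinity.
import math

ratios = [abs(2 + math.sin(x)) for x in [10.0 + 0.1 * k for k in range(5000)]]
print("min ratio ~", min(ratios))
print("max ratio ~", max(ratios))
assert all(1 <= r <= 3 for r in ratios)
```

This is why the limsup-based definition is the more robust one: it asks only that the ratio stay eventually bounded, not that it settle down.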

About the use and misuse of $=$ and $\in$: you may be right, depending on the context. But in some cases you can replace membership in a set with equality with some equivalence class, via an equivalence relation (as with asymptotic equivalence, for example).

I think I will leave it here. Two final pieces of advice, mirroring the beginning of this answer:

1) Keep in mind that we are seeking simple analytic functions that we can easily understand and manipulate to compare and bound our initial, difficult functions.

2) Get practice with limits and their definitions.

Jose Brox