
In the question Big O Notation "is element of" or "is equal", it is said that:

Example: you have two functions $n \mapsto f(n) = n^3$ and $n \mapsto g(n) = n^2$

Obviously $f$ is asymptotically faster than $g$. [...] $f(n) \in O(g(n))$

Why is it "faster" and not "slower"?

I am somewhat familiar with Big $O$ notation for algorithms, but not for functions, and I am used to saying that $O(n^3)$ is slower than $O(n^2)$, asymptotically.

Note: I believe that the OP said "slower", but somebody edited it to "faster".

Xaphanius
  • Function $n^3$ grows faster than function $n^2$. An algorithm that runs in $O(n^3)$ is slower than one that runs in $O(n^2)$ because its runtime grows faster with $n$. It's all a matter of perspective. – Fabio Somenzi Dec 21 '16 at 18:57
  • Because if $n$ is very big, $n^{3}$ will be very large compared to $n^{2}$, and the bigger $n$ is, the bigger the gap between $n^{3}$ and $n^{2}$. So, asymptotically (that is, when $n$ is very big), $f$ grows faster. We can also see this by looking at $f(n)/g(n)$ as $n$ gets bigger: this quotient is $n$ and thus increases rapidly as $n$ increases (the limit is written out below). – MoebiusCorzer Dec 21 '16 at 18:58
  • @FabioSomenzi and MoebiusCorzer, thank you for the replies. Here we are talking about the growth of a function rather than the "cost of the algorithm". – Xaphanius Dec 21 '16 at 19:07
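To make the quotient argument from the comments explicit, here is a short worked limit (just MoebiusCorzer's observation written out):

$$\lim_{n\to\infty}\frac{f(n)}{g(n)} = \lim_{n\to\infty}\frac{n^3}{n^2} = \lim_{n\to\infty} n = \infty.$$

So $f$ eventually exceeds every constant multiple of $g$: as a function, $f$ grows faster. For an algorithm, that same fact means more work on large inputs, so an algorithm whose running time grows like $n^3$ is slower than one whose running time grows like $n^2$.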

0 Answers