Two algorithms that solve a particular problem can have their efficiency compared using the $O$ and $o$ notation. However, this is a very crude method, and it tells us nothing about how much more efficient one is than the other.

Is there a yardstick that can be applied to ALL algorithms, with ALL time complexities, that can be used to evaluate the relative efficiency of two algorithms?

I actually developed a method for this, but it applies only to polynomial algorithms.

MY METHOD.

Suppose we want to compare two algorithms for accomplishing a task, with running times $f_i(n)$ and $g_i(n)$, where $f_i(n) = o\left(g_i(n)\right)$.
A simple way to compare $f_i$ and $g_i$ is to find their ratio $r_i(n)$:
$$r_i(n) = \frac{g_i(n)}{f_i(n)}$$
Express $r_i(n)$ as a power of $n$:
$$r_i(n) = n^{k}, \quad k \in \Bbb R$$
$$m = \lceil{k}\rceil$$

Let $R_i(n)$ be the $m^{\text{th}}$ derivative of $r_i(n)$.
It follows that $R_i(n)$ is a constant.
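To make the steps above concrete, here is a minimal Python sketch for the simple case where both running times are pure powers of $n$ (the function name `ratio_score` and the exponent arguments `a` and `b` are illustrative choices, not anything defined in the question):

```python
import math


def ratio_score(a: float, b: float) -> float:
    """Score two running times f(n) = n**a and g(n) = n**b, with a <= b.

    Their ratio is r(n) = n**k with k = b - a.  With m = ceil(k), the
    m-th derivative of n**k has coefficient k * (k-1) * ... * (k-m+1);
    when k is an integer this derivative is the constant k!.
    """
    k = b - a
    m = math.ceil(k)
    # Coefficient of n**(k - m) in the m-th derivative of n**k.
    coeff = 1.0
    for i in range(m):
        coeff *= (k - i)
    return coeff


# Comparing n^2 against n^5: r(n) = n^3, m = 3, score = 3! = 6.
print(ratio_score(2, 5))
```

Note that when $k$ is not an integer, the $m^{\text{th}}$ derivative is $\text{coeff} \cdot n^{k-m}$, which is not constant; the sketch returns only the coefficient in that case.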

My method works nicely for polynomial algorithms, but is completely useless for other time complexities. Is there a more effective yardstick? One that applies to all complexities?

David Richerby
Tobi Alafin
  • 1. What makes you think using asymptotic running time gives "no information on how more effective one is than the other"? It tells you which one (if any) has better asymptotic worst-case running time, ignoring constant factors. 2. Are you trying to ask, how can we compare the running time of algorithms while paying attention to constant factors (i.e., without ignoring constant factors)? Or are you asking about asymptotic running time (ignoring constant factors)? I noticed you tagged your question [tag:asymptotics] but I'm not sure how much to read into that.
  • – D.W. Dec 07 '16 at 23:21
    Please don't use a spoiler to show a block of text with LaTeX in the question, because it becomes unreadable. The Landau notation is used for asymptotic behaviour. From the mere occurrence of "ALL" in your question, the answer will be no. Also, you are trying to compare efficiency; this requires additional considerations like cache, architecture, etc. Even if you manage to show some fine-grained analysis that suits your needs for polynomials, expressing larger complexities (more than doubly exponential, functions like Ackermann's) will be cumbersome. I do not understand your goal. – Evil Dec 07 '16 at 23:21
  • A function that can compare any two algorithms, whether they have the same time complexity or not, and produce a 'score' describing the effectiveness of one algorithm over the other. Like what I did with my polynomial method, but applicable to all time complexities. A yardstick. – Tobi Alafin Dec 07 '16 at 23:26
  • @D.W. $o$ notation may tell me $f(x)$ is more efficient than $g(x)$, but how much MORE efficient? I don't think I can get that from it. I'm ignoring constant factors. Asymptotic running time. When I say efficiency, I'm talking purely about asymptotic efficiency: worst-case performance using the RAM model of computation and the $O$ notation. Implementation isn't considered. I want a method to assign scalable, consistent scores to an algorithm that operates in inverse Ackermann time, and one that operates in Ackermann time. – Tobi Alafin Dec 07 '16 at 23:34
  • @Evil above, basically a method like mine, but one that applies to all time complexities. Basically a scoring system for algorithms. – Tobi Alafin Dec 07 '16 at 23:35