
I'm trying to understand algorithm complexity, and a lot of algorithms are classified as polynomial. I couldn't find an exact definition anywhere. I assume it is the complexity that is not exponential.

Do linear/constant/quadratic complexities count as polynomial? An answer in simple English will be appreciated :)

Raphael
Oleksiy
    Time that is bounded by a polynomial in $n$. An example is $n\log n$, bounded by $n^2$. A counterexample is $e^n$, which cannot be bounded by a polynomial. –  Apr 13 '22 at 20:13

3 Answers


First, consider a Turing machine as a model of the algorithm at hand (you can use other models too, as long as they are Turing-equivalent). When you provide an input of size $n$, you can think of the computation as the sequence of the machine's configurations after each step, i.e., $c_0, c_1, \ldots$ . Hopefully the computation is finite, so there is some $t$ such that $c_t$ is the final configuration. Then $t$ is the running time of the given algorithm on that input of size $n$.

An algorithm is polynomial (has polynomial running time) if for some $k,C>0$, its running time on inputs of size $n$ is at most $Cn^k$. Equivalently, an algorithm is polynomial if for some $k>0$, its running time on inputs of size $n$ is $O(n^k)$. This includes linear, quadratic, cubic and more. On the other hand, algorithms with exponential running times are not polynomial.
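This definition can be checked numerically for a sample of input sizes. The following Python sketch is my own illustration (the function `is_dominated` and the sampled range are not part of the answer); it tests whether $f(n) \le Cn^k$ holds over the sample:

```python
import math

def is_dominated(f, k, C, ns):
    """Check empirically that f(n) <= C * n**k for every sampled size n."""
    return all(f(n) <= C * n**k for n in ns)

ns = range(2, 200)

# n log n is polynomial: it is bounded by n^2 (take C = 1, k = 2).
print(is_dominated(lambda n: n * math.log(n), k=2, C=1, ns=ns))   # True

# e^n is not polynomial: no fixed C and k work; even C = 10^6, k = 10 fails here.
print(is_dominated(lambda n: math.exp(n), k=10, C=10**6, ns=ns))  # False
```

Of course, a finite sample can only falsify a bound, not prove it; the actual definition quantifies over all $n$.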

There are things in between - for example, the best known algorithm for factoring runs in time $O(\exp(Cn^{1/3} \log^{2/3} n))$ for some constant $C > 0$; such a running time is known as sub-exponential. Other algorithms could run in time $O(\exp(A\log^C n))$ for some $A > 0$ and $C > 1$, and these are known as quasi-polynomial. Such an algorithm has very recently been claimed for discrete log over fields of small characteristic.

Yuval Filmus

Running an algorithm takes some computing time, depending mainly on how complex the algorithm is. Computer scientists have devised a way to classify algorithms based on how many operations they need to perform (more operations take more time).

One such class is polynomial time complexity: the number of operations is proportional to $n^c$, where $n$ is the size of the input and $c$ is some constant. The name comes from $n^c$ being a polynomial.

There are other 'types' of algorithms: some take constant time irrespective of the size of the input, and some take $2^n$ time (yes, really slllooooww most of the time).
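As a concrete sketch of the "count the operations" idea (this harness is my own illustration, not from the answer), a doubly nested loop performs about $n^2$ basic operations, so it falls in the polynomial class with $c = 2$:

```python
def count_ops_quadratic(n):
    """Count the basic operations of a doubly nested loop: about n^2 of them."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one basic operation per inner iteration
    return ops

print(count_ops_quadratic(10))   # 100
print(count_ops_quadratic(100))  # 10000
```

Doubling the input size roughly quadruples the operation count, which is the signature of quadratic (and hence polynomial) growth.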

I have oversimplified this for the layman and may have introduced errors, so read more here: https://stackoverflow.com/questions/4317414/polynomial-time-and-exponential-time

mixdev

In layman's terms, it is the running time of your algorithm.

The order (growth) of algorithms can be expressed in big-oh (O), little-oh (o), omega (Ω), or theta (Θ) notation.

If you are having problems calculating RR, please view some questions I asked before and vote if you understand.

Say you have a for loop:

 for (i = 1; i <= n; i++)
     x++;

The order, or time complexity, of this piece of code is O(n).

Why big-oh? Because we want a bound on the worst case in which this piece of code runs.
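To see why the loop above is O(n), you can count its operations directly (this counting harness is my own sketch, not part of the original answer):

```python
def loop_ops(n):
    """Mirror the loop in the answer: x++ runs once per value of i."""
    x = 0
    for i in range(1, n + 1):
        x += 1
    return x

print(loop_ops(5))    # 5
print(loop_ops(100))  # 100 -- the count grows linearly with n, hence O(n)
```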

Read here (these articles define complexity classes and explain which problems are believed to be solvable, or not, in polynomial time):

 http://en.wikipedia.org/wiki/NP_(complexity)

 http://en.wikipedia.org/wiki/NP-complete

 http://en.wikipedia.org/wiki/NP-hard

Summary:

http://www.multiwingspan.co.uk/a23.php?page=types

Raidenlee
    This doesn't exactly answer the question: polynomial time is not "the running time of your algorithm". Instead, the runtime of an algorithm can be polynomial, and so on. You could make the answer better by making it more precise. For example, do we really need to read through 3 Wikipedia articles about something, or do we actually even need to know anything about Big Oh? – Juho Aug 07 '13 at 15:16