
Below, assume we're working with an infinite-tape Turing machine.

When explaining the notion of time complexity to someone, and why it is measured relative to the input size of an instance, I stumbled across the following claim:

[..] For example, it's natural that you'd need more steps to multiply two integers with 100000 bits than, say, to multiply two integers with 3 bits.

The claim is convincing, but somewhat hand-wavy. In all the algorithms I have come across, the larger the input size, the more steps the algorithm needs. In more precise words: the time complexity is a monotonically increasing function of the input size.

Is it the case that time complexity is always an increasing function of the input size? If so, why is it the case? Is there a proof of that beyond hand-waving?

Kaveh
  • "Directly proportional" has a specific mathematical meaning that means, essentially linear time. In other words, if your input has size $n$, if the time is directly proportional the algorithm runs in time $cn$. I'd imagine that's not what you mean, as many algorithms do not run in linear time, i.e. sorting. Can you explain further? – SamM Aug 13 '12 at 23:12
  • So you're asking about an algorithm that runs in $o(1)$ time? $O(1)$ means the algorithm runs in the same time regardless of input size, $o(1)$ means it runs faster as the input gets larger.

    I can't think of one that runs in that time off the top of my head, but the notation is fairly common because an algorithm will often run in something like $O(n^2) + o(1)$ time; in other words, it takes $O(n^2)$ time, plus some other terms that grow smaller as the input gets larger.

    – SamM Aug 13 '12 at 23:28
  • Good question. What about the counter-example of computing the prime factors of $c / n$ for some large $c$ (this is only an increasing function for $n \geq c$)? @Sam Note that an increasing function says that the time must be decreasing at some point along the real line (i.e. $f(b) < f(a), a < b$). – Casey Kuball Aug 13 '12 at 23:32
  • @Darthfett I'm afraid I don't follow. Not all increasing functions are decreasing at some point along the real line. – SamM Aug 13 '12 at 23:42
  • @Jennifer Yes, I understand, that makes sense. I'd recommend using the term $o(1)$ as it has the meaning you're looking for. And I'd like to reemphasize that direct proportionality implies linearity; I see what you're getting at but it may be confusing to those who are reading the question for the first time. – SamM Aug 13 '12 at 23:43
  • You could trivially define an algorithm that does not have non-decreasing worst case running time. – Andrew Aug 14 '12 at 01:21
  • @JenniferDylan Since $n^{-k} \rightarrow 0$ as $n \rightarrow \infty$, an algorithm with worst-case running time $O(n^{-k})$ would asymptotically take $0$ time. – Andrew Aug 14 '12 at 01:25
  • @AndrewMacFie, yes, assuming $k > 0$. But do you have an example of such an algorithm? – dainichi Aug 14 '12 at 10:00
  • @Sam That came out wrong, I only meant that O(1) would not work, as for a function not to be an increasing function, it must be decreasing at some point ($f(x) = 1$ is still an increasing function). – Casey Kuball Aug 14 '12 at 14:06
  • @dainichi the algorithm that halts immediately on all inputs – Andrew Aug 14 '12 at 15:42

3 Answers


Is it the case that time complexity is always an increasing function of the input size? If so, why is it the case?

No. Consider a Turing machine that halts after $n$ steps when the input size $n$ is even, and halts after $n^2$ steps when $n$ is odd.
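To make the non-monotonicity concrete, here is a minimal Python sketch of that machine's step count (the function name `steps` is just for illustration):

```python
def steps(n):
    """Worst-case step count of the machine above:
    n steps when the input size n is even, n**2 when n is odd."""
    return n if n % 2 == 0 else n ** 2

# Larger input, fewer steps: the function is not monotone.
assert steps(99) == 9801
assert steps(100) == 100
```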

If you mean the complexity of a problem, the answer is still no. The complexity of primality testing is much smaller for even numbers than for odd numbers.
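For intuition, a trial-division sketch in Python (my illustration, not a claim about the best-known primality algorithms): every even input is settled in constant time, while odd inputs can require roughly $\sqrt{n}$ divisions.

```python
def is_prime(n):
    """Naive primality test: even inputs are answered immediately."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2      # constant time for every even input
    d = 3
    while d * d <= n:      # odd inputs: up to ~sqrt(n) trial divisions
        if n % d == 0:
            return False
        d += 2
    return True
```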

JeffE

Is it the case that time complexity is always an increasing function of the input size? If so, why is it the case? Is there a proof of that beyond hand-waving?

Let $n$ denote the input size. To read the entire input, a Turing machine already needs $n$ steps. So if you assume that an algorithm has to read its entire input (or $n/c$ of it, for some constant $c$), you will always end up with at least linear running time.
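As an illustration (a sketch of my own, assuming the input is a list of bits), computing the parity of an $n$-bit string is such a problem: flipping any single bit flips the answer, so a correct algorithm must read all $n$ bits.

```python
def parity(bits):
    """XOR of all input bits. Each bit can change the answer,
    so any correct algorithm needs Omega(n) time."""
    p = 0
    for b in bits:
        p ^= b
    return p

assert parity([1, 0, 1, 1]) == 1
```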


The problem with defining algorithms with a "monotonically decreasing run time function" is that you have to set the run time for $n = 1$ to some finite value. But there are infinitely many values $n > 1$, and a run time that stays a natural number can only decrease finitely often starting from that finite value, so you end up with a function that is constant for all but finitely many values.


Sublinear-time algorithms, which do not read the entire input, may be of interest to you. See for example http://www.dcs.warwick.ac.uk/~czumaj/PUBLICATIONS/DRAFTS/Sublinear-time-Survey-BEATCS.pdf.
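Binary search, mentioned in the comments below, is a classic example in the RAM model; a minimal Python sketch (assuming random access to a sorted list) that inspects only $O(\log n)$ of the $n$ elements:

```python
def binary_search(xs, target):
    """Return an index of target in the sorted list xs, or -1.
    Only O(log n) elements are ever inspected: sublinear time."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```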

Christopher
  • There exist sublinear algorithms. For example, see http://people.csail.mit.edu/ronitt/sublinear.html. It's a reasonably new field but it's very interesting.

    There are other counterexamples to this. Finding an element in a sorted list takes $O(\log n)$ time in the RAM model.

    I agree with the idea behind your post. It doesn't make sense to have an algorithm take less time as the input gets larger because it doesn't have time to read all of the input (how does it know to take less time?). But I don't know how to prove that they don't exist, and that a trick couldn't make it $o(1)$.

    – SamM Aug 14 '12 at 02:00
  • @Sam: Sorry, I did not see your comment before my edit (adding sublinear algorithms). – Christopher Aug 14 '12 at 02:06
  • quite the opposite; I didn't see your edit before adding my comment. I would delete it but the second half still applies and an additional link can't hurt the OP – SamM Aug 14 '12 at 02:15
  • 1
    a counterexample: a constant function like $f(x)=0$. What you describe works for functions that need to read their input. – Kaveh Aug 14 '12 at 02:31

The relation $(\mathbb{N},\leq)$ is well-founded, i.e. there are no infinite strictly decreasing sequences of natural numbers. Since (worst-case) runtime functions map into the naturals, every runtime function is in $\Omega(1)$, and no runtime function can keep decreasing forever: any non-increasing runtime function must eventually become constant.
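A short formalization of that consequence (my restatement): a non-increasing runtime function must eventually be constant.

```latex
% Claim: if T : \mathbb{N} \to \mathbb{N} is non-increasing,
% then T is eventually constant.
% Proof sketch: by well-foundedness of (\mathbb{N}, \leq), the image
% of T has a least element m, attained at some n_0. Then for all
% n \geq n_0:
\[
  m \;\le\; T(n) \;\le\; T(n_0) \;=\; m
  \quad\Longrightarrow\quad T(n) = m .
\]
```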

That said, average runtimes can contain oscillating components; Mergesort is an example.

Raphael