
Say I have some problem of $O\left(n^k\right)$ complexity.

If I were to solve the problem on a computer $x$, it would take time $t$.

Now I have a new computer $x'$, which has double the computing power of $x$.

How long would it take $x'$ to solve the same problem in terms of $t$?

JibbyJames
  • What is your definition of polynomial time, $n$, $k$, etc.? One definition is the number of tape head movements for a single-tape, deterministic Turing Machine (which is one definition of the class $\mathcal{P}$). – Ryan Dougherty Mar 19 '15 at 01:12
  • @Ryan I did not realise there were multiple definitions of polynomial time. I would assume it's the one defined for class P. Thanks. – JibbyJames Mar 19 '15 at 01:16
  • @Ryan A polynomial number of operations of machine $x$. It doesn't really matter what those operations are, assuming that $x'$ has the same basic operations and does them twice as fast. – David Richerby Mar 19 '15 at 01:17
  • You want to use $\Theta$. With only $O$ you know nothing. Note furthermore that "computing power" as a one-dimensional parameter is not a real thing. (For example, if your CPU is twice as fast but memory is slower, most algorithms will not experience a speed-up by a factor of two, if any.) – Raphael Mar 19 '15 at 07:02
  • Imho it doesn't really matter what the complexity of the problem is. If a computer takes time $t$, another computer that's twice as fast takes time $0.5t$. Isn't that obvious? – Albert Hendriks Mar 19 '15 at 13:33
  • @AlbertHendriks I assumed that would only be true for linear-time problems. I thought the difference would be far less significant for polynomial time, and even less as $k$ increases; I'm just not sure of the correct formula to calculate this. – JibbyJames Mar 19 '15 at 16:55
  • What does "double computing power" mean? The discussion around this question has really surprised me. No matter what the asymptotic complexity of the algorithm used to solve a problem, there is a machine that solves the problem at hand in time $t$. Now, we have another machine that does every operation in half the time (if that is what double computing power means). So the second machine will finish in half the time. What have I missed? – megas Mar 19 '15 at 18:40

2 Answers


Comment: This actually answers a much more interesting question:

Suppose algorithm $A$ runs in time $O(n^k)$ on inputs of length $n$. If we double the input length, how long would it take $A$ to run on the new input, in terms of its running time on the original input?

The answer you were supposed to give is:

$2^kT$, where $T$ is the running time on the original input and $k$ is the exponent in the running time $O(n^k)$.

The idea is that if your running time is exactly $T(n) = Cn^k$ then $$ T(2n) = C(2n)^k = 2^k(Cn^k) = 2^kT(n). $$
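As a rough sanity check of that relation, here is a minimal sketch (not part of the original answer; the operation count $Cn^k$ is an idealised stand-in for the running time):

```python
# Toy check of T(2n) = 2^k * T(n) for an idealised routine whose
# running time is exactly C * n^k "operations" (hypothetical model).

def op_count(n: int, k: int, C: int = 1) -> int:
    """Operation count of an idealised O(n^k) routine."""
    return C * n ** k

for k in (1, 2, 3):
    n = 1000
    print(f"k={k}: T(2n)/T(n) = {op_count(2 * n, k) / op_count(n, k)}")
    # prints 2.0, 4.0, 8.0 -- i.e. 2**k
```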

However, the running time need not be exactly $Cn^k$, and if it isn't, you can't really tell what the exact relation is between $T(2n)$ and $T(n)$. You can come up with pathological examples like $$ T(n) = \begin{cases} 1 & \text{if $n$ is odd}, \\ n & \text{if $n$ is even}. \end{cases} $$ If $n$ is odd then $T(2n) = 2nT(n)$, and the factor $2n$ isn't bounded; yet $T(n) = O(n)$ has polynomial growth.

If $T(n) = \Theta(n^k)$ then we can say that $T(2n) = \Theta(T(n))$, but it could still be that, say, $T(2n) < T(n)$, as in this example: $$ T(n) = \begin{cases} 100n & \text{if $n$ is odd}, \\ n & \text{if $n$ is even}. \end{cases} $$ If $n$ is odd then $T(2n) = T(n)/50$.
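The two piecewise functions above are easy to probe numerically; the following sketch (purely illustrative, not the answerer's) evaluates them at a few odd values of $n$:

```python
# The two pathological running times from the answer, evaluated at odd n:
# the first is O(n), the second is Theta(n), yet T(2n)/T(n) behaves very differently.

def T1(n: int) -> int:
    return 1 if n % 2 else n          # T(n) = 1 (n odd), n (n even)

def T2(n: int) -> int:
    return 100 * n if n % 2 else n    # T(n) = 100n (n odd), n (n even)

for n in (5, 51, 501):
    print(n, T1(2 * n) / T1(n), T2(2 * n) / T2(n))
    # first ratio equals 2n (unbounded), second is always 1/50 = 0.02
```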

Yuval Filmus
  • I don't see why we increase instance sizes; if we solve the same problem at twice the speed, it takes us half as long. Granted, that's a really boring question, but I actually don't see how the original question is related to complexity. – G. Bach Mar 20 '15 at 01:02
  • @G.Bach I might have misread the question. Usually these questions ask for what my answer refers to. Otherwise, as you say, the nature of the original running time doesn't come into play at all. – Yuval Filmus Mar 20 '15 at 01:04
  • I think what we have in mind is the question, "given twice the processing power, how much larger are the instances I can solve in fixed time $T$?". The question as stated seems to be trivial indeed. – Raphael Mar 20 '15 at 06:56

It will take half the time, of course: $t'=t/2$, and the asymptotic complexity remains $O(n^k)$!

  • I guess that the OP was expecting an answer like $t/\sqrt[k]2$, indeed irrelevant. –  Mar 20 '15 at 13:21
  • I think I should have asked what Yuval edited my question to be. – JibbyJames Mar 20 '15 at 13:56
  • I guess the question should rather have been: "if the computing power doubles, how many elements can be processed in the same time", the (empirical) answer being $\sqrt[k]{2}\,n$. –  Mar 20 '15 at 14:00
  • That is actually a far better formula for explaining what I am writing about. Thank you. – JibbyJames Mar 20 '15 at 14:27
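For completeness, the $\sqrt[k]{2}\,n$ figure mentioned in the last two comments follows from a one-line calculation, assuming the running time is exactly $Cn^k$ and that "double the computing power" means twice as many operations per unit time: $$ C(n')^k = 2Cn^k \;\Longrightarrow\; n' = 2^{1/k}\,n = \sqrt[k]{2}\,n. $$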