3

I'm currently reading Introduction to the Theory of Computation by Michael Sipser and in his section on time complexity, he tries to justify why theoretical computer scientists divide problems into P and NP.

First he says:

$P$ roughly corresponds to the class of problems that are realistically solvable on a computer.

He then continues a little further down the page:

Of course, a running time of $n^{100}$ is unlikely to be of any practical use. Nevertheless, calling polynomial time the threshold of practical solvability has proven to be useful. Once a polynomial time algorithm has been found for a problem that formerly appeared to require exponential time, some key insight into it has been gained and further reductions in its complexity usually follow, often to the point of actual practical utility.

So my question is the following: What are examples of problems that were shown to be in P but were initially intractable in practice, and that became tractable after further theoretical advances?

I'd also welcome examples of problems in P whose running time was significantly improved by algorithmic advances, even if the improvement wasn't enough to make them tractable.

user35734
    Duplicate? This one is also related. Bottom line: the first statement by Sipser is probably plain wrong. – Raphael Nov 01 '15 at 18:53
  • I disagree with Sipser. It's a false belief. Indeed, the time hierarchy theorem ensures that there are problems in P that can't be solved in say $O(n^{100})$. – Yuval Filmus Nov 01 '15 at 19:43
    I'm not sure it's worth arguing about what's obviously intended to be an informal, introductory statement that includes the word "roughly". – David Richerby Nov 01 '15 at 21:18
  • I'm not really arguing with Sipser. I'm just looking for examples of problems of the type he describes in the second excerpt. – user35734 Nov 01 '15 at 22:30
  • Is that not what I did a few hours ago? I have one question (the second one is only really a variant on that one) and it doesn't have to do with Raphael's links. Should I remove some of my quotes? – user35734 Nov 02 '15 at 00:22
  • An additional concern with this question is that it is a "list question": it asks for a list of problems that satisfy some conditions. These questions tend not to work so well on this site's format because they tend to accumulate multiple answers, all equally valid, and there's no clear way to pick a single objectively correct answer -- so it just degenerates to an opinion poll. See, e.g., http://meta.cs.stackexchange.com/q/20/755, http://meta.cs.stackexchange.com/q/487/755. – D.W. Nov 02 '15 at 16:57
  • Anyway, deterministic primality testing (such as the AKS test) is another example of your second situation: the algorithm's running time was initially $O(n^{12})$, but was subsequently reduced to $O(n^6)$. I'm not posting this as an answer because I don't want to contribute to the problems outlined in my previous comment. – D.W. Nov 02 '15 at 16:59
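For context on why primality testing is a good example here: the input size is the number of digits $n$ of the number $N$ being tested, so naive trial division, which takes roughly $\sqrt{N}$ steps, is exponential in $n$, whereas AKS runs in time polynomial in $n$. A minimal illustrative sketch of the naive test (not AKS or any of the cited algorithms):

```python
def is_prime_trial_division(N: int) -> bool:
    """Trial division: about sqrt(N) divisions for input N, which is
    exponential in the bit-length n of N. AKS, by contrast, runs in
    time polynomial in n."""
    if N < 2:
        return False
    d = 2
    while d * d <= N:
        if N % d == 0:
            return False
        d += 1
    return True
```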

1 Answer

2

One example where the running time improved significantly but is still not necessarily practical is submodular function minimization, which went down from $O(n^7)$ (Grötschel, Lovász and Schrijver, 1981) to $\tilde{O}(n^4)$ (Lee, Sidford and Wong, 2015). The latter paper (and the ones preceding it) contains many such results.
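For concreteness, submodular function minimization asks for a subset $S \subseteq V$ minimizing $f(S)$, where $f$ satisfies $f(A) + f(B) \ge f(A \cup B) + f(A \cap B)$. A brute-force sketch, purely to illustrate the problem being solved (exponential time, unlike the polynomial-time algorithms cited above; the graph-cut function used as the example $f$ is my own choice, cut functions being a standard submodular family):

```python
from itertools import chain, combinations

def cut_function(edges, s):
    """Number of edges crossing between s and its complement.
    Graph cut functions are a classic example of submodular functions."""
    s = set(s)
    return sum(1 for (u, v) in edges if (u in s) != (v in s))

def brute_force_sfm(n, f):
    """Minimize a set function f over all 2^n subsets of {0, ..., n-1}.
    Exponential time -- only meant to show what is being minimized."""
    ground = range(n)
    subsets = chain.from_iterable(combinations(ground, k) for k in range(n + 1))
    return min(subsets, key=f)

# Example: a 4-cycle; the empty set (or the full set) gives cut value 0.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
best = brute_force_sfm(4, lambda s: cut_function(edges, s))
```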

Yuval Filmus