
How does one easily determine whether an algorithm has exponential time complexity? The naive Word Break solution is known to have a time complexity of O(2^n), but many people assume it's O(n!) because of the shrinking subsets as you go down the stack depth.

Word Break Problem: Given an input string and a dictionary of words, find out if the input string can be segmented into a space-separated sequence of dictionary words.

The brute-force/naive solution to this is:

def wordBreak(s, wordDict):
    # Scan start indices breadth-first. Iterating over a list while
    # appending to it is deliberate: newly discovered start indices
    # are visited in turn. There is no visited set, so the same index
    # can be re-explored many times -- the source of the blow-up.
    queue = [0]
    dictionary_set = set(wordDict)
    for left in queue:
        for right in range(left, len(s)):
            if s[left:right + 1] in dictionary_set:
                if right == len(s) - 1:
                    return True
                queue.append(right + 1)
    return False

With the worst case inputs of:

s = 'aab'
wordDict = ['a','aa','aaa']

Running the algorithm on this input performs 2^3 - 1 = 7 substring checks, consistent with O(2^n).
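To see the growth empirically, here is an instrumented variant of the same scan (a sketch; `word_break_counted` is a name I made up) that counts substring checks on a worst-case family of inputs: every run of `'a'` up to length n-1 is a dictionary word, and the final `'b'` never matches, so every start index gets re-explored:

```python
def word_break_counted(s, word_dict):
    """Same brute-force scan as above, instrumented to count substring checks."""
    checks = 0
    queue = [0]
    dictionary_set = set(word_dict)
    for left in queue:
        for right in range(left, len(s)):
            checks += 1
            if s[left:right + 1] in dictionary_set:
                if right == len(s) - 1:
                    return True, checks
                queue.append(right + 1)
    return False, checks

# Worst case: s = 'a' * (n - 1) + 'b', dictionary = all 'a'-runs shorter than n.
for n in range(2, 7):
    s = 'a' * (n - 1) + 'b'
    words = ['a' * k for k in range(1, n)]
    print(n, word_break_counted(s, words)[1])  # 3, 7, 15, 31, 63 == 2**n - 1
```

The counts double (plus one) each time the string grows by one character, which is exactly the 2^n signature.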

Let me caveat this question with:

  1. I'm not looking for a performant solution, so please ignore why the implementation is the way it is. There is a known O(n^2) dynamic programming solution.
  2. I know there are similar questions asked on stackexchange, but none of them gave an easy to understand answer to this problem. I would like a clear answer with examples using preferably the Word Break example.
  3. I can definitely figure out the complexity by counting every operation in the code, or by using recurrence relations (substitution or recursion-tree methods), but what I'm aiming for in this question is: how do you know, just by looking at this algorithm, that it has O(2^n) time complexity? I want an approximation heuristic, similar to looking at nested for loops and knowing the complexity is O(n^2). Is there some mathematical theory or pattern with decreasing subsets that answers this easily?
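One pattern that makes the 2^n visible at a glance (my own sketch, not part of the question): a string of length n has n-1 gaps between characters, and every segmentation corresponds to choosing a subset of those gaps as cut points, so a worst-case input where every piece is a dictionary word forces the search to explore all 2^(n-1) segmentations. A short enumeration (hypothetical helper name `all_segmentations`) confirms the count:

```python
from itertools import combinations

def all_segmentations(s):
    """Enumerate every split of s: one per subset of the n - 1 gaps."""
    n = len(s)
    result = []
    for k in range(n):  # choose k cut points among the n - 1 gaps
        for cuts in combinations(range(1, n), k):
            points = (0, *cuts, n)
            result.append([s[points[i]:points[i + 1]]
                           for i in range(len(points) - 1)])
    return result

print(len(all_segmentations('aaa')))   # 4 == 2 ** (3 - 1)
print(len(all_segmentations('aaaa')))  # 8 == 2 ** (4 - 1)
```

Whenever an algorithm's branching mirrors "include or exclude each of n positions", a 2^n worst case is the natural guess.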
  • "how do you know when you look at this algorithm quickly?" - Often you don't; it may take an involved calculation. Sometimes with practice you can get quicker at it. I'm not sure you're going to get a simple answer. Have you tried applying the standard techniques for analyzing running time to see what happens? – D.W. Sep 27 '17 at 01:47
  • "easily", "heuristically" -- what do these words mean to you, exactly? – Raphael Sep 28 '17 at 16:51
  • There are two basic ways: formal analysis and experiments. Automated methods are doomed to be incomplete. – Raphael Sep 28 '17 at 16:54
  • @Raphael, in that way experiments are also doomed, since you can (hypothetically) program a TM that will do it for you. – rus9384 Sep 28 '17 at 18:33
  • Proofs of worst-case complexity often are not easy. For example, the exponential lower bound for the simplex method was proven long after its introduction. – rus9384 Sep 28 '17 at 18:38
  • @rus9384 Yes, experiments can never prove an asymptotic running-time bound. They can, however, verify guesses and/or check behaviour for inputs of interest. – Raphael Sep 29 '17 at 04:49
  • A quick heuristic: if the algorithm is correct and it solves an NP-complete problem then it is surely not a polynomial time algorithm (or at least it's very very unlikely that it runs in polynomial time :-). – Vor Sep 29 '17 at 07:17
  • @Raphael What I mean by "easily" and "heuristically" could have been phrased better as "an educated guess", "an intuitive judgment", or "a mental shortcut". An example of this is looking at a single loop that goes through n items and seeing that there are 5 constant operations inside the loop: you can easily derive that it has O(n) time complexity. – perseverance Sep 29 '17 at 19:24
  • @perseverance That's called (accurate) intuition. The only way I know to gain it is through working rigorously through analyses for as long as it takes. (Some never get there.) – Raphael Sep 29 '17 at 20:05

0 Answers