
Often when I'm doing an operation comparing an array to itself, I write code along these lines:

function operation (array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = i + 1; j < array.length; j++) {
      // do something with array, i, and j
    }
  }
}

While this code (starting j at i + 1) clearly runs faster than a full double-nested loop, I'm struggling to figure out whether that's just a matter of a smaller coefficient or whether it should be represented by a different big O.

Is there a better big O for the above function than n squared?

Robin
  • @gnat I'm checking that out right now to see if it touches on this type of case – Robin Aug 07 '20 at 19:26
  • @JohnWu hmm, I'm familiar with O(n log n) in the context of sorting algorithms, I suspected that might be the complexity of this function... – Robin Aug 07 '20 at 19:28
  • It doesn't make sense to talk about algorithmic complexity without a cost model and a machine model. Or, in other words: it doesn't make sense to count things if you don't define what the things are that you are counting. – Jörg W Mittag Aug 07 '20 at 20:14
  • @JörgWMittag: you're putting the bar unnecessarily high. Whatever OP is doing is going to be at least proportional to the number of loop iterations. Possibly more, but it still makes sense to talk about that first. – Michael Borgwardt Aug 07 '20 at 20:33
  • @MichaelBorgwardt: Depending on the machine model, array.length might have a worst-case step complexity of Θ(#elements) pointer dereferences, for example. So, if we are counting pointer dereferences, and are in such a machine model, then the whole thing is at least O(n³). – Jörg W Mittag Aug 07 '20 at 21:01
  • @JohnWu No, it is still n^2. It only becomes n log n when you recurse on a fraction and not a subtraction. – Thorbjørn Ravn Andersen Aug 08 '20 at 14:41
  • You are showing nothing in the loop that depends on or takes advantage of being sorted. It is just ordinary loop nesting: n * n/2 ==> O(n^2). – Erik Eidt Aug 08 '20 at 15:20

2 Answers

8

It's still O(n^2), with a constant coefficient of 1/2.

The total number of times the inner loop body executes is (n-1) + (n-2) + (n-3) + ... + 3 + 2 + 1, where n is the array length. Note that the first and last terms add up to n, as do the second and second-to-last, and so on. So there are roughly n/2 such pairs, each summing to n, which makes a total of about n/2 * n = 1/2 * n^2.
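
As a quick sanity check (a small sketch of my own, not part of the original answer), you can count the inner-loop iterations directly and compare them with the closed form n * (n - 1) / 2:

function countIterations (n) {
  // Same loop structure as in the question, but only counting iterations.
  let count = 0;
  for (let i = 0; i < n; i++) {
    for (let j = i + 1; j < n; j++) {
      count++;
    }
  }
  return count;
}

for (const n of [10, 100, 1000]) {
  console.log(n, countIterations(n), n * (n - 1) / 2);
}

For n = 1000 both columns come out to 499500, i.e. roughly 0.5 * n^2, so starting j at i + 1 changes the constant factor but not the quadratic growth.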

This is actually the idea famously used by an elementary-school-aged Carl Friedrich Gauss circa 1785 to outwit his teacher on a make-work assignment.

  • Your first-term-plus-last-term line of thinking makes the 0.5 coefficient obvious; this is great – Robin Aug 07 '20 at 19:58
  • I don't know software.SE as well as I do SO and I suspect I'm off topic here; do you suggest I delete my own question? – Robin Aug 07 '20 at 20:00
  • @RobbieWxyz: Don't worry about the question. And the problem is actually a very famous one, see my edit. – Michael Borgwardt Aug 07 '20 at 20:43
0

Of course, if you know that the data is sorted, you would most certainly use a more intelligent algorithm, such as "binary search."

If you "brute-force loop through the array, without any consideration of its known properties," then of course you richly deserve your fate.