
I'm trying to determine the big-O time complexity suggested by the following data set, where the first column is the input size and the second column is the execution time in seconds. Where possible, I should determine the exponent of the dominant term and provide an estimate for its coefficient.


I made the following graph of the data, along with a log/log graph, to try to figure it out. At first I thought it might be 1/n or something along those lines, but as n gets quite large, the execution time increases again instead of asymptotically approaching a value.

The log/log graph has such a distinct pattern that I believe the answer lies there, but I'm stuck on where to go from here. Any help would be appreciated.
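For reference, here is a minimal sketch of the log/log fitting idea described above: if the running time behaves like c·n^k, then log(time) = log(c) + k·log(n), so a straight-line fit on the log/log data yields the exponent as the slope and the coefficient from the intercept. The numbers below are hypothetical placeholders, since the actual measurements are not reproduced here.

```python
# Minimal sketch of estimating the dominant-term exponent and coefficient
# from a log/log fit. The data below is hypothetical; replace `sizes` and
# `times` with the real measurements.
import numpy as np

sizes = np.array([1e3, 1e4, 1e5, 1e6, 1e7])        # hypothetical input sizes n
times = np.array([0.002, 0.021, 0.23, 2.4, 25.0])  # hypothetical run times (s)

# If time ~ c * n^k, then log(time) = log(c) + k * log(n),
# so a degree-1 fit on the log/log data gives k as the slope
# and c from the intercept.
slope, intercept = np.polyfit(np.log(sizes), np.log(times), 1)
print(f"estimated exponent k ~= {slope:.2f}")
print(f"estimated coefficient c ~= {np.exp(intercept):.2e}")
```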

felicef885
  • Hi, welcome to CS.SE :) Some notes: (Asymptotic) running time is a property of an algorithm, not a "data set". Also, can you explain why the execution time decreases with growing input size? This is very uncommon. A bit more information would probably help a lot here. – Watercrystal Sep 30 '20 at 00:48
  • With the existing information, this looks more like a question of https://en.wikipedia.org/wiki/Approximation_theory – zkutch Sep 30 '20 at 00:54
  • Hi, interesting question. I'm just curious: why does input size 1 have the largest execution time in your dataset? – kate Sep 30 '20 at 02:28

1 Answer


You can't. You can't infer big-O complexity from a finite set of data points, as big-O complexity is about asymptotics. See, e.g., How to fool the plot inspection heuristic?.
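As a small illustration of this point (a hypothetical sketch, not part of the original answer): two functions can agree on every measured point and still have completely different asymptotic growth, so no finite table of timings can pin down a big-O class.

```python
# Two functions that agree exactly on a finite set of measured sizes
# but differ asymptotically. The sample sizes are hypothetical.
measured_n = [10, 100, 1_000, 10_000]

def f(n):
    return n ** 2                       # Theta(n^2) everywhere

def g(n):
    # agrees with f on the measured range, but exponential beyond it
    return n ** 2 if n <= 10_000 else 2 ** n

print(all(f(n) == g(n) for n in measured_n))  # True
# f is O(n^2); g is not, yet the measurements cannot tell them apart.
```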

D.W.