6

P.S. I have added the tag 'history', in case there is any historical connotation.

Also, I found this question, What is running time of an algorithm?, but I am not satisfied with the answers.

Ubi.B
  • 220
  • 2
  • 16
  • 2
    I'm not sure if this question can be answered objectively. I'd guess that the reference to time can be explained by the fact that we'd like to know the precise time, but unfortunately cannot reason with it properly, so we simply take the best predictor we know for execution time that can be reasoned with: the execution step count. But my guess is as good as any. I doubt this can be decisively answered unless a clear historical motivation is found. – Discrete lizard Jan 13 '18 at 12:39
  • That is true. So far, I have understood that they (analysts) avoided the unwanted complexity involved in calculating the actual time for an algorithm (or, say, a program) to execute, like internal process calls etc., and instead figured out the dominating function. But why call it time complexity? That creates a lot of confusion in the minds of freshers in the field. – Ubi.B Jan 13 '18 at 12:58
  • 4
    You can also ask, why "complexity" when we're actually interested in "cost"? – Raphael Jan 13 '18 at 14:02
  • No, I am actually interested in the terminology of 'Time Complexity'. Actually, I thought of tagging you in the post. Thanks for noticing. – Ubi.B Jan 13 '18 at 14:30
  • We don't actually study the program; running an actual program involves a lot of operations. I have explicitly mentioned that part below. We study the algorithm, which is in fact translated into mathematical functions. So I don't see that 'steps' or 'operations' would be an eligible term to use. In fact, we see that standard books later explain along the same lines, i.e., that we are concerned with the steps involved. – Ubi.B Jan 13 '18 at 14:43
  • If I say it takes me one hour to drive to work, is that somehow less of a statement of the time it takes than if I say it takes me 3643.9 seconds to drive there? For the purposes of algorithmic analysis, using "step" as a vague but mostly constant unit of time is perfectly reasonable. You're being too precise about what you think "time" means here. – chepner Jan 13 '18 at 15:56
  • The analogy that you gave is irrelevant. Also, 'step' is not a vague word here; in fact, it is on point and pertinent to the requirement of understanding the concept of analysis. I guess you missed the point; please read it again. I am not trying to be precise, but trying to make it clearer that the concept of 'time' is irrelevant here. – Ubi.B Jan 13 '18 at 16:06
  • 1
    @UKB I don't think anyone's claiming that "step" is vague. People are complaining that "step complexity" is vague because it could mean "complexity measured by number of steps" or "complexity of an individual step" (as in, "there are only three steps in this algorithm, but one of them is very complex"). – David Richerby Jan 13 '18 at 16:20
  • So, do you have a problem that "Time Complexity" sounds like it's more about the "Complexity of Time" than anything else? Like how "Material Density" is the "Density of Material"? ... 'Cause if so, I think I understand your problem? – Malady Jan 13 '18 at 23:50
  • Why did you delete almost all of your question? – Discrete lizard Jan 15 '18 at 09:01

5 Answers

11

Perhaps the earliest place in which time complexity appears is On the computational complexity of algorithms by Hartmanis and Stearns. Their goal is to study computational complexity, which they define as follows:

The computational complexity of a sequence is to be measured by how fast a multitape Turing machine can print out the terms of the sequence.

Their first section, in which they prove (among other things) a time-hierarchy theorem, is about "time-limited computations". They explicitly mention their concept of time:

The machine operation is our basic unit of time.

The reference is to a multitape Turing machine, which they diligently define.

The intention here is to model the running time of algorithms using abstract machines, using number of steps as a proxy for time. Anticipating your criticism, they mention:

Furthermore, the [complexity] classes are independent of time scale or of the speed of the components from which the machines could be built, as there is a "speed-up" theorem which states that $S_T = S_{kT}$ [i.e., $\mathsf{TIME}(T(n))$ = $\mathsf{TIME}(kT(n))$] for positive numbers $k$.

That is, multitape Turing machines can always be sped up by an arbitrary constant, and so there is no harm in associating number of steps with running time, since time complexity classes are "scale free".

Yuval Filmus
  • 276,994
  • 27
  • 311
  • 503
  • Nice! I really appreciate your answer; so far, it has the most relevance. I was miffed by the confusion this creates for freshers or, to some extent, intermediate-level CS students. The literal first impression is that we want to find actual time. But again, I would like to say that I don't need a physical machine to run an algorithm. In fact, you can't run an algorithm (pseudo-code) on a physical machine until we translate it into some programming language. We work on a piece of paper and suppose input = n (max). So where is time involved? Time comes into the picture if the program is run on a machine. – Ubi.B Jan 13 '18 at 16:20
  • 2
    The program does run on a machine - an abstract machine like a RAM machine. The RAM machine is a good model for CPUs, and algorithms can typically be converted from a textual description based on the RAM machine model to C programs without loss of efficiency. – Yuval Filmus Jan 13 '18 at 16:35
  • Yes, I agree that we don't lose any efficiency if translated to C or any other program. But don't you think there should be a difference between the mathematical and the CS points of view? Because if I, as a CS student, design an algorithm to calculate the factorial and show it to you, you will say it has complexity $n!$, that's it. $n!$ is a mathematical function, and I don't see the relation of time with these mathematical functions ($n$, $n^2$, $n!$, etc.). – Ubi.B Jan 13 '18 at 16:48
  • 2
    Computer science is a mathematical area. I don't see any need to artificially distinguish between the two. It seems like you have some misconceptions you have to overcome. – Yuval Filmus Jan 13 '18 at 18:08
  • 3
    This discussion seems to be losing focus. You've made your talking point. Personally, I disagree with it, but ultimately it's a subjective matter. You're entitled to your opinion, but don't expect to convince me. – Yuval Filmus Jan 13 '18 at 18:17
  • @UKB In computer science, as in any other STE field, we create mathematical models of things in order to reason (more or less) rigorously about them. One key goal is to apply already known solutions or solution techniques to new problems. The two keywords here are abstraction and reduction, two key concepts of CS (imho). So yes, mathematics are important to CS. – Raphael Jan 14 '18 at 08:16
3

I think, though I don't have any references to back this up, that it's just a convenient name that has a ring of truth to it.

If you imagine implementing a standard Turing machine, it does seem reasonable that every step of your actual, physical machine will take the same amount of time. So, for a Turing machine, time and number of steps are the same thing, up to a constant factor. That isn't true for more complicated machines – for example, a single step of a RAM can access an arbitrary memory address or operate on arbitrarily large numbers – but the analogy is good enough.

All names seem to have disadvantages. "Time complexity" sounds like it's measured in seconds. Something like "step complexity", "operation complexity" or "instruction complexity" might be misunderstood as referring to the complexity of the individual steps, rather than to complexity measured by the number of steps. But these aren't big disadvantages: it only takes a moment to explain that "time" doesn't literally mean time or that "step complexity" doesn't mean the complexity of the steps.

If we were starting again from scratch, I think either "time complexity" or "step complexity" would be a reasonable name. I can't think of any other term that's both reasonably short and more accurately conveys the concept of complexity measured in terms of the number of computational steps.

And be thankful that we didn't call it "type 1" complexity. I'm looking at you, statistics and diabetes.

David Richerby
  • 81,689
  • 26
  • 141
  • 235
  • I'd rather say that 'Time complexity' creates more confusion. I have seen people in real life and online (mainly freshers) asking for an actual way of calculating time, and other people answering by saying to keep a start time counter and an end time counter and measure the time (I'm not joking). The majority of standard textbooks talk about the steps (operations) we are concerned with, and it becomes very clear when we understand that we are actually counting the steps of the mathematical function involved. So we don't have to worry about internal operations, because we can work it out on a piece of paper. – Ubi.B Jan 13 '18 at 14:53
  • Also, I am not hypothesizing anything. I am interested in the terminology rather than in proving something. – Ubi.B Jan 13 '18 at 15:02
  • @UKB The references to statistics and diabetes are pointing to some completely terrible naming conventions. Statisticians call false positives and false negatives type-1 and type-2 errors; clinicians call diabetes caused by losing the ability to produce insulin and diabetes caused by losing the ability to respond to insulin type-1 and type-2 diabetes. These are terrible names because the name does nothing at all to remind you of its meaning. At least "time complexity" and "space complexity" remind you what they mean when you say them. – David Richerby Jan 13 '18 at 15:10
  • I don't see the same problem here. While studying time complexity, we place more emphasis on mathematical functions, then discard the insignificant terms and stress the steps or operations involved in the most significant term. I don't relate to that thought. – Ubi.B Jan 13 '18 at 15:21
  • The same problem as what? – David Richerby Jan 13 '18 at 16:20
  • So, er, would any of the downvoters care to explain. Looks like an OK answer to me -- sure, it's not screaming "bow down before my awesomeness and upvote me!" but three downvotes? – David Richerby Jan 13 '18 at 20:43
2

Time complexity is a formal model (an abstraction) of program running time. Although on the face of it you are right that it really measures the number of steps, it is asymptotically no different from the actual running time of the machine (Turing machine or any other model of computation). Therefore I disagree that there is any problem with the terminology.

Think about it from the programmer's perspective. When you write a piece of code, say

    for i in range(1, n + 1):
        for j in range(1, i + 1):
            print(j, end=" ")
        print()

you can't (as a programmer) actually predict how long the program will take to run in seconds, with accuracy. Moreover the number of seconds depends on the exact platform on which you run the code, level of parallelization, what file or output you are printing to, etc.

But what you can measure is the number of steps your code runs -- that is, its time complexity, as a function of $n$. You simply count the number of times a print statement is executed. This is -- up to a constant -- a good and correct estimate of the actual time the program will take to run, in seconds.
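To make the counting concrete, here is a minimal sketch (the helper name count_steps is mine, not from the answer) that tallies how many times a print executes in the loop above:

```python
def count_steps(n):
    # Count how many times a print statement executes in the
    # nested loop: one per inner print, plus one newline per row.
    steps = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            steps += 1  # inner print of j
        steps += 1      # newline print
    return steps

print(count_steps(10))  # 65: 10*11/2 inner prints plus 10 newlines
```

The count, $n(n+1)/2 + n$, grows quadratically in $n$, and that growth rate holds no matter how many seconds each print actually takes.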

In summary, the concept of time complexity is exactly the same concept that programmers use to think about their code's performance, and the abstraction is the same as the actual running time up to a constant.

Caleb Stanford
  • 7,068
  • 2
  • 28
  • 50
0

From what I can tell, your appeal to the terms "step complexity" and "operation complexity" is a way of attaining a more objective definition of an algorithm's behavior, without tying it down to the actual execution time of the algorithm on an arbitrary machine.

However, "step" and "operation" are not so objective, either. Different CPUs can have vastly different instruction sets. For example, in the x86 instruction set, MOV EAX, [EDX + EBX*4 + 8] is a single instruction, despite it performing 1 multiplication, 2 additions, and 1 memory load.

Perhaps you could say "okay, well then we'll count it as 4 operations." Well, that gets really complicated, too. These could very well be done in one shot by hardware, or perhaps as a decomposition of simpler instructions orchestrated by the processor's microcode. Not only is it not defined which of these two approaches a processor uses, but the notion of these "simpler instructions" is also blurred. Who's to say multiplication is a single instruction? It could be implemented as a sequence of additions, for all you know (it's not, don't worry, but the point stands).
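As an aside illustrating that last point, multiplication really can be expressed as repeated addition. This toy sketch (illustrative only, not how real hardware multiplies) shows a single "operation" hiding many simpler steps:

```python
def mul_by_addition(a, b):
    # Multiply a by a non-negative integer b using only addition,
    # taking b "addition steps" instead of one "multiply step".
    result = 0
    for _ in range(b):
        result += a
    return result

print(mul_by_addition(7, 6))  # 42, reached after 6 additions
```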

Anyhow, talking about "time complexity" makes sense because we don't really care about steps or operations in themselves; what we really care about is how steps or operations can be used as a proxy for time. E.g., we don't care that one algorithm takes 10 steps and another takes 20 if the first algorithm takes 10 seconds and the other takes 2 seconds.

Alexander
  • 516
  • 2
  • 7
  • My explicit concern is this: "...I just want to understand what the role of 'Time' is in 'Time Complexity' and why we call it 'Time Complexity'...". Secondly, I guess you missed the core point, because we don't care about CPU operations. I covered that part by saying I am assuming the RAM model. Can you tell me what concerns you more while finding element 'x' in a sorted array: 1) the amount of time taken by the program, or 2) the number of steps involved? – Ubi.B Jan 13 '18 at 17:03
  • @UKB "..I just want to understand that what is the role of 'Time' in 'Time Complexity' and why we call it 'Time Complexity’..." Time does not determine time complexity, but we use time complexity to estimate time, in relative terms. – Alexander Jan 13 '18 at 17:11
  • @UKB I don't see what your assumption of the RAM model is meant to imply. If it's to standardize our definition of a "step" or "operation", then it's really rather arbitrary, and it's really just privileging one processor design over the others, arbitrarily. Why didn't you pick UTM instead? Or X86? or ARM? – Alexander Jan 13 '18 at 17:13
  • @UKB "Can you tell me, what concerns you more while finding element 'x' in sorted array, 1) amount of time taken by program or 2) number of steps involve?" Definitely 1, the amount taken. We care about the number of steps only insofar as it serves as a proxy to predict time. – Alexander Jan 13 '18 at 17:14
  • How do you predict time? That would be more helpful for further understanding. – Ubi.B Jan 13 '18 at 18:06
  • @UKB Well, you can only predict time when discussing a specific CPU architecture, for which you can look up the latency/throughput of each instruction. "Steps"/"operations" simplify this by assuming that every instruction takes the same time and has no latency. That's simply not the case, but we use it to approximate the time an algorithm might take, which is usually sufficient. Furthermore, it can be useful to discuss the relative time taken by 2 algorithms on the same platform. In that case, we don't even have to know the exact instruction timings/latencies. – Alexander Jan 13 '18 at 18:15
  • @UKB "Fewer instructions -> faster" is a heuristic, just like "lighter car -> faster car". It's generally true, but there are heavy cars with big engines that can be faster than light cars with weaker engines. Similarly, there are RISC processors that might execute more instructions to perform the same algorithm as a competing CISC processor, but might complete it faster regardless. – Alexander Jan 13 '18 at 18:19
  • I appreciate your efforts, but it seems that your standpoint is out of focus. TOC is not concerned with the internal workings of computer organization. – Ubi.B Jan 13 '18 at 18:25
0

In practice, an actual machine does not spend the same amount of time on each operation as any other machine does. Hence what we measure is a general trend of growth.

Consider addition on a machine nowadays.

Consider addition on a mechanical computer.

It is not unreasonable to use both as models to measure the length of time an algorithm takes to complete. However, the issue is that the two give wildly different exact amounts. All we can reason about is how changing the input changes the runtime of the algorithm.

A better way to think of it is that runtime is a function in which each step is assumed to take one unit of time; time complexity is then the result of analyzing that function. This is precisely because the actual step times will vary from machine to machine.

I.e., give the abstract machine abstract times to complete each step, so that the analysis applies to any such machine implemented in the real world.
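A small sketch of that idea (the quadratic step count and the per-step costs below are made up for illustration): the exact runtimes differ between machines, but the growth trend is identical on both.

```python
def steps(n):
    # Step count of a hypothetical quadratic algorithm.
    return n * n

# Two machines with very different per-step costs: the exact runtimes
# differ by orders of magnitude, but doubling the input quadruples
# the runtime on both, so the growth trend is machine-independent.
for seconds_per_step in (1e-9, 1e-3):
    runtime_small = steps(100) * seconds_per_step
    runtime_large = steps(200) * seconds_per_step
    assert runtime_large / runtime_small == 4.0
```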

user64742
  • 109
  • 4