
I'm interested in the memory usage of various programming languages when implemented on actual hardware.

I believe that a Turing-complete programming language has, in general, unknowable memory usage, since object lifetimes are not statically analyzable [1]. It also seems that a regular expression language (in the deterministic finite automaton sense) has statically known memory usage.
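To make the DFA point concrete, here is a small sketch of my own (not part of the original discussion): a recognizer for binary strings containing an even number of 1s. Its only mutable storage is the single `state` variable, so its memory use is constant and statically known, independent of input length.

```python
# Sketch (illustrative, not from the question): a DFA for the regular
# language of binary strings with an even number of 1s. The machine's
# entire memory is the single variable `state`, so its space usage is
# constant regardless of how long the input is.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def accepts(s):
    state = "even"                        # start state
    for ch in s:
        state = TRANSITIONS[(state, ch)]  # one table lookup per symbol
    return state == "even"                # accept iff in an accepting state
```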

What is the most complex programming language for which that is true? And where do total languages [2] fall in relation to that?

[1] Given the caveat that we're implementing that programming language on a physical computer with all its requisite limitations.

[2] A total language is, for the purposes of this question, a language where all programs written in it are guaranteed to terminate.

Raphael
oconnor0
  • What do you mean by a "language"? A set of valid strings? A formalism for describing a computation? There is no relationship; I can create a formalism whose semantics are Turing-complete but whose syntax is regular (simply by adding the rule that "any other string" halts immediately). – rici Feb 05 '16 at 02:29
  • A language used to compute in. I'm interested in semantics not syntax. – oconnor0 Feb 05 '16 at 03:54
  • I've edited your question based on your clarification. Note that programming languages have almost no overlap with formal languages. Anyway, what do you mean by a "total language"? What is your definition of "complex"? What makes you think there is a single "most complex" programming language? The question doesn't seem well-defined/well-posed as it stands. – D.W. Feb 05 '16 at 04:42
  • I understand neither what you mean by "regular expression language" (since you claim not to mean a language whose syntax is regular) nor your proposed complexity metric. -- Which seems to be what @D.W. just said. – rici Feb 05 '16 at 04:53
  • Total languages as described on https://en.wikipedia.org/wiki/Total_functional_programming and in http://www.jucs.org/jucs_10_7/total_functional_programming . Various languages have different computational power; not all languages need be Turing-complete. By restricting what can be done, simpler/less complex/more statically analyzable languages are created. For example, taking the lambda calculus but restricting general recursion produces a less powerful language. Maybe complexity is the wrong word. – oconnor0 Feb 05 '16 at 04:55
  • @rici - A simple enough language does not need the full power of a computer. For a restricted enough language of regular expressions (no backtracking, no capture groups, etc.), a DFA rather than a Turing machine is capable of implementing it. A DFA has, I believe, known memory usage. A sufficiently complex program executing on a Turing machine does not. Where is the transition point - in language complexity - from known memory usage patterns to unknown? – oconnor0 Feb 05 '16 at 05:01
  • Known memory usage is the wrong phrase to describe what I'm after. Fully statically analyzable memory usage is closer. – oconnor0 Feb 05 '16 at 05:05
  • Yes, but that's the definition of language as in "set of strings". A DFA can recognize whether a string is in the set. This says nothing about the semantics of any particular string. (And yes, a DFA has a very well-known memory usage: it consists entirely of one piece of information, which is the current state.) – rici Feb 05 '16 at 05:05
  • Right, and if all I want to do is categorize a set of strings describable by a DFA, then it's sufficient. However, I probably want to do more - like modify those strings or do some other computation with them which requires more power than a DFA. – oconnor0 Feb 05 '16 at 05:08
  • I don't think there's a useful theory of programming languages that would provide what you want. Also, one can always invent a language with known memory usage: e.g., C programs with a special version of malloc() that is guaranteed to return NULL after allocating 1 MB of memory (and with bounds on recursion) would give you a language that has a known upper bound on memory usage and is very complex... but somehow I doubt that's what you are looking for. – D.W. Feb 05 '16 at 05:38
  • Keep in mind that there cannot be a programming language that computes exactly all total functions. – Raphael Feb 05 '16 at 09:59
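D.W.'s bounded-allocator idea from the comments can be sketched as a toy in Python (the class and names here are my own invention, purely illustrative): an allocator that refuses requests once a fixed budget is exhausted gives every program a static upper bound on heap usage, however complex the language otherwise is.

```python
# Toy sketch of the bounded-allocator idea from the comments
# (illustrative only; class and method names are invented).

class BoundedAllocator:
    def __init__(self, budget=1 << 20):   # 1 MiB budget, as in the comment
        self.budget = budget
        self.used = 0

    def malloc(self, size):
        if self.used + size > self.budget:
            return None                   # like malloc() returning NULL
        self.used += size
        return bytearray(size)            # stand-in for a raw memory block
```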

1 Answer


Because total programs cannot run forever, they cannot use unbounded memory. You can determine the worst-case and best-case memory cost by looking at which algorithm is implemented. Check out the Wikipedia pages on the relationships between different grammars, algorithms, and machines.
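To illustrate the answer's point (this sketch is mine, not the answerer's): a function written in the structurally recursive style that total languages enforce recurses only on strictly smaller inputs, so both its running time and its stack depth are bounded by the size of the input, and its memory cost can be read off the algorithm.

```python
# Sketch (illustrative): structural recursion in a "total" style.
# Each call consumes one list element, so the recursion depth -- and
# hence the stack memory -- is bounded by the input length, and the
# function is guaranteed to terminate.

def total_sum(xs):
    if not xs:                            # base case: structure exhausted
        return 0
    return xs[0] + total_sum(xs[1:])      # recurse on a strictly smaller list
```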

  • Interesting. Is there a way to do more fine-grained analysis than the maximum used? Why do total programs only implement up to polynomial algorithms? – oconnor0 Feb 06 '16 at 02:47