
The symbols of first-order logic can be sorted into two kinds, namely:

Logical symbols (constants)

  • Connectives: $∧, ∨, ¬, \rightarrow, \leftrightarrow$
  • Variables: $v_{0},v_{1},v_{2},\dots$
  • Equality: =
  • Parentheses: (, )

Nonlogical symbols (parameters)

  • Functions: $f_{0}^{1},f_{1}^{1},f_{2}^{1},\dots,f_{0}^{2},f_{1}^{2},f_{2}^{2},\dots$ [$f_{n}^{k}$ to be read "the $n^{\mathrm{th}}$ $k$-ary function symbol"]
  • Predicates: $P_{0}^{1},P_{1}^{1},P_{2}^{1},\dots,P_{0}^{2},P_{1}^{2},P_{2}^{2},\dots$ [$P_{n}^{k}$ to be read "the $n^{\mathrm{th}}$ $k$-ary predicate symbol"]
  • Constants: $c_{0},c_{1},c_{2},\dots$
  • Propositional variables: $p_{0},p_{1},p_{2},\dots$
  • Quantifiers: $∀, ∃$

Disclaimer: I know that there is variation in beliefs about which object belongs in which sort.

Suppose that we wanted to be as reductionist as possible regarding the syntax for first-order logic. What are the absolute minimal assumptions necessary to set up the syntax for first-order logic? In other words, which types of symbols in the language specified above can be removed or defined in terms of others?

Here is what I think I understand so far:

  1. Constants can be identified with $0$-ary functions.
  2. Propositional variables can be identified with $0$-ary predicates.
  3. Parentheses can be avoided altogether by adopting certain formalisms (e.g., Polish notation; see the example after this list).
  4. Equality can be identified as a $2$-ary predicate.
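
For example (using one common prefix convention, in which every connective and quantifier is written before its arguments), the formula $∀v_{0}\,(P_{0}^{1}(v_{0}) \rightarrow P_{1}^{1}(v_{0}))$ can be written without parentheses as $∀\,v_{0}\,{\rightarrow}\,P_{0}^{1}\,v_{0}\,P_{1}^{1}\,v_{0}$; since every symbol has a fixed arity, the original formula can be recovered unambiguously.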

And, although they don't get rid of any type of symbol, we can also consider:

  1. Replacing all the current connectives with a sole sufficient operator such as NAND ($↑$) or NOR ($↓$); the NAND case is spelled out below this list.
  2. Removing a quantifier and defining it in terms of the other (e.g., $∃ x\phi \leftrightarrow ¬∀x¬\phi$).
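
For instance, with $↑$ (NAND) as the sole connective, the others can be recovered (up to logical equivalence) by the standard abbreviations

  • $¬\phi \equiv \phi ↑ \phi$
  • $\phi ∧ \psi \equiv (\phi ↑ \psi) ↑ (\phi ↑ \psi)$
  • $\phi ∨ \psi \equiv (\phi ↑ \phi) ↑ (\psi ↑ \psi)$
  • $\phi \rightarrow \psi \equiv \phi ↑ (\psi ↑ \psi)$

and, as in item 2, $∃x\,\phi$ abbreviates $¬∀x¬\phi$.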

So if we apply these reductions, our syntax will look like:

Logical symbols (constants)

  • Connective: $↑$
  • Variables: $v_{0},v_{1},v_{2},\dots$

Nonlogical symbols (parameters)

  • Functions: $f_{0}^{0},f_{1}^{0},f_{2}^{0},\dots,f_{0}^{1},f_{1}^{1},f_{2}^{1},\dots,f_{0}^{2},f_{1}^{2},f_{2}^{2},\dots$
  • Predicates: $P_{0}^{0},P_{1}^{0},P_{2}^{0},\dots,P_{0}^{1},P_{1}^{1},P_{2}^{1},\dots,P_{0}^{2},P_{1}^{2},P_{2}^{2},\dots$
  • Quantifier: $∀$
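
For example, in this pared-down syntax (read in prefix notation as above), the sentence $¬∃v_{0}\,P_{0}^{1}(v_{0})$ becomes $∀\,v_{0}\,↑\,P_{0}^{1}\,v_{0}\,P_{0}^{1}\,v_{0}$, i.e., $∀v_{0}\,(P_{0}^{1}(v_{0}) ↑ P_{0}^{1}(v_{0}))$.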

This already looks bad (in the way we want), but I find it hard to believe we can't make it even worse. From this point I am not exactly sure what else we can remove uncontroversially, but I have a few naïve comments:

(Where $U$ is meant to be the universe that $∀$ quantifies over).

  1. A $k$-ary function (the standard meaning in this context) is a function from $U^{k}$ into $U$.
  2. A $k$-ary predicate is a $k$-ary Boolean-valued function, i.e., function from $U^{k}$ into $\{0,1\}$.
  3. A $k$-ary connective is a $k$-ary Boolean function, i.e., a function from $\{0,1\}^{k}$ into $\{0,1\}$.

It feels like these can all collapse into one notion, but the jury is still out on this.
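
To make the comparison concrete (a toy example): taking $U = \mathbb{N}$, addition is a $2$-ary function $\mathbb{N}^{2} \to \mathbb{N}$, the order relation $<$ corresponds to a $2$-ary predicate via its characteristic function $\mathbb{N}^{2} \to \{0,1\}$, and $∧$ is the Boolean function $\min\colon \{0,1\}^{2} \to \{0,1\}$; all three are functions, differing only in their domains and codomains.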

Nika

Comments:

  • The number of variables can also be reduced to 3 or 4 to obtain the same expressive strength. The function, predicate, and constant symbols are not all necessarily part of the language: a given subset of them is called a signature. – Berci Jan 19 '20 at 02:17
  • @Berci How exactly could we reduce the currently arbitrarily large stock of variable symbols down to 3 or 4? And yes, you are right that a subset of the function, predicate, and constant symbols (in fact, of all the nonlogical symbols) makes up the signature of a given first-order language, but even so, the whole stock of them is still part of the language of first-order logic, no? – Nika Jan 19 '20 at 02:33
  • If you really want minimal, just let $L(E)$ be the lexicographical index of expression $E$ (in any language); writing that number in unary, you can describe all of FOL with just 1 symbol. – DanielV Jan 19 '20 at 03:30
  • @Berci I'm confused about the claim that you can get away with only 3 or 4 variables. Over general structures, the hierarchy of finite-variable logics does not collapse. Maybe you mean that first-order set theories can be formulated using just a few variables? (Enough to define a pairing function, which can then be used to encode longer tuples of variables.) – Alex Kruckman Jan 19 '20 at 04:50
  • @AlexKruckman Yes, indeed, it was specifically about set theory, using the pairing function, etc. – Berci Jan 19 '20 at 08:56
  • @DanielV I've thought about that before, but it requires that you have a language prior to developing the one for FOL. Also, I am trying to reduce the number of types of symbols being used, not necessarily the number of symbols (although I will flirt with both options). – Nika Jan 19 '20 at 13:53
  • I would be a little concerned about treating equality as one of the two-place relations/predicates, because equality has a fixed semantics, while the semantics for predicates can be changed. – Bram28 Jan 19 '20 at 17:08
  • @Bram28 Pragmatically I think I understand what you are saying, but what do you suggest would go wrong if we included it as a predicate and then excluded any result that uses a semantics for equality other than the intended one? – Nika Jan 19 '20 at 17:19
  • @Nika I suppose you could do that ... but it seems a bit disingenuous to say that equality is just like the other predicates ... and yet it isn't. – Bram28 Jan 19 '20 at 20:15
  • @Bram28 I could be wrong, but it seems to me that the only sense in which it isn't like the other predicates is that we all tend to find it impolite when the symbol '=' is used in a way that gives it properties different from those we typically assume it to have. Do you have the same reservations about the part of my post where I suggested that the connectives ($∧, ∨, ¬$, etc.) be considered functions (in other words, moving them out of 'logical constants' and into 'parameters')? – Nika Jan 19 '20 at 20:27
  • @Nika All of first-order logic can be fairly easily described using only 3 combinators: the usual S and K combinators, together with a $U$ representing $UAB = \lnot \exists x\,.\, Ax \lor Bx$, or something along those lines. Is that what you are interested in? – DanielV Jan 19 '20 at 20:31
  • I think what I am really trying to get at in my question is finding a way to get predicates and functions (and ideally also connectives) to collapse into one notion, so that if I had to give a rushed sales pitch on first-order logic, I would only need to explain what variables, quantifiers, and ???s are. If quantifiers and variables can also be collapsed into one notion in such a way that we don't have to (re)introduce anything else, that would be another satisfying idea. – Nika Jan 19 '20 at 20:54

1 Answer

The issue that @Bram28 brings up regarding equality is not a trivial one; you cannot just treat it as a 2-input predicate-symbol and add axioms for it. FOL equality obeys substitution (a.k.a. equality elimination). However, this cannot be expressed as a fixed axiom schema: to recover the same proof-theoretic capability as standard FOL, using FOL without equality, you need to add axioms for each function/predicate-symbol, to force the equality-predicate to behave in the desired manner.

And that defeats the purpose of minimization, because it makes absolutely no difference whether you have the usual equality-symbol with rules governing it, or instead a 2-input predicate-symbol with axioms governing its interaction with the other symbols. In fact, in my opinion it is more minimal to have equality separate from the other symbols, instead of cluttering the axioms just to force one specific predicate-symbol to behave differently. This is even worse if you want your FOL to be as strong as the usual FOL (where the language can be of arbitrary size), because then the axioms for your equality-predicate would have to be described by a meta-rule, which is no better than having an equality-symbol with the standard rules.
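
To make the per-symbol proliferation concrete (a sketch, with the symbols $E$, $f$, $R$ chosen purely for illustration): for a signature with one binary function-symbol $f$ and one binary predicate-symbol $R$, using a 2-input predicate-symbol $E$ as a stand-in for equality already requires the equivalence-relation axioms

  • $∀x\ E(x,x)$
  • $∀x∀y\ (E(x,y) → E(y,x))$
  • $∀x∀y∀z\ (E(x,y) ∧ E(y,z) → E(x,z))$

together with one congruence axiom per nonlogical symbol, here

  • $∀x_{1}∀x_{2}∀y_{1}∀y_{2}\ (E(x_{1},y_{1}) ∧ E(x_{2},y_{2}) → E(f(x_{1},x_{2}),f(y_{1},y_{2})))$
  • $∀x_{1}∀x_{2}∀y_{1}∀y_{2}\ (E(x_{1},y_{1}) ∧ E(x_{2},y_{2}) ∧ R(x_{1},x_{2}) → R(y_{1},y_{2}))$

and similarly for every further function-symbol and predicate-symbol in the signature.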

Concerning your indexing of the symbols by natural numbers: this is indeed one way to do it if you only want to construct FOL for countable languages. But it is strictly less powerful than standard FOL, and you lose some very useful theorems about FOL, such as compactness for uncountable theories. Such applications of FOL yield important results, including in model theory. For example, the atomic diagram of an uncountable structure is uncountable, and we often want to apply compactness to the atomic diagram plus some extra formulae; we cannot do this without FOL for uncountable languages.
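
As a concrete instance (a standard application, sketched in my own notation): given an uncountable structure $M$, expand the language with a constant-symbol $\underline{a}$ for each element $a$ of $M$ plus one extra constant-symbol $c$, and apply compactness to the atomic diagram of $M$ together with the formulae $\{\,c ≠ \underline{a} : a ∈ M\,\}$. Every finite subset is satisfiable in $M$ itself (interpret $c$ as any element other than the finitely many mentioned), so compactness yields a model into which $M$ embeds and which contains an element not named by any $\underline{a}$; but the language used here is uncountable, so compactness for countable languages does not suffice.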

Here is a less important but fun application of mere propositional logic that needs uncountably many variables: if the Euclidean plane has no $k$-colouring in which every pair of points at unit distance receive different colours, then there is in fact a finite subset of the plane with no such $k$-colouring, by the compactness theorem for propositional logic. This reduces the Hadwiger–Nelson problem to a somewhat more finitary problem.
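
To spell out the encoding (the standard one, in my notation): take a propositional variable $p_{x,i}$ for every point $x$ of the plane and every colour $i ∈ \{1,\dots,k\}$, read as "$x$ gets colour $i$", and consider the set of formulae consisting of $p_{x,1} ∨ \dots ∨ p_{x,k}$ for every point $x$, together with $¬(p_{x,i} ∧ p_{y,i})$ for every colour $i$ and every pair of points $x,y$ at distance exactly $1$. A satisfying assignment yields a good $k$-colouring (pick, for each point, any colour assigned to it), so if no such colouring of the whole plane exists, this set is unsatisfiable, and by compactness some finite subset (mentioning only finitely many points) is already unsatisfiable.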

That said, if you want a computable deductive system for FOL, then yes you can restrict the language to some computable syntax. As I stated earlier, you should not attempt to eliminate equality, but you can safely remove either function-symbols or predicate-symbols. It is clear that you can replace each $k$-input function-symbol $f$ by a $(k+1)$-input predicate symbol $P$, by adding an axiom $∀x[1..k]\ ∃!y\ ( P(x[1..k],y) )$ and rewriting every subformula of the form "$f(t[1..k]) = u$" as "$P(t[1..k],u)$". I will leave it as an exercise for you to show that you can alternatively replace predicate-symbols by function-symbols. (Hint: Use two new constant-symbols for the truth-values, and translate each subformula involving a predicate-symbol into an equality.) As for constant-symbols, you are right that they are nothing more than 0-input function-symbols.
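
As a small worked example (with a hypothetical unary function-symbol $s$, replaced by a binary predicate-symbol $S$ and the axiom $∀x\ ∃!y\ S(x,y)$): a formula containing nested terms, such as $s(s(x)) = y$, is first unnested into $∃z\,(s(x) = z ∧ s(z) = y)$ and then rewritten as $∃z\,(S(x,z) ∧ S(z,y))$.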

But you are wrong that parentheses can be avoided using Polish notation. It is necessary to have some form of scoping for quantifiers, and brackets are obviously the easiest syntactic way.

Although it is true that just one Boolean connective suffices (NAND or NOR), and that we can express ∃ in terms of ∀, this kind of minimization is in fact contrary to the true objective of a computable deductive system, because the main reason for desiring a computable syntax is that it should actually be practical (usable in the real world). For this reason, it makes no sense to minimize the number of Boolean connectives and the number of quantifiers. Having too many primitives would make the system bloated, but having too few would make it useless. It is the same with axioms; it is better to have axioms that make sense rather than simply a 'minimal' set of axioms. For instance, Łukasiewicz found that a single propositional template, $((P → Q) → R) → ((R → P) → (S → P))$, axiomatizes the pure implicational fragment of propositional logic (under modus ponens and substitution), but it is just a curiosity with no practical value, and we should not use it in place of the typical rules or axioms 'just because we can'!

It may be desirable to design a minimal core system with few primitives, on top of which you build a usable system with the usual useful primitives, but there is an important point to be made here. Ultimately, if you want a useful system, its (external) interface must be easy to use, regardless of what its (internal) implementation is, and the user should not see or have access to any of the implementation details. This interface/implementation separation is a key concept in robust and scalable software design, but it applies to mathematical tools as well. Here are some posts that go into more detail regarding this issue:

  1. Abstraction of natural numbers and real numbers via structural properties.

  2. Most mathematicians do not consider some things as sets, such as symbols and algorithms, because how they are encoded is irrelevant to their behaviour.

  3. Every theorem in real analysis is about every model of the second-order axiomatization of the reals, not just some particular construction of the reals. Similarly, every theorem in complex analysis is about every algebraic closure of every model of the real axioms.

In exactly the same spirit, every formal system that we would like to use for practical logical reasoning must be able to support all the standard Boolean operations and quantifiers. From the user's perspective, one should not even have to think about minimization of syntax. Rather, the system should feel 'clean' and friendly, and what we mathematicians want to do should be easy to do. You should try proving basic theorems of PA in whatever formal systems you design, such as "$∀x,y∈\mathbb{N}\ ( x·x = 2·y·y ⇒ x = 0 )$", to get a good feel for what kind of system is practical and what kind is impractical.

Remember that the kind of minimization that is desirable should match the intended goals. Minimization just for the sake of minimization may be fun, but nothing much beyond that.

user21820