
In several posts on StackExchange and elsewhere, I have seen claims that you only need to show you can construct constants, successor, projection, composition, and minimization to prove a language Turing-complete, since primitive recursion is allegedly a special case of minimization. But as I have been thinking about it, I don't see how minimization can be used effectively at all without primitive recursion.

  1. Minimization searches for a zero.
  2. The only way to get zero without primitive recursion is the constant zero.
  3. So any application of minimization must either:

    • return a constant zero all the time (since there is no way to do conditionals either without primitive recursion), or
    • return the input unmodified, i.e. return zero only when the input is itself zero, since applying any number of successors to a nonzero value never yields zero.

So if every minimization must either return constant zero or zero only when the input is zero, how can you construct any function more powerful than this? Or is this a false assumption?
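
For concreteness, here is a minimal Python sketch of the operations in question (the names `zero`, `succ`, `proj`, `compose`, and `minimize` are my own; a non-terminating `while` loop stands in for partiality):

```python
def zero(*args):
    return 0                      # the constant zero

def succ(x):
    return x + 1                  # successor

def proj(i):
    return lambda *args: args[i]  # i-th projection (0-indexed)

def compose(f, *gs):
    # substitution: xs |-> f(g1(xs), ..., gn(xs))
    return lambda *xs: f(*(g(*xs) for g in gs))

def minimize(g):
    # unbounded minimization: ys |-> least x with g(x, ys) == 0
    # (the loop never terminates when no such x exists, modelling partiality)
    def f(*ys):
        x = 0
        while g(x, *ys) != 0:
            x += 1
        return x
    return f

# Example: a composition of successors with a projection computes x_0 + 2.
add_two = compose(succ, compose(succ, proj(0)))
assert add_two(5) == 7
```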

2 Answers


Your suspicion is correct. Without primitive recursion we get a very small class of functions, namely maps of the form $f(\vec{x}) = x_i + m$, possibly with some undefined values.

Let $\mathcal{F}_k$ be the least class of partial maps $f : \mathbb{N}^k \to \mathbb{N}$ closed under $0$, successor, projections, composition and minimization.

Theorem: For every $f \in \mathcal{F}_k$ there exist $i$ and $m$ such that, for all $\vec{x} = (x_1, \ldots, x_k) \in \mathbb{N}^k$, either $f(\vec{x})$ is undefined or $f(\vec{x}) = x_i + m$.

Proof. We proceed by induction on the structure of $f$. It is obvious that $0$, successor, projections, and compositions preserve the stated property. Now consider $f \in \mathcal{F}_k$ which is defined by minimization of $g \in \mathcal{F}_{k+1}$, so that $$f(\vec{y}) = \min \{x \in \mathbb{N} \mid g(x, \vec{y}) = 0\}.$$ By induction hypothesis there are two possibilities:

  1. There exists $m$ such that $g(x, \vec{y}) = x + m$ whenever $g(x, \vec{y})$ is defined. In this case:

    • If $m = 0$ then $f(\vec{y}) = 0$ whenever $g(0, \vec{y})$ is defined, and $f(\vec{y})$ is undefined otherwise.
    • If $m > 0$ then $f$ is everywhere undefined.
  2. There exist $m$ and $i$ such that $g(x, \vec{y}) = y_i + m$ whenever $g(x, \vec{y})$ is defined. In this case:

    • If $m = 0$ then $$f(\vec{y}) = \begin{cases}0 & \text{if $y_i = 0$ and $g(0, \vec{y})$ is defined,}\\ \text{undefined} & \text{otherwise.} \end{cases}$$
    • If $m > 0$ then $f$ is everywhere undefined.

In all cases minimization preserves the stated property. $\Box$
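
To make the case analysis concrete, here is a small Python illustration of the cases above (my own rendering; a `while` loop stands in for unbounded minimization, so "undefined" means the loop never terminates):

```python
def minimize(g):
    # ys |-> least x with g(x, ys) == 0; loops forever if no such x exists
    def f(*ys):
        x = 0
        while g(x, *ys) != 0:
            x += 1
        return x
    return f

# Case 1 with m = 0: g(x, y) = x, so the search stops at once and f(y) = 0.
f1 = minimize(lambda x, y: x)
assert f1(7) == 0

# Case 1 with m > 0: g(x, y) = x + 1 is never zero, so f is everywhere undefined.
# minimize(lambda x, y: x + 1)(7) would loop forever.

# Case 2 with m = 0: g(x, y) = y, so f(0) = 0 and f(y) diverges for y > 0.
f2 = minimize(lambda x, y: y)
assert f2(0) == 0
# f2(1) would loop forever.
```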

Let us also speculate about how the fallacy "primitive recursion is a special case of minimization" arises. Suppose $f : \mathbb{N} \to \mathbb{N}$ is defined by primitive recursion as \begin{align*} f(0) &= k\\ f(n+1) &= h(n, f(n)), \end{align*} where $h : \mathbb{N}^2 \to \mathbb{N}$ is primitive recursive. Then there is a primitive recursive function $g : \mathbb{N} \times \mathbb{N} \to \mathbb{N}$ such that $$ g(x, y) = \begin{cases} 0 & \text{if $f(x) = y$},\\ 1 & \text{otherwise.} \end{cases} $$ The map $g$ involves a good deal of arithmetic, encoding of lists as numbers, etc. Crucially, $g$ may be more complicated than $f$ and $h$.

Therefore, we could define $f$ by minimization as $$f(x) = \min \{y \mid g(x,y) = 0\}.$$ However, we need the entire primitive recursion machinery to get the suitable $g$, so we cannot infer from this that primitive recursion can be dispensed with in the presence of minimization.
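
To see the circularity concretely, here is a Python sketch (my own, with the factorial standing in for $f$): the minimization over the graph predicate $g$ does recover $f$, but $g$ itself is computed using the very recursion we were trying to avoid.

```python
def f(n):
    # f defined by primitive recursion: f(0) = 1, f(n+1) = h(n, f(n))
    # with h(n, r) = (n + 1) * r, i.e. the factorial.
    acc = 1
    for i in range(n):
        acc = (i + 1) * acc
    return acc

def g(x, y):
    # the "graph" predicate: zero exactly when f(x) = y
    return 0 if f(x) == y else 1

def f_by_min(x):
    # f recovered by minimization: least y with g(x, y) = 0
    y = 0
    while g(x, y) != 0:
        y += 1
    return y

assert f_by_min(5) == f(5) == 120
# Note that g calls f, i.e. it already contains the primitive recursion
# we were hoping to eliminate.
```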

Andrej Bauer

In her paper "General Recursive Functions", Julia Robinson proves that all recursive functions can be obtained using only substitution (composition) and minimization, provided the set of initial functions is extended with additional members. In particular, she uses the two functions:

$$S(x, y) = x + y, \qquad Q(x) = \begin{cases} 1 & \text{if $x$ is a perfect square,} \\ 0 & \text{otherwise.} \end{cases}$$

With these two functions (together with the standard initial functions, i.e. constants, projections, and the successor), substitution and minimization alone suffice to build enough number theory to write Gödel's $\beta$-function, which allows finite sequences to be encoded as numbers and thereby permits the emulation of any use of primitive recursion.
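
To give a flavour of that last step, here is a small Python sketch of Gödel's $\beta$-function (my own construction, not taken from Robinson's paper): a finite sequence is packed into a pair $(x, y)$ via the Chinese Remainder Theorem, and its $i$-th element is recovered as $x \bmod (1 + (i+1)\,y)$.

```python
from math import factorial

def beta(x, y, i):
    # Goedel's beta function: recovers the i-th element of an encoded sequence
    return x % (1 + (i + 1) * y)

def encode(seq):
    # Find (x, y) with beta(x, y, i) == seq[i] for all i.  Taking
    # y = (max(len(seq), max(seq)) + 1)! makes the moduli 1 + (i+1)*y
    # pairwise coprime, so the Chinese Remainder Theorem applies.
    y = factorial(max(len(seq), max(seq, default=0)) + 1)
    moduli = [1 + (i + 1) * y for i in range(len(seq))]
    x, m = 0, 1
    for a, n in zip(seq, moduli):
        # extend x so that x ≡ a (mod n) while keeping x mod m fixed
        t = ((a - x) * pow(m, -1, n)) % n   # pow(m, -1, n) needs Python 3.8+
        x, m = x + m * t, m * n
    return x, y

x, y = encode([3, 1, 4, 1, 5])
assert [beta(x, y, i) for i in range(5)] == [3, 1, 4, 1, 5]
```

Once sequences can be coded like this, the value of a primitive recursion can be characterized as "there exists a code of the sequence $f(0), \ldots, f(n)$ satisfying the recurrence", which is exactly the kind of property a single minimization can search for.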