
Chaitin's constant is not one constant, but depends on an effective prefix-free encoding $d$ of Turing machines as bit strings. Once such an encoding is chosen, the corresponding Chaitin's constant is $$ \sum_{T\text{ halts on empty tape}} 2^{-|d(T)|} $$

Wikipedia claims, without explanation or reference, that the constant is normal.

This is at least not unconditionally true. It is easy to construct a $d$ such that Chaitin's constant is manifestly not normal in base $2$, for example we could Gödel number all of the Turing machines and set $d(T_n)=(\mathtt{00})^n\mathtt 1$. Then every other bit in the binary expansion of the constant is $0$.
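To spell the example out: with this encoding $|d(T_n)| = 2n+1$, so $$\Omega_d = \sum_{T_n\text{ halts on empty tape}} 2^{-(2n+1)}.$$ The exponents $2n+1$ are pairwise distinct and all odd, so each halting machine sets a single bit at an odd position and no carries ever occur; hence every bit at an even position of $\Omega_d$ is $0$.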

So the best we can hope for is that there exists some effective encoding of Turing machines for which Chaitin's constant is normal.

Question: Is there a simple condition on the encoding that will guarantee normality?

  • The French article has more information. I'm not an expert in maths (I'm a chemist, lol), but I'm French and I found this: http://fr.wikipedia.org/wiki/Om%C3%A9ga_de_Chaitin#Propri.C3.A9t.C3.A9s_des_nombres_.CE.A9 (4th point), and the reference J.-P. Delahaye, Complexités, Belin 2006, p. 135. Hope it helps you a bit. :) – ParaH2 Jun 04 '15 at 11:30
  • There are also more references in the French article, with "a lot of" English articles! – ParaH2 Jun 04 '15 at 11:32

1 Answer


The condition is that $d$ should be optimal. Formally: a prefix-free machine is a partial computable function $M:2^{<\omega} \rightarrow 2^{<\omega}$ such that if $M(\sigma)$ halts, then $M(\tau)$ does not halt for any $\tau \neq \sigma$ comparable with $\sigma$ (equivalently, the domain of $M$ is an antichain for the prefix order). For such a machine $M$, let us define $K_M(\sigma) = \min \{|\tau| : M(\tau) = \sigma\}$.
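As an aside (not part of the answer), $K_M$ can be approximated from above by a brute-force, time-bounded search over descriptions in order of increasing length. The callable `run(tau)` below is a hypothetical stand-in for simulating $M$ on `tau` with some fixed step budget, and the toy dictionary machine is purely illustrative:

```python
from itertools import product

def K_upper_bound(run, sigma, max_len):
    """Time-bounded upper approximation of K_M(sigma) = min{|tau| : M(tau) = sigma}.
    `run(tau)` simulates M on tau with some fixed step budget and returns its
    output or None.  Since K_M is only approximable from above, this search may
    overestimate the true value or find nothing within the budget."""
    for n in range(max_len + 1):
        for bits in product('01', repeat=n):
            tau = ''.join(bits)
            if run(tau) == sigma:
                return n          # shortest description found within the budget
    return None                   # nothing of length <= max_len found

# Toy stand-in for a prefix-free machine: its domain {'0', '10'} is an antichain.
toy = {'0': '1111', '10': '0'}
print(K_upper_bound(toy.get, '1111', max_len=3))   # -> 1
```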

A prefix-free machine $U$ is optimal if for any prefix-free machine $M$, there is a constant $c$ such that for every $\sigma$ we have $K_U(\sigma) \leq K_M(\sigma) + c$.

Then $$\Omega = \sum_{U(\sigma) \text{ halts}} 2^{-|\sigma|}$$ is a normal number in every base, and much more: it is Martin-Löf random, that is, there is a constant $c$ such that for every prefix $\sigma$ of $\Omega$ (seen as an element of $2^{\omega}$), we have $K_U(\sigma) \geq |\sigma| - c$. This means that $U$ cannot compress the prefixes of $\Omega$; by optimality of $U$, no prefix-free algorithm can compress the prefixes of $\Omega$.

The key point of the proof is that from the first $n$ bits of $\Omega$, one can uniformly decide whether $U$ halts on strings of length smaller than $n$: run $U$ on every string in parallel and approximate $\Omega$ from below, adding $2^{-|\sigma|}$ each time $U$ halts on some $\sigma$. At some point our approximation matches the first $n$ bits of $\Omega$, and from then on no string of length smaller than $n$ that is still running can ever halt (its contribution of at least $2^{-(n-1)}$ would push $\Omega$ past what those $n$ bits allow), so we know exactly which of them halt.
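This procedure is not spelled out as code in the answer, but here is a rough Python sketch of it, under the assumption of a hypothetical step-bounded simulator `run_for(sigma, t)` that returns `True` iff $U(\sigma)$ halts within $t$ steps:

```python
from fractions import Fraction

def halts_below(omega_bits, run_for):
    """Given the first n bits of Omega (a string of '0'/'1') and a hypothetical
    simulator run_for(sigma, t) -> True iff U(sigma) halts within t steps,
    return the set of strings of length < n on which U halts."""
    n = len(omega_bits)
    target = Fraction(int(omega_bits, 2), 2 ** n)   # dyadic value of the first n bits
    approx, halted, t = Fraction(0), set(), 0
    # Dovetail: at stage t, run U for t steps on every string of length <= t,
    # adding 2^{-|sigma|} whenever a new halting string sigma is discovered.
    while approx < target:       # approx >= target  <=>  the first n bits are correct
        t += 1
        for length in range(t + 1):
            for k in range(2 ** length):
                sigma = format(k, 'b').zfill(length) if length else ''
                if sigma not in halted and run_for(sigma, t):
                    halted.add(sigma)
                    approx += Fraction(1, 2 ** length)
    # Any still-running string of length < n would add at least 2^{-(n-1)},
    # pushing Omega past what its first n bits allow; so it never halts.
    return {s for s in halted if len(s) < n}
```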

Suppose now for contradiction that for every $c$ there exists $n$ such that $K_U(\Omega \upharpoonright n) < n - c$. We can build a prefix-free machine $M$ which, on input $\sigma$, does the following: $M$ runs $U(\sigma)$ until it outputs some $\tau$ (the interesting case being when $U(\sigma) = \Omega \upharpoonright n$ for some $n$). Then $M$ tries to decide, by the algorithm described above, which strings $\rho$ of length smaller than $|\tau|$ are such that $U(\rho)$ halts. If $M$ succeeds in deciding this (or thinks it succeeds), then it outputs a string different from $U(\rho)$ for every such string $\rho$.
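Continuing the same hypothetical sketch (again, not code from the answer), $M$ could look as follows, where `run_U(sigma)` fully simulates $U$ on `sigma` (and simply never returns if $U(\sigma)$ diverges) and `halts_below` is the routine sketched above:

```python
def machine_M(sigma, run_U, run_for):
    """Sketch of the machine M from the proof.  run_U and run_for are the
    hypothetical full and step-bounded simulators of U."""
    tau = run_U(sigma)                          # interesting case: tau = Omega|n
    n = len(tau)
    short_halters = halts_below(tau, run_for)   # halting rho with |rho| < n,
                                                # assuming tau really is Omega|n
    taken = {run_U(rho) for rho in short_halters}    # their outputs U(rho)
    for k in range(2 ** n):                     # some length-n string is not taken,
        candidate = format(k, 'b').zfill(n)     # since there are fewer than 2^n such rho
        if candidate not in taken:
            return candidate                    # hence K_U(M(sigma)) >= n by design
```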

Note that whenever $U(\sigma) = \Omega \upharpoonright n$, we must have $K_U(M(\sigma)) \geq n$, by design. But up to an additive constant we also have $K_U(M(\sigma)) \leq K_M(M(\sigma)) \leq |\sigma|$ (by optimality of $U$ and the definition of $K_M$). In particular $|\sigma| \geq n$ up to that constant, which depends on the machine $M$ only and not on $\sigma$ or $n$. Now by hypothesis, the difference between $n$ and $|\sigma|$ for strings $\sigma$ such that $U(\sigma) = \Omega \upharpoonright n$ can be made as large as we want (with $n > |\sigma|$). When this difference is larger than our constant, we get a contradiction.
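Spelled out, writing $c_M$ for the optimality constant of $U$ against $M$: whenever $U(\sigma) = \Omega \upharpoonright n$, $$n \;\leq\; K_U(M(\sigma)) \;\leq\; K_M(M(\sigma)) + c_M \;\leq\; |\sigma| + c_M,$$ so $|\sigma| \geq n - c_M$. Taking $c = c_M$ in the hypothesis gives an $n$ and a shortest $U$-description $\sigma$ of $\Omega \upharpoonright n$ with $|\sigma| < n - c_M$, which is the contradiction.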

Update: The proof that such an optimal machine $U$ exists goes as follows. Suppose you have an effective list $\{M_e\}_{e \in \mathbb{N}}$ of all prefix-free machines. Then define $U(1^e 0 \tau) = M_e(\tau)$; note that $U$ is prefix-free. The problem is now to obtain that list from an effective list $\{N_e\}_{e \in \mathbb{N}}$ of (not necessarily prefix-free) machines, which we do have. To do so, replace each machine $N_e$ with a machine $M_e$ which on input $\tau$ executes $N_e(\tau)$. If we get an answer within $t$ steps of computation (we may suppose $t > |\tau|$), $M_e$ then verifies that $N_e$ does not also halt in fewer than $t$ steps on a string of length smaller than $t$ that is comparable with $\tau$; if it does, $M_e$ adapts its answer to keep prefix-freeness. All that matters is that $M_e$ behaves like $N_e$ whenever $N_e$ is already prefix-free, which we get by design.
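Not part of the answer, but the dispatch $U(1^e 0 \tau) = M_e(\tau)$ is easy to make concrete. In this sketch the list `machines` is a finite, purely illustrative stand-in for the effective list of prefix-free machines, each given as a callable returning an output string or `None` (divergence):

```python
def universal(rho, machines):
    """Sketch of U(1^e 0 tau) = M_e(tau).  The header 1^e 0 is itself
    prefix-free, so U is prefix-free as long as each M_e is."""
    e = 0
    while e < len(rho) and rho[e] == '1':
        e += 1                      # count the leading 1s to recover the index e
    if e == len(rho):
        return None                 # no separating 0: U diverges on 1^e
    tau = rho[e + 1:]               # everything after the 0 is passed to M_e verbatim
    if e >= len(machines):
        return None                 # (a real U dispatches to infinitely many M_e)
    return machines[e](tau)

# Toy illustration: M_0 halts exactly on the antichain {'0', '1'} and copies its input.
machines = [lambda tau: tau if tau in ('0', '1') else None]
print(universal('01', machines))    # header '0' gives e = 0, tau = '1'  ->  '1'
print(universal('101', machines))   # e = 1 is out of range here        ->  None
```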

Finally, note that in the "real world" machines are prefix-free; for example, in any file system one should know where a file starts and where it ends.

  • Is it obvious that such an optimal $U$ exists at all? Or, for that matter, how to recognize one if I see it? – hmakholm left over Monica Jun 04 '15 at 13:44
  • Hm, I see -- if $U$ is "sanely universal" in the sense that its input can be a description of $M$ simply concatenated with the input to $M$ without any coding of the latter, then it will satisfy the optimality criterion. Or in other words, "optimality" of $d$ means that the s-m-n function doesn't grow faster than it reasonably has to. – hmakholm left over Monica Jun 04 '15 at 13:52
  • No it is not obvious, but not very hard to prove either. I'll update the answer. – Archimondain Jun 04 '15 at 13:56
  • Yes, I think you have the right intuition to build $U$. The hard part is to deal with prefix-free machines. – Archimondain Jun 04 '15 at 14:11
  • By the way, shouldn't it be "run $U$ on every string of every length" (dovetailing countably many runs) rather than "run $U$ on every string of length smaller than $n$"? Several terminating inputs longer than $n$ can together contribute a sum that affects the $n$th bit of the constant. What you can be sure of is that once you reach the first $n$ bits correctly, none of the strings shorter than $n$ that still run at that point are ever going to terminate. – hmakholm left over Monica Jun 05 '15 at 11:52
  • You are right, I'll correct that – Archimondain Jun 05 '15 at 22:08
  • If Omega is normal, would we expect it to have an equal number of zeros and ones? Hence, half of programs will halt and half will not? Seems strange that exactly half of all programs will halt. – Anon21 Nov 29 '16 at 12:52
  • Yes, it would have "the same number of 0s and 1s", but I do not see how you can conclude that half of the programs would halt and half would not. – Archimondain Dec 04 '16 at 16:40