According to this source, Chaitin's constant $\Omega$ is normal.
Each halting probability is a normal and transcendental real number that is not computable, which means that there is no algorithm to compute its digits. Indeed, each halting probability is Martin-Löf random, meaning there is not even any algorithm which can reliably guess its digits.
Furthermore, the definition of normal (in base $b$) is that each digit occurs with limiting frequency $1/b$, each pair of digits occurs with frequency $1/b^2$, each triplet with frequency $1/b^3$, and so on.
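To make that definition concrete, here is a quick sketch (the helper `block_frequencies` is my own, not from the source) that tallies empirical block frequencies. It also shows why the pair/triplet conditions matter: single-digit balance alone is not enough.

```python
from collections import Counter

def block_frequencies(digits, k):
    """Empirical frequency of each length-k block in a digit string.
    For a number normal in base b, every block should approach b**-k."""
    blocks = [digits[i:i + k] for i in range(len(digits) - k + 1)]
    counts = Counter(blocks)
    return {b: counts[b] / len(blocks) for b in sorted(counts)}

# 1/3 in binary is 0.010101...: the single digits are perfectly balanced,
# but the pairs '00' and '11' never occur, so 1/3 is NOT normal.
digits = '01' * 500
print(block_frequencies(digits, 1))  # {'0': 0.5, '1': 0.5}
print(block_frequencies(digits, 2))  # only '01' and '10' appear, each ~1/2
```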
Chaitin's $\Omega$ is defined via
$$\Omega = \sum_{p \,\text{halts}} 2^{-|p|},$$
where the sum ranges over all (prefix-free) programs $p$ that halt and $|p|$ is the length of $p$ in bits.
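Schematically, the defining sum looks like the sketch below. To be clear, `halts` here is a hypothetical oracle of my own invention; no computable one exists, which is exactly why $\Omega$ is uncomputable.

```python
from fractions import Fraction

def omega_partial_sum(halts, max_len):
    """Partial sum of 2**-len(p) over halting programs of up to max_len bits.
    `halts` is a hypothetical oracle (no computable one exists!), and the
    program encoding must be prefix-free for the full sum to stay below 1."""
    total = Fraction(0)
    for n in range(1, max_len + 1):
        for i in range(2 ** n):
            p = format(i, '0{}b'.format(n))  # the 2**n bitstrings of length n
            if halts(p):
                total += Fraction(1, 2 ** n)
    return total

# Toy oracle for illustration: exactly '1', '01', '001', and '00001' halt
# (a prefix-free set). The partial sum is 2**-1 + 2**-2 + 2**-3 + 2**-5.
toy_halts = lambda p: p in {'1', '01', '001', '00001'}
print(omega_partial_sum(toy_halts, 5))  # Fraction(29, 32) = 0.90625
```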
Writing $\Omega$ in binary, we obtain a string of 0s and 1s. For example, in a toy scenario with one program per length, where the programs of lengths 1, 2, 3, and 5 halt and the program of length 4 does not:
$$\begin{aligned}
\Omega &= 2^{-1} + 2^{-2} + 2^{-3} + 2^{-5} + \cdots \\
&= 0.1 + 0.01 + 0.001 + 0.00001 + \cdots \\
&= 0.11101\ldots
\end{aligned}$$
(the $2^{-4}$ term is skipped because that program does not halt).
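A quick sanity check of that toy arithmetic (this script is just my own illustration of the example above):

```python
from fractions import Fraction

# Toy scenario from above: one program per length; the programs of
# lengths 1, 2, 3, and 5 halt, and the one of length 4 does not.
omega_toy = sum(Fraction(1, 2 ** n) for n in [1, 2, 3, 5])

# Read off the binary expansion digit by digit.
bits, x = [], omega_toy
for _ in range(8):
    x *= 2
    if x >= 1:
        bits.append('1')
        x -= 1
    else:
        bits.append('0')
print('0.' + ''.join(bits))  # 0.11101000
```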
Clearly, the position of each bit corresponds to the halting status of the program whose length equals that position.
Here is what I am struggling with:

If $\Omega$ is indeed normal, then half of its binary digits are 1s, which, by the correspondence above, would mean that exactly 50% of programs halt and exactly 50% do not. This seems very counterintuitive.
For example, suppose I generate Java programs by randomly concatenating single characters. The majority of them, I would guess more than 99.99%, would not even compile. Would this not imply that at least 99.99% of them will not halt? How do we justify that exactly half will halt and exactly half will not, by virtue of $\Omega$ being normal?
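As a rough illustration of that intuition, here is a toy experiment using Python's own parser as a stand-in for a Java compiler (the setup is mine; it is not a claim about Java specifically):

```python
import random
import string

random.seed(0)
trials, parses = 10_000, 0
for _ in range(trials):
    # A "program" is 20 printable characters chosen uniformly at random.
    src = ''.join(random.choices(string.printable, k=20))
    try:
        compile(src, '<random>', 'exec')  # syntax check only; nothing runs
        parses += 1
    except (SyntaxError, ValueError):     # ValueError covers null bytes etc.
        pass
print(f'{parses} of {trials} random strings even parse')
```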
Or is Wikipedia incorrect about $\Omega$ being normal?