Consider a particular algorithm that solves the binary search problem (or a similar problem) by performing $\sqrt{n}$ simple operations on numbers of $\log(n)$ bits. Suppose this algorithm runs on a word RAM machine with word size $w=\log(n)$. What happens if the algorithm uses a memory area of $n$ numbers that are never initialized (as with `malloc` in C), because only a few of them are actually touched?
Some sources (example here) say that the memory of a word RAM machine is simply there and that there is no allocation, i.e., all $n=2^w$ memory cells are available from the very beginning. Other sources give different formalizations, but the memory is still just there. In this sense, the algorithm has complexity $O\left(\sqrt{n}\right)$.
Other sources (example here) charge for memory allocation, so that allocating $O(k)$ cells takes $O(k)$ time. In this sense, the algorithm has complexity $O(n)$ in the word RAM model.
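As an aside, there is a classical workaround (often attributed to an exercise in Aho, Hopcroft, and Ullman) that makes an uninitialized array behave as if fully initialized, with only $O(1)$ extra work per access: "virtual initialization" with three arrays and a counter. The sketch below is my own illustration of that trick; note that reading an indeterminate `idx[i]` is formally unspecified in ISO C, even though the comparison is harmless on mainstream platforms and is exactly what the abstract word RAM permits.

```c
#include <assert.h>
#include <stdlib.h>

/* Virtual initialization: cell i counts as initialized iff
   idx[i] < top && stack[idx[i]] == i. None of the three arrays
   is ever initialized up front. */
typedef struct {
    long   *data;  /* values, written lazily                 */
    size_t *idx;   /* position of i in stack, if initialized */
    size_t *stack; /* indices initialized so far, in order   */
    size_t  top;   /* number of initialized cells            */
    size_t  n;
} varray;

varray *va_new(size_t n) {
    varray *v = malloc(sizeof *v);
    v->data  = malloc(n * sizeof *v->data);  /* deliberately uninitialized */
    v->idx   = malloc(n * sizeof *v->idx);
    v->stack = malloc(n * sizeof *v->stack);
    v->top   = 0;
    v->n     = n;
    return v;
}

static int va_is_init(const varray *v, size_t i) {
    return v->idx[i] < v->top && v->stack[v->idx[i]] == i;
}

void va_set(varray *v, size_t i, long x) {
    if (!va_is_init(v, i)) {       /* record i as initialized */
        v->idx[i] = v->top;
        v->stack[v->top++] = i;
    }
    v->data[i] = x;
}

long va_get(const varray *v, size_t i, long deflt) {
    return va_is_init(v, i) ? v->data[i] : deflt;
}

void va_free(varray *v) {
    free(v->data); free(v->idx); free(v->stack); free(v);
}
```

With this trick, even a model that charges $O(k)$ for zero-initializing $k$ cells lets the algorithm stay at $O(\sqrt{n})$ work after a single $O(1)$-time (uninitialized) allocation, which is partly why the two conventions are rarely distinguished.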
Even reference books (e.g., Introduction to Algorithms by Cormen et al.) do not give a very formal description of the word RAM machine. It seems to me that they devote little attention (about one page, Section 2.2) to specifying precisely the machine on which the complexities of all their algorithms are computed, so I could not find an answer to my question. The word RAM is not as clearly and as uniformly specified as the Turing machine.
Where can I find the "true" word RAM machine?