As the question states, how do we prove that $\textbf{NTIME}(f(n)) \subseteq \textbf{DSPACE}(f(n))$?
Can anyone point me to a proof or outline it here? Thanks!
Here is an expanded version of Igor Shinkar's comment. The simplest way to simulate a non-deterministic machine running in time $f(n)$ and space $s(n) \leq f(n)$ uses $s(n) + 2f(n) + O(1)$ space. We enumerate over all possible coin tosses, simulating the original machine on each of them; this requires space $f(n)$ for storing the coin tosses, and $s(n)$ space for simulating the actual machine. There is a slight difficulty here: when the coin tosses are "read" by the (original) machine, we need to mark somehow where we are in the sequence of coin tosses; we can use an additional bit per coin toss. It is probably possible to optimize this even further.
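The enumeration idea above can be sketched in a few lines of Python. This is only an illustrative toy, not an actual Turing machine simulator: `run_deterministic` is a hypothetical callback standing in for "simulate the machine with its nondeterministic choices fixed to this toss sequence", and `t` plays the role of the time bound $f(n)$.

```python
from itertools import product

def accepts(run_deterministic, t):
    """Deterministically decide acceptance of a nondeterministic machine
    that makes at most t binary choices, by enumerating every possible
    sequence of coin tosses.

    run_deterministic(tosses) is a placeholder: it runs the machine with
    its choices fixed to `tosses` (a tuple of bits) and returns True iff
    that run accepts.  Each toss sequence occupies O(t) space and is
    discarded before the next one is generated, so the extra space used
    by the enumeration itself is O(t), i.e. O(f(n)).
    """
    return any(run_deterministic(tosses)
               for tosses in product((0, 1), repeat=t))

# Toy "machine": accepts iff some 3-bit guess equals (1, 0, 1).
print(accepts(lambda tosses: tosses == (1, 0, 1), 3))  # True
```

Note that `any` with a generator stops at the first accepting run, and only one toss sequence is ever held in memory at a time, which is exactly why the space cost is $O(f(n))$ rather than exponential, even though the running time is $2^{f(n)}$ times the simulation cost.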
If we're careful, we might be able to do even better, since in each run of the machine the number of coin tosses and the space used sum to at most $f(n)$. I suspect it's possible to run the simulation in $(1+o(1))f(n)$ space. Perhaps we will need to assume something like $f(n) = \Omega(\log n)$ for that.
As Igor mentions, usually resource-bounded classes are only defined "up to big O", so that the result, which uses space $O(f(n))$, is still in $\mathrm{DSPACE}(f(n))$.