In my theoretical computer science book I have the following statement regarding the space complexity of $f(n)=2^n$:
$$\log(n) = O(f(n))$$
I can't understand how this is true, any help will be greatly appreciated.
As stated, $\log (n) = O (2^n)$ is trivially true.
All it says is that $\log n$ eventually grows no faster than $2^n$. In fact, for $2^n$ you could substitute $n$, $\sqrt n$, or indeed any fixed root of $n$, and the statement would still hold.
However carelessly stated, I think this really refers to the following: bits are a measure of space. Writing a positive integer $n$ in binary takes $\lfloor \log_2 n \rfloor + 1$ bits, so merely storing the input $n$ already uses $\Theta(\log n)$ space, whereas the output $2^n$ (a one followed by $n$ zeros) takes $n + 1$ bits.
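As a quick sanity check (my own illustration, not from the book), Python's `int.bit_length` reports how many bits an integer occupies, so we can compare the space needed to store $n$ with the space needed to store $2^n$:

```python
# bit_length() of n grows like log2(n); bit_length() of 2**n is exactly n + 1.
for n in [1, 10, 100]:
    print(n, n.bit_length(), (2 ** n).bit_length())
```

For $n = 10$ this prints bit lengths 4 and 11: four bits store the input, eleven bits store the output.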
I initially made heavy weather of this, thinking it referred to the time it takes to calculate $2^n$. In case it does, I'll leave this in:
For $n$ a non-negative integer and $f(n) = 2^n$:
$$ f(n) = \begin{cases} 1 &\text{for }\, n = 0 \\ (f (n \div 2))^2\times 2^{(n \mod 2)} &\text{for}\; n \gt 0 \end{cases}$$
The implied repeated-squaring algorithm clearly performs $\Theta(\log n)$ multiplications. But this is time complexity; in space complexity, counting registers rather than bits, an iterative version of it is $\Theta(1)$.
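A minimal sketch of that recurrence in Python (my own code, not from the book):

```python
def f(n: int) -> int:
    """Compute 2**n by repeated squaring, following the recurrence above."""
    if n == 0:
        return 1
    half = f(n // 2)                    # f(n div 2)
    return half * half * (2 if n % 2 else 1)   # squared, times 2^(n mod 2)
```

Each call halves $n$, so the number of multiplications (and the recursion depth as written) is $\Theta(\log n)$; rewriting the recursion as a loop brings the register count down to a constant.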