I was asked to find the O-complexity of an algorithm (run on a Turing machine) that accepts the language {0^(2^k) | k >= 0}, meaning the length of any string in the language is a power of two.
The algorithm is:

1. With the head starting at the beginning of the string, the head moves right, marking every other 0.
2. If there was a single 0, accept.
3. If there was more than a single 0 and the number of 0's was odd, reject.
4. Return the head to the left end of the string.
5. Repeat from step 1.
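To sanity-check my understanding of these steps, here is a small Python sketch of my own (not part of the exercise) that simulates the marking passes on a tape of 0's:

```python
def accepts_power_of_two(s):
    """Simulate the marking algorithm: repeatedly sweep the tape,
    marking every other unmarked 0 with 'x'. A pass that sees exactly
    one unmarked 0 accepts; an odd count greater than 1 rejects."""
    if not s or set(s) != {"0"}:
        return False          # input must be a nonempty string of 0's
    tape = list(s)
    while True:
        count = 0
        for i, c in enumerate(tape):
            if c == "0":
                count += 1
                if count % 2 == 0:   # mark every other unmarked 0
                    tape[i] = "x"
        if count == 1:
            return True       # a single 0 remained: length was 2^k
        if count % 2 == 1:
            return False      # odd number (> 1) of 0's: reject
        # otherwise: return the head to the left and repeat the pass
```

For example, strings of length 1, 2, 4, 8 are accepted, while lengths 3, 5, 6, 7 are rejected.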
So, for the number of marking passes, I believe that is O(log n), because $$ n \cdot \left(\tfrac{1}{2}\right)^x = 1 \iff n = 2^x \iff x = \log_2 n $$ where we start with n unmarked 0's and cut that number in half x times until a single 0 remains.
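As a quick check of the halving argument, counting the passes for a few powers of two (again my own illustration, not from the exercise):

```python
import math

def passes_until_one(n):
    """Count how many halving passes reduce n unmarked 0's to one."""
    passes = 0
    while n > 1:
        n //= 2        # each pass marks half of the remaining 0's
        passes += 1
    return passes

for n in [2, 8, 64]:
    # for n = 2^k, the number of passes matches log2(n) = k
    assert passes_until_one(n) == int(math.log2(n))
```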
I am unsure about the step where the head moves back to the start. At first, I thought that was O(log n) as well, but it having the same O-time as marking the 0's seems strange to me.
Also, I have been told: We observed that all context-free languages are decidable in polynomial time, i.e., CFL ⊆ P. Find a way to prove that in fact it is a proper subset.
Can I get a suggestion or some help with this, because I do not know how to approach it?
---

I believe that my question is different from those already asked and answered because I am not analyzing an actual computer program; I am analyzing a Turing machine, which seems more difficult to me because the algorithm requires traversing back to the beginning of the string, something that is not usually considered when analyzing computer programs (at least I don't consider it).