I'm having trouble determining the running-time function for the algorithm below. I know that there are initially two assignment operations, int i = n and int s = 0.
I also know that the while loop executes $\lfloor \log_2 n \rfloor + 1$ times, but the number of iterations of the for loop depends on the value of $i$, which is halved on each pass. So the first time the for loop runs $n$ times, the second time it runs $n/2$ times, and so on. How can I represent this behavior? I suppose that, in the end, the total cost combines both behaviors, perhaps as their product.
int i = n;
int s = 0;
while (i > 0) {
    for (int j = 1; j <= i; ++j)
        s++;
    i /= 2;
}