The following is my own set-up code:
    Search_and_Sum(key, A, n)    // A[1..n]
        int sum = 0;
        for (int i = 0; i <= n + n + 1000; i++) {
            for (int j = A.length; j >= 1; j /= 2) {
                if (key == A[j]) break;
                else sum += j;
            }
        }
        return sum;
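In case it helps to run it, here is a minimal Java sketch of the same routine (my own translation, not the original assignment code; since the pseudocode indexes A[1..n], A[j] becomes a[j - 1] below):

    class SearchAndSum {
        static int searchAndSum(int key, int[] a) {
            int n = a.length;
            int sum = 0;
            for (int i = 0; i <= n + n + 1000; i++) {   // outer loop: 2n + 1001 passes, i itself is never used
                for (int j = n; j >= 1; j /= 2) {       // inner loop: halving index, at most floor(log2 n) + 1 probes
                    if (key == a[j - 1]) break;         // key found at (1-based) position j: stop this pass
                    sum += j;                           // otherwise accumulate the index
                }
            }
            return sum;
        }

        public static void main(String[] args) {
            int[] a = {3, 7, 7, 2, 9, 4};               // hypothetical example input
            System.out.println(searchAndSum(9, a));     // key sits at 1-based position 5
        }
    }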
Well, I think the outer loop runs $O(n)$ times and each pass of the inner loop takes $O(\log n)$ time, so in the worst case the running time is $O(n \log n)$.
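Counting iterations directly gives the same bound: the outer loop makes exactly $2n + 1001$ passes (for $i = 0, 1, \dots, 2n + 1000$), and each pass of the inner loop halves $j$ starting from $n$, so it runs at most $\lfloor \log_2 n \rfloor + 1$ times. Hence
$$T(n) \le (2n + 1001)\bigl(\lfloor \log_2 n \rfloor + 1\bigr) = O(n \log n).$$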
But, I can't figure out how to find the running time in the best case. Any ideas?
Edit:
Is this an equivalent formulation of the code with the same running time?
    Search_and_Sum(key, A, n)    // A[1..n]
        int sum = 0;
        int k = A.length;
        while (k >= 1 && key != A[k]) {
            sum += k;
            k /= 2;
        }
        return sum;
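For comparison, here is how I would write the version that keeps the outer loop from the original, since the snippet above replaces only the inner loop (again a Java sketch of my own, with 0-based indexing):

    static int searchAndSumWhile(int key, int[] a) {
        int n = a.length;
        int sum = 0;
        for (int i = 0; i <= n + n + 1000; i++) {   // outer loop kept from the original
            int k = n;
            while (k >= 1 && key != a[k - 1]) {     // same halving probe as the inner for loop
                sum += k;
                k /= 2;
            }
        }
        return sum;
    }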
The body of the outer loop doesn't use i at all, so the code is just "Do this $2n+1000$ times." Since the inner loop doesn't depend on i, the best case occurs just when the last element of A is key. – David Richerby Mar 06 '15 at 01:10
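If I read that hint correctly: when key equals the last element $A[n]$, the inner loop (or the while loop above) stops at its very first comparison, so each of the $2n + 1001$ outer passes costs only $\Theta(1)$, giving
$$T_{\text{best}}(n) = (2n + 1001) \cdot \Theta(1) = \Theta(n).$$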