So I have a function that I wrote to return the maximum value between index i and index j of a given array using recursion, as follows:
def FindMax(A, i, j):
    if i == j:
        return A[i]
    else:
        # split the range in half and recurse on both halves
        k = (i + j) // 2  # integer midpoint (plain / would give a float in Python 3)
        return max(FindMax(A, i, k), FindMax(A, k + 1, j))
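For reference, this is how I call it on the whole array (a throwaway example array, assuming 0-based indexing):

A = [3, 7, 1, 9, 4]
print(FindMax(A, 0, len(A) - 1))  # prints 9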
What is the correct way to calculate the big-O asymptotic time complexity of this algorithm? I would appreciate it if someone could document a general method for working out the complexity of a recursive function like this, even using a different example. I have spent a lot of time watching videos and reading up on the web, but sadly I come up with a different complexity each time.
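My best guess so far: if n = j - i + 1 is the number of elements in the range, each call does a constant amount of work and makes two recursive calls on halves of the range, which I think gives the recurrence T(n) = 2T(n/2) + c. But I am not confident that this setup is right, or how to turn it into a big-O bound.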