I'm trying to prove analytically the time complexity of a problem in "Cracking the Coding Interview". The algorithm in question calls an O(1) operation on each node of a binary tree once for each level of nodes above it, and every node in the tree is visited. The cost of such an algorithm, it seems to me, would be:
$2^1(1) + 2^2(2) + 2^3(3) + \cdots + 2^{\log_2 n}\log_2 n$, where $n$ is the number of nodes.
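(For what it's worth, here is a quick brute-force sketch I used to tabulate the partial sums of this series; `level_cost_sum` is my own name, and the depth parameter stands in for $\log_2 n$.)

```python
def level_cost_sum(depth: int) -> int:
    """Sum of 2^i * i for i = 1..depth, i.e. the per-level costs above."""
    return sum((2 ** i) * i for i in range(1, depth + 1))

# Tabulate the first few partial sums to see how fast the series grows.
for d in range(1, 6):
    print(d, level_cost_sum(d))
```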
I'm not sure how to evaluate the above using the usual formula for a finite geometric series, $a_1\frac{(1-r^m)}{(1-r)}$, where $a_1$ is the first term, $r$ is the constant ratio between consecutive terms, and $m$ is the number of terms (I use $m$ here since $n$ already denotes the number of nodes). The problem is that this series does not grow by a constant factor: the ratio between consecutive terms changes at each step. Does this sort of series have a name?