2

What is the complexity of the following recurrence? $$T(n) = T(n-1) + 1/n$$

I highly suspect the answer to be $O(1)$, because the argument decreases by $1$ at each step, so after $n$ steps it would be $T(n-n) = T(0)$ and the original task reduces to the base case.

Soham Chowdhury
Olórin
  • @DavidRicherby is right. In fact, this question is an example of the non-decreasing $f(n)$ case of $T(n) = T(n-1) + f(n)$: http://cs.stackexchange.com/a/24082/1152 – Jonathan Prieto-Cubides Dec 17 '14 at 12:28
  • 3
    Problems have complexity, algorithms have running times, (mathematical) functions have growth rates. Are you asking about the growth rate of the mathematical function $T(n)$ or the difficulty of computing it? – David Richerby Dec 17 '14 at 14:36
  • 2
    I have no idea how these things are being taught today. However, I am rather surprised by this apparent misuse or abuse of the word "complexity", and by the fact that no one is reacting to it. From what I understand, this can only reinforce the complete conceptual mishmash that seems to encumber the brains of too many students. As far as I can tell, this has nothing to do with complexity, and the study of asymptotic limits and of Landau notation (Big O and its brothers and sisters) is not the study of complexity, but only a tool for it, and for other purposes. (simultaneous with @DavidRicherby) – babou Dec 17 '14 at 14:42
  • 2
    @babou Yes, this is indeed widespread. Even research papers talk about "complexity of this algorithm". – Raphael Dec 17 '14 at 17:14
  • 2
    @babou Everyone thinks "Oh, it's $O(\text{something})$, it must have something to do with time complexity." – Soham Chowdhury Dec 17 '14 at 17:25
  • 2
    @Raphael You are right. However it bothers me a bit less as the concepts are very close, complexity being about the cost of algorithms that solve the problem. But asymptotic analysis is just a piece of math that has lots of other uses. Still, being precise and using the right words is essential when doing science. – babou Dec 17 '14 at 17:39
  • http://bigocheatsheet.com/ is wrong? – Olórin Dec 18 '14 at 00:57
  • http://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm is wrong? – Olórin Dec 18 '14 at 00:58

2 Answers

10

By unfolding the recurrence, $$T(n) = T(n-1) + \frac{1}{n} = T(n-2) + \frac{1}{n} + \frac{1}{n-1} = \dots = T(0) + \sum_{k=1}^{n} \frac{1}{k}\,.$$ Now we can easily bound the sum on the right-hand side by comparison with an integral: $$\sum_{k=1}^{n}\frac{1}{k} \le 1 + \int_{1}^{n}\frac{1}{x}\, dx = 1 + \log{n} - \log{1} = 1 + \log{n}\,.$$ Therefore $T(n) = O(\log{n})$.
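As a quick numerical sanity check, here is a minimal Python sketch (the helper name `harmonic_sum` is just illustrative) verifying that the partial sums stay below the $1 + \log n$ bound above:

```python
import math

def harmonic_sum(n):
    """Partial sum sum_{k=1}^{n} 1/k."""
    return sum(1.0 / k for k in range(1, n + 1))

# Check the integral bound sum_{k=1}^{n} 1/k <= 1 + ln(n) for a few values of n.
for n in (10, 100, 1000, 10**5):
    h = harmonic_sum(n)
    bound = 1 + math.log(n)
    print(f"n={n:>6}  sum={h:.4f}  1+ln(n)={bound:.4f}  bound holds: {h <= bound}")
```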

0xdeadcode
  • The answer would be better if it criticized the misuse of the word complexity. – babou Dec 17 '14 at 14:47
  • 3
    The inequality is now wrong. In fact, $\sum_{k=1}^n \frac{1}{k} = \log n + \gamma + O\left(\frac{1}{n}\right)$, where $\gamma \approx 0.577 > 0$ is the Euler–Mascheroni constant. – Yuval Filmus Dec 17 '14 at 16:27
  • Thanks again @Yuval Filmus. The inequality is correct now. :) – 0xdeadcode Dec 17 '14 at 17:30
1

Edit

This answer assumed that the OP was looking for the time-complexity of an algorithm that would evaluate the recurrence, so it's probably wrong.


Whatever you do, you have to iterate $n$ times, regardless of the base case/starting value. So as $n$ grows without bound, the number of operations also grows linearly with $n$, implying that evaluating the recurrence takes $O(n)$ time.

Look at it this way: to calculate $T(n)$, you need $T(n-1)$, which in turn depends on $T(n-2)$, and so on, all the way down to $T(0)$ (or whatever the lowest allowed value of $n$ is).

Each time, you calculate $\frac{1}{k}$ and add it to the previously computed value $T(k-1)$, doing $O(1)$ work at each of the $n$ steps, which adds up to $O(n)$ work in total.
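Concretely, here is a minimal Python sketch of that bottom-up evaluation (the function name `evaluate_T` and the default base value $T(0)=0$ are illustrative assumptions, not fixed by the question):

```python
def evaluate_T(n, T0=0.0):
    """Evaluate T(n) = T(n-1) + 1/n bottom-up, counting the O(1) steps."""
    value, steps = T0, 0
    for k in range(1, n + 1):
        value += 1.0 / k  # one O(1) addition per level of the recurrence
        steps += 1
    return value, steps

value, steps = evaluate_T(1000)
print(value, steps)  # steps == 1000, i.e. the work grows linearly in n
```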


In fact, you can easily represent $T(n)$ in general as

$$ T(n) = T(0) + \sum^n_{k=1}{\frac{1}{k}} $$

(which, if you're curious, is equal to $T(0) + H_n$, where $H_n$ is the $n$th harmonic number.)
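For the curious, a small Python sketch (assuming $T(0)=0$) illustrating the growth of that closed form: $H_n - \ln n$ approaches the Euler–Mascheroni constant $\gamma \approx 0.5772$, so $T(n)$ grows like $\log n$ plus a constant.

```python
import math

def T(n, T0=0.0):
    """T(n) = T(0) + H_n, where H_n is the n-th harmonic number."""
    return T0 + sum(1.0 / k for k in range(1, n + 1))

# T(n) - ln(n) converges to the Euler-Mascheroni constant (~0.5772),
# so T(n) = log(n) + O(1).
for n in (10, 1000, 10**6):
    print(n, T(n) - math.log(n))
```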

Soham Chowdhury
  • 1
    Taking into account the asymptotics of the harmonic numbers, the final answer is $T(n) = \log n + O(1)$. – Yuval Filmus Dec 17 '14 at 06:30
  • When writing my answer I assumed that OP was looking for the complexity of an algorithm that would calculate the answer, so . . . you're correct. – Soham Chowdhury Dec 17 '14 at 09:31
  • 1
    I guess you computed the cost of a given algorithm, rather than the complexity of the problem it solves. But it is as good an interpretation of the misuse of the word complexity as any other. – babou Dec 17 '14 at 14:56