
I was wondering when exactly the series

$$\sum_{n=0}^\infty \frac1{a_n}\quad\text{with $a_n>0$}$$

diverges, and I came up with the following guess: this happens exactly when $a_n=n^{1+o(1)}$, i.e. exactly when $\frac{\log a_n}{\log n}\to 1$ (so that, in particular, $a_n$ eventually grows more slowly than $n^\alpha$ for every $\alpha>1$). Is this correct?
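
To illustrate the guess with two standard examples (these are only illustrations, not an argument for either direction): for $a_n = n\log n$ we have $a_n = n^{1+o(1)}$ and the series diverges, while for $a_n = n^{1.01}$ we do not and the series converges:

$$\sum_{n=2}^\infty \frac{1}{n\log n}=\infty, \qquad\qquad \sum_{n=1}^\infty \frac{1}{n^{1.01}}<\infty.$$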

If not, is there some other "Landau notation statement" that does the trick?

M. Rumpy
  • Counterexamples: https://math.stackexchange.com/q/2569303/42969, https://math.stackexchange.com/q/2563446/42969. See also https://math.stackexchange.com/q/2982094/42969. – Martin R Nov 09 '22 at 12:13 [a counterexample of this kind is sketched after these comments]
  • @MartinR Awesome, thanks. Do you think there is some "easy Landau way" to denote exactly those series that diverge? – M. Rumpy Nov 09 '22 at 12:15
  • I don't think so. One reason is that for each divergent series there is another series with “significantly smaller terms” which still diverges: https://math.stackexchange.com/q/388898/42969, https://math.stackexchange.com/q/452053/42969. – Martin R Nov 09 '22 at 12:17
  • You should probably add the assumption that all $a_n$ have the same sign for sufficiently large $n$; otherwise there are obvious alternating counterexamples such as this. – J.G. Nov 09 '22 at 12:22
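
A sketch of the kind of counterexample the first comment points to (a standard Bertrand-type series; I have not checked that it is the exact example used in the linked posts): take $a_n = n(\log n)^2$ for $n\ge 2$. Then

$$\frac{\log a_n}{\log n} = 1 + \frac{2\log\log n}{\log n} \longrightarrow 1, \qquad\text{so } a_n = n^{1+o(1)},$$

yet by the integral test

$$\sum_{n=2}^\infty \frac{1}{n(\log n)^2} < \infty,$$

so $a_n = n^{1+o(1)}$ does not force divergence. The classical fact behind the second comment (the Abel–Dini theorem, if I recall the name correctly) is that if $\sum_n \frac1{a_n}$ diverges with partial sums $s_n = \sum_{k\le n}\frac1{a_k}$, then $\sum_n \frac1{a_n s_n}$ still diverges even though $a_n s_n/a_n = s_n \to \infty$; this is one reason to doubt that a clean Landau-notation criterion exists.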

0 Answers