
This may be a poorly worded question, but I hope to flesh out my ideas well. The takeaway is this: some series diverge to infinity while others converge to a fixed value. Take two classical examples: the harmonic series and the Basel problem,

$$\displaystyle\sum_{n=1}^\infty \frac{1}{n}= \infty \, , \hspace{0.6cm} \displaystyle\sum_{n=1}^\infty\frac{1}{n^2} = \frac{\pi^2}{6}$$

On one end we have a divergent series, and on the other a convergent one, yet the two seem eerily similar, differing only in the square in the latter. The question I would raise is: at what "rate of growth" (loosely speaking) does a series have to grow to tip over from convergence into sudden, chaotic divergence?

Yes, in calculus one learns various convergence tests for determining whether a given series converges, but I am wondering whether there is a famous "rate of growth" that a series must exceed in order to indisputably diverge.
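For concreteness, the contrast between the two series above can be seen numerically. The following is a small Python sketch (not part of the original question): the harmonic partial sums creep upward without bound, while the partial sums of $1/n^2$ settle near $\pi^2/6 \approx 1.6449$.

```python
import math

def partial_sum(term, N):
    """Sum term(n) for n = 1..N."""
    return sum(term(n) for n in range(1, N + 1))

harmonic = lambda n: 1 / n       # diverges, but very slowly
basel = lambda n: 1 / n ** 2     # converges to pi^2 / 6

for N in (10**2, 10**4, 10**6):
    print(f"N = {N:>9}:  sum 1/n = {partial_sum(harmonic, N):8.4f},  "
          f"sum 1/n^2 = {partial_sum(basel, N):.6f}")
```

Even at a million terms the harmonic sum has only reached about 14.4, which already hints that "rate of divergence" can be extraordinarily slow.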

SunRoad2
    You raise a classical (1900s) question. And an instructive one at that – A rural reader Sep 04 '21 at 03:14
    Consider $\sum\dfrac1{n\log n}$ and $\sum\dfrac1{n(\log n)^2}$, among others. – Ted Shifrin Sep 04 '21 at 03:21
    @A rural reader: You raise a classical (1900s) question. --- This should be 1800s. See my 10 September 1999 sci.math post on the topic. There are also several relevant MSE questions/answers, but I don't have time now to look them up. (A google search for "du Bois Reymond" and "convergence" and my name, which I knew would bring up one or more items with references, led me to the sci.math post in a few seconds.) – Dave L. Renfro Sep 04 '21 at 03:48
  • For explicit illustrative purposes, consider that $\int_1^n (1/x) \, dx = \log(n)$. Also consider that between $1$ and $n$ you can construct $(n-1)$ rectangles, each of width $1$, whose right edges sit at the points $x = r$ for $r \in \{2,3,\ldots,n\}$. Giving rectangle $r$ a height of $(1/r)$, the collection of rectangles lies completely under the curve $y = (1/x)$. This implies that $\log(n) > [(1/2) + (1/3) + \cdots + (1/n)]$, which in turn implies (for example) that $(1 + 13.82) > [1 + \log(10^6)] > \sum_{k=1}^{10^6} (1/k)$. – user2661923 Sep 04 '21 at 04:21
  • @user2661923: I wound up stopping back here after taking care of some things and before going offline, and saw your comment, which reminded me of something I used to do in calculus 2 classes and also expanded into a job interview presentation. See this 22 March 2001 sci.math post and this 11 September 2010 sci.math post. – Dave L. Renfro Sep 04 '21 at 04:30
  • Similar to the question, what is the positive real number closest to 0? – Dr. Wolfgang Hintze Sep 09 '21 at 12:14
  • I think there is a chapter in one of Ian Stewart's books that is related: it discusses an advanced species (of worms, if I remember right) in which, at a certain age, each individual has to choose a divergent series. They live as many years as the series needs to sum to 10 (again, if memory serves). So the aim, if they want a long life, is to choose a divergent series that grows as slowly as possible. – ypercubeᵀᴹ Sep 09 '21 at 20:01

1 Answer


It turns out that there is no perfect dividing line: given any convergent series (of positive terms), one can find another convergent series whose terms grow faster, while given any divergent series, one can find another divergent series whose terms grow slower.

However, for practical purposes, it can be helpful to note that the series $$ \sum_{n=2}^\infty \frac1{n\log n},\; \sum_{n=3}^\infty \frac1{n\log n\log\log n},\; \sum_{n=16}^\infty \frac1{n\log n\log\log n\log\log\log n},\; \dots $$ all diverge, while the series $$ \sum_{n=2}^\infty \frac1{n(\log n)^{1+\varepsilon}},\; \sum_{n=3}^\infty \frac1{n\log n(\log\log n)^{1+\varepsilon}},\; \sum_{n=16}^\infty \frac1{n\log n\log\log n(\log\log\log n)^{1+\varepsilon}},\; \dots $$ all converge for every $\varepsilon>0$. (You can verify all of these assertions using the Integral Test!)
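To see just how narrow this borderline is, here is a small Python sketch (an illustration, not part of the original answer) comparing the first pair above with $\varepsilon = 1$: the divergent series $\sum 1/(n\log n)$, whose partial sums grow like $\log\log N$, against the convergent $\sum 1/(n(\log n)^2)$.

```python
import math

def tail_sum(term, start, N):
    """Sum term(n) for n = start..N."""
    return sum(term(n) for n in range(start, N + 1))

slow_div = lambda n: 1 / (n * math.log(n))        # diverges, like log(log(N))
just_conv = lambda n: 1 / (n * math.log(n) ** 2)  # converges (epsilon = 1)

for N in (10**3, 10**5):
    print(f"N = {N:>7}:  divergent sum = {tail_sum(slow_div, 2, N):.4f},  "
          f"convergent sum = {tail_sum(just_conv, 2, N):.4f}")
```

Between $N = 10^3$ and $N = 10^5$ the divergent sum still gains about half a unit (consistent with $\log\log 10^5 - \log\log 10^3 \approx 0.51$), while the convergent one has nearly stalled; numerically the two are almost indistinguishable, which is exactly why no single "rate of growth" can separate them.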

Greg Martin
  • Maybe if we interpret the word "growth" differently, we might get an answer for the slowest-diverging series. I am reminded of functions $\mathbb{N} \to \mathbb{N}$, where the Busy Beaver function is in some sense one of the fastest-growing functions (when suitably defining things, I guess). – ComFreek Sep 04 '21 at 12:46
    @ComFreek the Busy Beaver function has the property of growing faster than any computable function. That's not the same thing as a "fastest" or "slowest" growing thing "ever". – Misha Lavrov Sep 04 '21 at 15:51