
I was thinking about this question: Why does the series $\sum_{n=1}^\infty\frac1n$ not converge?

and investigating how fast the terms of a series (any series with decreasing terms, not just the harmonic one) must decrease for the sum to remain bounded rather than diverge. I was looking at different series and trying to find a criterion for when the sum of subsequent terms can reach the size of the first term (for the harmonic series, the sum of $2^{n-1}$ subsequent terms always reaches $\frac{1}{2}$).
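The grouping observation above (each block of $2^{n-1}$ subsequent harmonic terms summing to at least $\frac12$) can be checked directly; here is a minimal numeric sketch of that classic argument:

```python
# Sketch of the grouping argument: in the harmonic series, the block of
# 2**(k-1) consecutive terms n = 2**(k-1)+1 .. 2**k consists of terms that
# are each at least 1/2**k, so every block sums to at least 1/2 -- hence
# the partial sums grow without bound.
def block_sum(k):
    """Sum of harmonic terms 1/n for n = 2**(k-1)+1 .. 2**k."""
    return sum(1.0 / n for n in range(2 ** (k - 1) + 1, 2 ** k + 1))

for k in range(1, 11):
    print(k, block_sum(k))  # every printed value is >= 0.5
```

As $k$ grows the block sums approach $\ln 2 \approx 0.693$, but what matters for divergence is only that they never drop below $\frac12$.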

While looking at that, I arrived at the following intuitive criterion for the convergence of such series:

For any infinite series $\sum_{n=1}^{+\infty}\frac{1}{f(n)}$ where $f(n)$ is a monotonically increasing function: if $f(n)$ grows faster than $\mathrm{const}\cdot n\ln n$, then the series converges; otherwise it diverges. In other words, if $f''(n)>\frac{\mathrm{const}}{n}$ the series converges; otherwise, it diverges.

Note: initially my criterion was $f''(n)>0$, but based on the comments I amended it to $f''(n)>\frac{\mathrm{const}}{n}$.

It is easy to see that if $f''(n)\le0$ then the series diverges, but it appears that even if $f''(n)$ is positive, yet decreasing inversely proportionally to $n$, the series still diverges. For example, the series diverges for $f(n)=100\, n\ln n$, for which $f''(n)=\frac{100}{n}$.

For the series to converge, the positive $f''(n)$ has to grow with $n$, stay constant, or decrease more slowly than inversely proportionally to $n$. For example, if $f''(n)=\frac{1}{\sqrt{n}}$, then $f(n)$ grows fast enough and the series converges.
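For instance, integrating $f''(n)=\frac{1}{\sqrt n}$ twice gives $f(n)=\frac{4}{3}n^{3/2}$ up to lower-order terms, so $\frac{1}{f(n)}$ behaves like a $p$-series with $p=\frac32$. A small numeric sketch contrasting this with $f''(n)=\frac1n$, i.e. $f(n)=n\ln n$ (suggestive only; partial sums by themselves cannot prove convergence or divergence):

```python
import math

# Compare partial sums of 1/f(n) for two growth rates of f''(n):
#   f''(n) = 1/n       ->  f(n) = n*ln(n)        (series diverges, very slowly)
#   f''(n) = 1/sqrt(n) ->  f(n) = (4/3)*n**1.5   (series converges, p = 3/2)
def partial_sum(f, N, start=2):
    return sum(1.0 / f(n) for n in range(start, N + 1))

def f_slow(n):  # f''(n) = 1/n
    return n * math.log(n)

def f_fast(n):  # f''(n) = 1/sqrt(n), keeping only the leading term of f
    return (4.0 / 3.0) * n ** 1.5

for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(f_slow, N), partial_sum(f_fast, N))
# The f_slow sums keep creeping upward (they grow like ln(ln N), without
# bound); the f_fast sums level off near a finite limit.
```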

I looked at many examples of $\sum_{n=1}^{+\infty}\frac{1}{f(n)}$ with $f(n)$ ranging from logarithmic to exponential growth, for example: $n^2, n^{1.1}, 2^n, \ln n, n\ln n, \sqrt{n}, \frac{\sqrt{n}}{\ln n}$, etc., and could never disprove my criterion.

I am not a mathematician (I only have a minor in physics), and I don't know whether what I have found is already a well-known thing, but I couldn't find any convergence criterion similar to the one described above. So I am asking the math community either to disprove my criterion by exhibiting an $f(n)$ that doesn't obey it, or to prove it, or to give any input on why it looks right or wrong.

  • For the superlinear case, consider the generalized Bertrand series (look it up in this article). For the simplest case, $f(n)=n\log n$, with "second derivative" $1/n>0$, the series is divergent. Note that $f$ is not even supposed to be continuous, it only has to be defined on integers. – Jean-Claude Arbaut Jan 16 '24 at 18:00
  • I think it is pretty well known that $1 \over n^{1+ \epsilon}$ converges. – Older Amateur Jan 16 '24 at 18:03
  • @Jean-Claude Arbaut – you have misread my question: the second derivative of $n$, not $1/n$ – that's where your mistake is, isn't it? – user1390208 Jan 16 '24 at 18:05
  • @user1390208 You misread my comment then. The series $\displaystyle{\sum_n \frac{1}{n\log n}}$ is divergent, while the function $f(n)=n \log n$ has second derivative $f''(n)=1/n>0$. – Jean-Claude Arbaut Jan 16 '24 at 18:06
  • @Jean-Claude Arbaut – I did misread your comment; you are right. And yes, you have disproved my criterion! – user1390208 Jan 16 '24 at 18:28
  • @Jean-Claude Arbaut – So yes, $n\log n$ disproves my criterion; I thought I had tested this function, but it appears I had not. But would you agree that my criterion is in the right direction? What if we say that the series converges when $f''(n) > 1/n$? Or, at least, if $f(n)$ grows faster than the denominator of the Bertrand series? – user1390208 Jan 16 '24 at 18:58
  • You should really have a look at the generalized Bertrand series. Can you prove all cases (with $\alpha=1$) have ultimately $f''>0$? You may as well have a look at other convergence criteria. Your approach is not necessarily bad, but you will have to find a bound on the growth of $f''$ ($f''=1$ works, $f''=1/n$ doesn't). And since it assumes $f$ is twice differentiable, the application of the resulting test is more limited than other ones. – Jean-Claude Arbaut Jan 16 '24 at 19:43
  • Note also that while the series diverges for $f(n)=n\log n$, it converges for $f(n)=n\log^a n$ for any $a>1$ (see the generalized Bertrand series for series that converge or diverge even more slowly). See if it can lead to some useful condition for arbitrary $f$. The test also has to be practical, if it only works in a few cases it may not compare well to other more general tests. – Jean-Claude Arbaut Jan 16 '24 at 19:47
  • @Jean-Claude Arbaut – Thank you very much! You helped me a lot and answered my question. I wasn't aware of the Cauchy condensation test at all! And yes, this test and the Bertrand series are exactly what I need to research. I am not a mathematician, however, so I do not possess sufficient instruments to prove things. From your last comment it does seem that the series will converge if $f''(n) > 1/n$. I will read further and try to understand. – user1390208 Jan 16 '24 at 20:01
  • A last comment: for a given sequence $u_n$, you can find an interpolating function such that $f(n)=u_n$, $f$ is twice differentiable (or $C^k$ for an arbitrary $k$), and $f''(n)=0$, or any arbitrary value (use piecewise polynomials to prove this, for instance). This means that if you have a condition on $f''(x)$, it must apply to all $x>x_0$ and not just to integer $x$ (otherwise the condition proves nothing at all). – Jean-Claude Arbaut Jan 16 '24 at 20:41
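The boundary the comments point to, the generalized Bertrand series with $f(n)=n\log^a n$ (divergent for $a=1$, convergent for $a>1$), can be illustrated numerically. This is a hedged sketch only, since partial sums alone cannot establish convergence or divergence:

```python
import math

def bertrand_partial_sum(a, N):
    """Partial sum of 1/(n * ln(n)**a) for n = 2..N."""
    return sum(1.0 / (n * math.log(n) ** a) for n in range(2, N + 1))

# a = 1: diverges, though only like ln(ln N), hence extremely slowly.
# a = 2: converges; the tail beyond N shrinks like 1/ln(N).
for N in (10**3, 10**4, 10**5, 10**6):
    print(N, round(bertrand_partial_sum(1, N), 3),
             round(bertrand_partial_sum(2, N), 3))
```

The $a=1$ column keeps growing at every scale of $N$, while the $a=2$ column visibly settles; the Cauchy condensation test turns both observations into proofs.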

1 Answer


It can be seen that a very simple convergence criterion follows from the above:

For any infinite series $\sum_{n=1}^{+\infty}\frac{1}{f(n)}$ where $f(n)$ is a monotonically increasing function: if $\lim\limits_{n\to\infty}\frac{f(n)}{n\ln n} = \infty$, then the series converges; otherwise the series diverges.

However, this criterion doesn't work either, because $\sum\limits_{n=3}^\infty \frac {1} {n\ln n\,\ln(\ln n)}$ diverges (the sum starts at $n=3$ so that $\ln\ln n>0$).
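A sketch of why this series diverges, using the Cauchy condensation test mentioned in the comments (for decreasing positive terms, $\sum a_n$ and $\sum 2^k a_{2^k}$ converge or diverge together):

```latex
\sum_{k\ge 2} 2^k\cdot\frac{1}{2^k\,\ln(2^k)\,\ln\ln(2^k)}
  \;=\;\sum_{k\ge 2}\frac{1}{k\ln 2\,\ln(k\ln 2)},
```

which, up to constants, has the same shape as $\sum\frac{1}{k\ln k}$; condensing once more yields a constant multiple of the harmonic series, which diverges.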

Therefore, I cannot find a "border" function, that is, the fastest-growing $f(n)$ for which the series still diverges. Such a function could serve as a reference: every $f(n)$ that grows faster than it would give a convergent series. I wish such a function existed.