
Recently I have come across a variety of truly wonderful results about series that resemble the harmonic series:

All of the series $$\sum_{n = 1}^{\infty} \frac{1}{n^{1+1/n}}, \ \sum_{n = 1}^{\infty} \frac{|\sin(n)|}{n}, \ \sum_{n = 1}^{\infty} \frac{1}{n^{2 - \epsilon + \sin(n)}}, \ \sum_{n = 1}^{\infty} \frac{1}{n^{1 + |\sin(n)|}} $$ diverge.

The proofs for the latter three mostly hinge on the fact that the sequence $(n)_{n \in \mathbb{N}}$ is equidistributed modulo $2\pi$, which controls how often $\sin(n)$ is close to $0$.

This leads me to ask the following question:

Let $(u_n)_{n \ge 1}$ be i.i.d. random variables with $u_n \sim \mathrm{Unif}([0,1])$. What is the probability that $$\sum_{n = 1}^{\infty} \frac{1}{n^{1+ u_n}}$$ diverges?

Similarly, we can ask the analogous question where the $u_n$ are drawn i.i.d. from an arbitrary distribution $X$. I do not have much knowledge in this area, so any helpful comments and directions are welcome.
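Not a proof of anything, but here is a quick numerical sketch (the helper `partial_sums`, the checkpoints, and the seed are just illustrative choices of mine) of how slowly the partial sums of $\sum_n n^{-(1+u_n)}$ grow in a typical sample:

```python
import random

def partial_sums(checkpoints, seed=0):
    """Record S_N = sum_{n<=N} 1 / n^(1 + u_n) at each checkpoint,
    with u_n drawn i.i.d. from Unif([0, 1])."""
    rng = random.Random(seed)
    out, s, n = [], 0.0, 0
    for N in sorted(checkpoints):
        while n < N:
            n += 1
            s += 1.0 / n ** (1.0 + rng.random())  # one term 1 / n^(1+u_n)
        out.append(s)
    return out

# The partial sums keep creeping upward; if the series diverges a.s.,
# the expected growth rate is roughly log log N, which is very slow.
print(partial_sums([10**3, 10**4, 10**5]))
```

Of course such an experiment cannot distinguish divergence from convergence to a nearby limit; it only illustrates the $\log\log N$-type growth suggested below.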

  • Can you attach the link showing those sums diverge please? – mathworker21 Mar 31 '17 at 18:35
  • @mathworker21 The very first one is because $1/n^{1+1/n} > 1/(2n)$. The second one is because either $|\sin(n)|$ or $|\sin(n+1)|$ is greater than some constant $C$. (I found it in the book Putnam and Beyond). The third one follows from the fact that $\sin(n)$ is dense mod $1$. The fourth one is a problem from the American Mathematical Monthly. (It is somewhere on this site also). – Sandeep Silwal Mar 31 '17 at 18:44
  • The fourth one is here: http://math.stackexchange.com/questions/270064/does-the-series-sum-limits-n-1-infty-frac1n1-sinn-conve –  Mar 31 '17 at 18:46
  • To make sense of the question: should $u_i$ be $u_n$, with $(u_n)_n$ a sequence of independent r.v.s? – Clement C. Mar 31 '17 at 18:47
  • @ClementC. Yes sorry, it is fixed now. – Sandeep Silwal Mar 31 '17 at 18:48
  • @SandeepSilwal, thanks! I assume for the third one, you mean $\sin(n)$ is uniformly distributed mod $1$. I don't see how that gets you what you want though. For example, there are divergent series that become convergent when you restrict the sum to a subset of the natural numbers of positive density. – mathworker21 Mar 31 '17 at 18:55
  • @mathworker21: Do you have any examples of that? For the third series, since $2-\epsilon$ is fixed, you can copy the proof from problem 5 here: http://www.math.illinois.edu/~ajh/putnam/problems/mock07-1sol.pdf – Sandeep Silwal Apr 01 '17 at 00:42
  • @SandeepSilwal, well there are trivial examples such as $\sum a_n$ where $a_n$ is $0$ for even $n$ and $1$ for odd $n$. A nontrivial example follows from here

    https://math.stackexchange.com/questions/2206466/construct-series

    But I am curious if $\sum \frac{1}{n^{1+u_n}}$ converges whenever $u_n$ is a u.d. sequence mod $1$. The proof you showed seems to use properties of $\sin$ specifically.

    – mathworker21 Apr 01 '17 at 01:29
  • @mathworker21: Ah yes. Hmm, I'm not really sure for a general u.d. sequence. I think the answer should be yes but I can't think of an argument that would work off the top of my head. – Sandeep Silwal Apr 01 '17 at 14:25
  • One thing I want to point out is that $\{\sin n\}$ is not equidistributed. $\{e^{in}\}$ is equidistributed, not $\{\sin n\}$. – Sungjin Kim Apr 01 '17 at 16:19

1 Answer


The random variable $\displaystyle {1\over n^{1+u_n}}$ has mean value $\int_0^1 {1\over n^{1+u}}\,du\approx {1\over n\log(n)}$. Since $\sum_{n=2}^\infty {1\over n\log(n)}=\infty$, the random sum also diverges to infinity with probability one. This follows, for example, from Kolmogorov's three-series theorem.
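To spell out the stated approximation for the mean (a one-line computation, valid for $n \ge 2$):
$$\int_0^1 \frac{du}{n^{1+u}} = \frac{1}{n}\int_0^1 e^{-u\log n}\,du = \frac{1}{n}\cdot\frac{1 - e^{-\log n}}{\log n} = \frac{1 - 1/n}{n\log n} \sim \frac{1}{n\log n}.$$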

By the three-series theorem, the random series either converges with probability $1$ or diverges with probability $1$. And since the summands are positive, your series must diverge to $+\infty$.

  • Ok, thank you! That is exactly the type of theorem that I was looking for. To be more clear, this violates the second condition of the theorem right? – Sandeep Silwal Mar 31 '17 at 18:51
  • https://en.wikipedia.org/wiki/Kolmogorov's_three-series_theorem

    This gives many necessary conditions for it to converge almost surely. One condition is violated. How do you know it diverges almost surely?

    – mathworker21 Mar 31 '17 at 18:51
  • @mathworker21 The random sum converges if and only if all three conditions hold for all positive $A$. We can take $A=2$ here and $X_n=Y_n={1\over n^{1+u_n}}.$ –  Mar 31 '17 at 18:54
  • You don't seem to understand the flaw in logic I am pointing out. It converges almost surely iff all 3 conditions hold. One condition was violated. Therefore, it's not the case that it converges almost surely. HOWEVER, you said it diverges almost surely. How did you make that leap? – mathworker21 Mar 31 '17 at 18:56
  • 1
    There's another result of Kolmogorov, known as his "zero-one law" (https://en.wikipedia.org/wiki/Kolmogorov%27s_zero%E2%80%93one_law ), that is relevant here. Essentially you can think of it as saying that if you have an event determined by a sequence of independent variables such that the event is independent of the behavior of any finite subset of those variables, then the event must have probability $0$ or $1$. It applies here since changing a finite number of terms in a sequence doesn't affect convergence or divergence. – Kevin P. Costello Mar 31 '17 at 19:08
  • @KevinCostello Thanks for that comment. I should have made that clearer in my post. –  Mar 31 '17 at 19:41