
Consider a sequence of positive numbers $(a_i)_{i \in \mathbb{N}}$ such that for some $\gamma > 0$ we have $$ \sum_{i=1}^{\infty} a_i^2 i^{2\gamma} < \infty. $$ I was wondering how the tail sum $\sum_{i=n}^{\infty} a_i^2$ behaves asymptotically as $n \to \infty$.

As a heuristic, consider $a_i = i^{-\gamma - \frac{1}{2} - \epsilon}$ for some $\epsilon > 0$. We know this satisfies the assumption and you can show that $\sum_{i=n}^{\infty} a_i^2 \approx n^{-2\gamma - 2\epsilon}$ by comparing it to $\int_n^{\infty} x^{-2\gamma - 1 - 2\epsilon} dx$. So intuitively I would think that for general $(a_i)$, the tail sum can be bounded by $c\ n^{-2\gamma}$ for some $c > 0$.
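As a quick numerical sanity check of this heuristic (a Python sketch; the values of $\gamma$, $\epsilon$ and the truncation point $N$ are my own choices), the ratio of the tail sum to $n^{-2\gamma - 2\epsilon}$ should stabilize near the constant $1/(2\gamma + 2\epsilon)$ coming from the integral comparison:

```python
# Model sequence a_i = i^(-gamma - 1/2 - eps); the tail sum_{i>=n} a_i^2
# should scale like n^(-2*gamma - 2*eps).
gamma, eps, N = 1.0, 0.25, 500_000  # N truncates the infinite tail

a_sq = [i ** (-2 * gamma - 1 - 2 * eps) for i in range(1, N + 1)]

# suffix sums: tail[n] approximates sum_{i=n}^{infinity} a_i^2
tail = [0.0] * (N + 2)
for i in range(N, 0, -1):
    tail[i] = tail[i + 1] + a_sq[i - 1]

for n in (100, 1000, 10000):
    # ratio should hover near 1/(2*gamma + 2*eps) = 0.4
    print(n, tail[n] / n ** (-2 * gamma - 2 * eps))
```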

Just using Cauchy–Schwarz gives us $$ \sum_{i = n}^{\infty} a_i^2 = \sum_{i=n}^{\infty} a_i^2 i^{\gamma} i^{-\gamma} \leq \sqrt{\sum_{i=n}^{\infty} a_i^4 i^{2\gamma} \sum_{i=n}^{\infty} i^{-2\gamma}}. $$ We know the first sum is finite (since $a_i \to 0$, so $a_i^4 i^{2\gamma} \leq C\, a_i^2 i^{2\gamma}$), and the second sum behaves as $\sum_{i=n}^{\infty} i ^{-2\gamma} \approx n^{-2\gamma + 1}$ (assuming $2\gamma > 1$), again using the integral trick. Therefore this only bounds the tail by a constant times $n^{-\gamma + 1/2}$. This doesn't seem to be the best possible.
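To see how far from sharp this first bound is, here is a hedged numerical comparison for the model sequence $a_i = i^{-\gamma - 1/2 - \epsilon}$ from above (the parameter values and truncation $N$ are assumptions for illustration): the constant $C = \sqrt{\sum_i a_i^4 i^{2\gamma}}$ times $\sqrt{\sum_{i \geq n} i^{-2\gamma}}$ overshoots the true tail by a factor growing polynomially in $n$.

```python
import math

# Compare the crude Cauchy-Schwarz bound with the actual tail for the
# model sequence a_i = i^(-gamma - 1/2 - eps).
gamma, eps, N = 1.0, 0.25, 200_000
a = lambda i: i ** (-gamma - 0.5 - eps)

# constant factor: sqrt of the full (convergent) sum a_i^4 i^(2*gamma)
C = math.sqrt(sum(a(i) ** 4 * i ** (2 * gamma) for i in range(1, N + 1)))

for n in (100, 1000, 10000):
    true_tail = sum(a(i) ** 2 for i in range(n, N + 1))
    crude = C * math.sqrt(sum(i ** (-2.0 * gamma) for i in range(n, N + 1)))
    # crude ~ n^(-gamma + 1/2) while true_tail ~ n^(-2*gamma - 2*eps)
    print(n, true_tail, crude)
```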

You could also write $$ \sum_{i=n}^{\infty} a_i^2 = \sum_{i=n}^{\infty} a_i i^{\gamma} a_i i^{-\gamma} \leq \sqrt{\sum_{i=n}^{\infty} a_i^2 i^{2\gamma} \sum_{i=n}^{\infty} a_i^2 i^{-2\gamma}}. $$ Again, the first term is finite, and it seems that the second sum should have the desired asymptotic decay (when considering it with $a_i = i^{-\gamma - \frac{1}{2} - \epsilon}$ for example).

However, I don't see how to show this last part in general. What am I missing?

2 Answers


Your tail sum satisfies $$\lim_{n \rightarrow \infty }(\sum_{i=n}^{\infty}a_{i}^2)n^{2\gamma} = 0$$

To prove this, set $b_{i} = a_{i}^2$ and $\beta = 2\gamma$.

Note that

$$\sum_{j=n}^{m} b_{j} = \sum_{j=n}^{m}b_{j}j^{\beta}j^{-\beta}. $$

Set $$B(x) = \sum_{0 \leq k \leq x} b_{k}k^{\beta}$$

with $b_{0} = 0$. By Abel's summation formula (https://en.wikipedia.org/wiki/Abel%27s_summation_formula) we have

$$\sum_{j=n}^{m}b_{j}j^{\beta}j^{-\beta} = B(m)m^{-\beta} - B(n)n^{-\beta} - \int_{n}^{m} B(x)(-\beta x^{-\beta-1})\,dx.$$

We can take $m \rightarrow \infty$ (the term $B(m)m^{-\beta}$ vanishes, since $B(m)$ converges and $m^{-\beta} \to 0$) to get

$$\sum_{j=n}^{\infty}b_{j} = -B(n)n^{-\beta}+\int_{n}^{\infty} B(x)\,\beta x^{-\beta-1}\,dx$$ $$=\int_{n}^{\infty} \beta\, (B(x)-B(n))\, x^{-\beta -1}\,dx$$

$$= o(n^{-\beta}),$$

since $0 \leq B(x)-B(n) \leq B(\infty)-B(n)$ for $x \geq n$, $\int_{n}^{\infty}\beta x^{-\beta-1}\,dx = n^{-\beta}$, and $B(\infty)-B(n) \to 0$ as $n \to \infty$.
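A hedged numerical illustration of this conclusion (the specific model sequence and truncation point are my own choices, not part of the answer): take $b_i i^{\beta} = 1/(i \log^2 i)$, which is summable, so $b_i = i^{-\beta-1}/\log^2 i$; then $n^{\beta} \sum_{i \geq n} b_i$ should decay to $0$, roughly like $1/(\beta \log^2 n)$.

```python
import math

# Model choice: b_i = i^(-beta - 1) / log(i)^2, so that sum b_i * i^beta < infinity.
beta, N = 2.0, 1_000_000
b = [0.0, 0.0] + [i ** (-beta - 1) / math.log(i) ** 2 for i in range(2, N + 1)]

# suffix sums: tail[n] approximates sum_{i=n}^{infinity} b_i
tail = [0.0] * (N + 2)
for i in range(N, 1, -1):
    tail[i] = tail[i + 1] + b[i]

for n in (100, 1000, 10000, 100000):
    # slowly decreasing toward 0, roughly 1/(beta * log(n)^2)
    print(n, n ** beta * tail[n])
```

Note the decay here is only logarithmic, which is consistent with the $o(n^{-\beta})$ statement: the little-$o$ gain over $n^{-\beta}$ can be arbitrarily slow.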


If I'm not mistaken, you won't get an answer to the question of whether there is a best bound. There is a theorem that states:

For every positive sequence $a_n$ with $\sum_n a_n < \infty$, there exists a sequence $A_n$ much larger than $a_n$ in the sense that $$\lim_{n\rightarrow \infty} \frac{A_n}{a_n} = \infty$$ with $\sum_n A_n < \infty$.

See also: $\sum_n a_n < \infty \quad \Leftrightarrow \quad \lim_{n\rightarrow \infty} \phi(n)a_n = \text{const.}$

What this implies is that there is no fixed boundary between convergent and divergent series. So in your specific case, if you could write down an asymptotic expression for the tail sum $\sum_{i=n}^\infty a_i^2$, then you could also write down an asymptotic expression for $\sum_{i=n}^\infty \left(a_i i^\gamma\right)^2$, which by this theorem you obviously cannot, given solely your assumption on the convergence of $\sum_{i=1}^\infty \left( a_i i^\gamma \right)^2$.

For every specific $a_i$ that satisfies your assumption you can evidently find an asymptotic expansion, but how does that help? Your question is equivalent to asking: if $\sum_{i=1}^\infty a_i < \infty$ for $a_i>0$, what does $a_i$ look like?

You can however find (loose) bounds. For example, since the terms of a convergent series are bounded, we must have $(a_nn^\gamma)^2<c$ for some constant $c$, i.e. (for $\gamma > 1/2$) $$\sum_{i=n}^\infty a_i^2 < c\sum_{i=n}^\infty \frac{1}{i^{2\gamma}} = O(1/n^{2\gamma-1}) \, .$$

Or with a bit more effort using summation by parts $$\sum_{i=n}^N a_i^2=\sum_{i=n}^N a_i^2 i^{2\gamma} i^{-2\gamma}=N^{-2\gamma}\sum_{i=n}^N a_i^2 i^{2\gamma} + \sum_{i=n}^{N-1} \left(\frac{1}{i^{2\gamma}}-\frac{1}{(i+1)^{2\gamma}}\right) \sum_{j=n}^{i} a_j^2 j^{2\gamma} \, .$$ Since $\sum_{i=1}^\infty a_i^2 i^{2\gamma} < \infty$, we know that the first term vanishes in the limit $N\rightarrow \infty$. Hence $$\sum_{i=n}^\infty a_i^2 = \sum_{i=n}^{\infty} \left(\frac{1}{i^{2\gamma}}-\frac{1}{(i+1)^{2\gamma}}\right) \sum_{j=n}^{i} a_j^2 j^{2\gamma} \\ \leq \sum_{j=n}^{\infty} a_j^2 j^{2\gamma} \sum_{i=n}^{\infty} \left(\frac{1}{i^{2\gamma}}-\frac{1}{(i+1)^{2\gamma}}\right) = n^{-2\gamma}\sum_{j=n}^{\infty} a_j^2 j^{2\gamma} \, ,$$ which is in fact $o(n^{-2\gamma})$ (better than $c\, n^{-2\gamma}$), since the tail $\sum_{j=n}^{\infty} a_j^2 j^{2\gamma}$ tends to zero as $n \to \infty$.
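A quick numerical check of this last bound (a sketch; the model choice $a_i^2 i^{2\gamma} = i^{-1.2}$ and the truncation $N$ are assumptions for illustration, not part of the answer): the inequality $\sum_{i \geq n} a_i^2 \leq n^{-2\gamma} \sum_{j \geq n} a_j^2 j^{2\gamma}$ should hold for every $n$, and in fact holds term by term since $i^{-2\gamma} \leq n^{-2\gamma}$ for $i \geq n$.

```python
# Model choice: a_i^2 * i^(2*gamma) = i^(-1.2), which is summable.
gamma, N = 0.75, 200_000
a_sq = [i ** (-2 * gamma - 1.2) for i in range(1, N + 1)]

for n in (10, 100, 1000):
    lhs = sum(a_sq[i - 1] for i in range(n, N + 1))
    rhs = n ** (-2 * gamma) * sum(a_sq[i - 1] * i ** (2 * gamma) for i in range(n, N + 1))
    print(n, lhs, rhs)  # lhs <= rhs in every case
```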

Diger
  • Why should we have $(a_n n^\gamma)^2 <c/n$? Let $\alpha_k = \frac{1}{\log k}$ if $k=2^{n^2}$ and $\alpha_k = 1/k^2$ otherwise. Then $\sum_{k=1}^\infty \alpha_k$ converges, yet there is no $c>0$ such that $\alpha_k<c/k$ for all $k$. – Gary May 25 '22 at 08:00
  • Yeah, you are right. Didn't think of this, I edited. – Diger May 25 '22 at 10:57