
The text below is from the book Introduction to Analytic Number Theory by Apostol:

[image omitted: an excerpt from the book containing Theorem 3.16, Eq. (22), and Eq. (23)]

I have two questions for which I couldn't find solutions:

$1-$ According to Thm. 3.16, $\sum_{n\le x} \Lambda(n) \Big[ \dfrac{x}{n} \Big] = \sum_{p\le x} \ln(p) \Big[ \dfrac{x}{p} \Big] + \sum_{p\le x} \sum_{m=2}^{\infty} \ln(p) \Big[ \dfrac{x}{p^m} \Big] = \sum_{p\le x} \ln(p) \Big[ \dfrac{x}{p} \Big] + O(x),$ so by Eq. (22) we have $x \ln(x) - x + O(\ln (x)) = \sum_{p\le x} \ln(p) \Big[ \dfrac{x}{p} \Big] + O(x),$ hence $x \ln(x) + O(\ln (x)) + O(x) = \sum_{p\le x} \ln(p) \Big[ \dfrac{x}{p} \Big].$ So if Eq. (23) is true, it implies that the book has taken $O(\ln (x)) + O(x) = O(x)$; but that is not true, since $O(\ln (x)) + O(x) = O(\ln (x)).$ Am I wrong?

$2-$ How to prove that the sum $\sum_{n=2}^{\infty} \dfrac{\ln(n)}{n(n-1)}$ is finite?
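The sums in part 1 can be sanity-checked numerically. Below is a minimal Python sketch (the cutoff `x = 10_000` and the error bound are arbitrary choices, not from the book) comparing the main prime sum plus the prime-power tail against the asymptotic $x\ln x - x + O(\ln x)$ from Eq. (22):

```python
import math

def primes_upto(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p in range(2, n + 1) if sieve[p]]

x = 10_000
# Main term: sum over primes p <= x of ln(p) * [x/p]
main = sum(math.log(p) * (x // p) for p in primes_upto(x))
# Tail: prime powers p^m with m >= 2 (only p <= sqrt(x) contribute)
tail = 0.0
for p in primes_upto(int(x**0.5)):
    pm = p * p
    while pm <= x:
        tail += math.log(p) * (x // pm)
        pm *= p
# Thm 3.16 with Eq. (22): main + tail = x ln x - x + O(ln x),
# and the tail alone is O(x).
print(main + tail - (x * math.log(x) - x))   # small (of order ln x)
print(tail / x)                              # bounded (tail is O(x))
```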

  • A trick for part 2 is to note that $\ln(1+x)\leqslant x$ yields $$\sum_{n\geqslant2}\frac{\ln n}{n(n-1)}=\sum_{n\geqslant2}\frac{\ln n}{n-1}-\frac{\ln n}{n}=\sum_{n\geqslant1}\frac{\ln(n+1)-\ln n}{n}=\sum_{n\geqslant1}\frac{\ln(1+1/n)}{n}\leqslant\sum_{n\geqslant1}\frac{1}{n^2}<+\infty.$$ – Did Jun 16 '16 at 07:03
  • @Did, $\ln(1+x)-x=-\frac{x^2}{2}+\frac{x^3}{3}-\frac{x^4}{4}+\cdots$; it is $\le 0$ if the terms on the r.h.s. are compared/summed in pairs, i.e. $\{-\frac{x^2}{2},\frac{x^3}{3}\}, \dots$, but this argument is not rigorous, since the infinite sum could be grouped in different ways! Is there a better proof of why $\ln(1+x)\leqslant x$? –  Jun 16 '16 at 07:15
  • http://math.stackexchange.com/questions/1589429/how-to-prove-that-logxx-when-x1 – Marco Cantarini Jun 16 '16 at 07:48
  • @MarcoCantarini, thanks a lot :) –  Jun 16 '16 at 08:27
  • @Liebe Sure, compare the derivatives (or use the convexity if you have the heart of a geometer). – Did Jun 16 '16 at 08:56
  • $O(\log x)+O(x)=O(x)$ is true. – W. Wongcharoenbhorn Oct 08 '20 at 01:18
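The telescoping trick in the first comment is easy to check numerically; a minimal Python sketch (the cutoff `N` is an arbitrary choice), comparing the original series, the telescoped form, and the bound $\sum 1/n^2 = \pi^2/6$:

```python
import math

N = 100_000
# Partial sums of the original series and of the telescoped form;
# they agree up to a tiny tail, and both stay below pi^2/6.
orig = sum(math.log(n) / (n * (n - 1)) for n in range(2, N + 1))
tele = sum(math.log(1 + 1 / n) / n for n in range(1, N + 1))
print(orig, tele, math.pi**2 / 6)
```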

3 Answers

  1. No, it is not true that $O(\ln x) + O(x) = O(\ln x)$. For instance, $x$ is $O(\ln x) + O(x)$, but $x \gg \ln x$ and so $x$ is not $O(\ln x)$.

  2. As $\ln x \ll x^{\alpha}$ for any $\alpha > 0$, we have $$ \sum_{n \geq 2} \frac{\ln n}{n(n-1)} \ll \sum_{n \geq 2} \frac{n^{\alpha}}{n(n-1)} \sim \sum_{n \geq 2} n^{\alpha - 2}$$ for any small $\alpha > 0$, and the last series converges for $\alpha < 1$. Intuitively, logs are so much smaller than polynomials that they barely affect convergence.

  • Surely I would not teach my students that $u_n\ll v_n$ implies that $\sum\limits_{n\geqslant2}u_n\ll\sum\limits_{n\geqslant2}v_n$... To begin with, $\sum\limits_{n\geqslant2}u_n$ and $\sum\limits_{n\geqslant2}v_n$ are simply two numbers and what is the meaning of $a\ll b$ for some given numbers $a$ and $b$? – Did Jun 16 '16 at 18:39

(i). For $x\geq 1$ we have $\ln x=\int_1^x (1/y)\;dy\leq \int_1^x (1)\;dy=x-1<x.$

(ii). From (i): if $x\geq 1$ and $k>1$ then $$k x^{1/k}>k(x^{1/k}-1)\geq k\ln x^{1/k}=\ln x.$$ Therefore $0\leq (\ln x)/x< k/x^{(k-1)/k}$ for $x\geq 1$, and we conclude that $\lim_{x\to \infty}(\ln x)/x=0.$ That is, $\ln x=o(x)$ as $x\to \infty.$

(iii). For $n\geq 2$ we have $n-1\geq n/2$, so $1/(n(n-1)) \leq 2/n^2.$ From (ii) with $k=2$ we have $0<\ln n< 2 n^{1/2}$ for all $n\geq 2.$ So for every $n\geq 2$, $$0<\frac{\ln n}{n(n-1)}<\frac{4}{n^{3/2}}.$$ And the Cauchy Condensation Test shows that $\sum_{n\geq 1}(1/n^p)<\infty$ for all $p>1.$
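The chain of bounds (i)-(iii) can be verified numerically over a large range; a minimal Python sketch (the cutoff `N` is an arbitrary choice):

```python
import math

N = 10**5
# (ii) with k = 2: ln n < 2 * sqrt(n) over the whole range tested
assert all(math.log(n) < 2 * math.sqrt(n) for n in range(1, N))
# (iii): termwise domination by the convergent p-series with p = 3/2
series = sum(math.log(n) / (n * (n - 1)) for n in range(2, N))
dominating = sum(4 / n**1.5 for n in range(2, N))
print(series, dominating)  # the first stays below the second
```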


You might want to review the definition of big O notation. Remember that $f(x)=O(g(x))$ if and only if $\left|\dfrac{f(x)}{g(x)}\right|$ is bounded for all $x\geqslant a$, for some constant $a$.

Now suppose $f_1(x)=O(\ln x)$ and $f_2(x)=O(x)$, so that $|f_1(x)|\leqslant M_1\ln x$ and $|f_2(x)|\leqslant M_2x$ for all large enough $x$. Since $\ln x\leqslant x$ for $x\geqslant 1$, $$|f_1(x)+f_2(x)|\leqslant M_1\ln x+M_2x\leqslant Mx,$$ where $M=M_1+M_2$, which, by definition, means $f_1(x)+f_2(x)=O(x)$; that is, $O(\ln x)+O(x)=O(x)$.
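The boundedness argument can also be illustrated numerically; a minimal Python sketch with hypothetical constants `M1`, `M2` (arbitrary choices, not from the answer):

```python
import math

# Hypothetical big-O constants for f1 = O(ln x) and f2 = O(x).
M1, M2 = 3.0, 5.0
# Since ln x <= x for x >= 1, the ratio (M1*ln x + M2*x)/x never
# exceeds M1 + M2, so f1 + f2 = O(x).
ratios = [(M1 * math.log(x) + M2 * x) / x for x in range(1, 10**5)]
print(max(ratios))  # stays below M1 + M2
```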
