It's a well-known fact that the number of primes among the integers $\{1,\ldots, n\}$ is $\Omega\left({n \over \log n}\right)$, but it's quite difficult to prove. Are there easier proofs of the weaker fact that for some $1 > \epsilon > 0$ the number of primes is $\Omega(n^{\epsilon})$?
-
Starting with Chebyshev, there are elementary proofs that $\pi(n) \gg n/\log n$... – Bart Michels Jan 19 '18 at 16:13
-
Are they easy, though? – Artur Riazanov Jan 19 '18 at 16:18
-
Consider for example lemmas 1 and 2 of the proof of Bertrand's postulate, which show that $4^n/2n\le \binom{2n}{n}\le (2n)^{\pi(2n)}$. From there you get $\Omega(n/\log n)$ if I'm not mistaken. – ArtW Jan 19 '18 at 17:02
-
Once one has the idea, a proof is easy. Having the idea took a Chebyshev, however. – Daniel Fischer Jan 20 '18 at 21:55
-
I'm not aware of an easy proof of anything between $\Omega(\log n)$ and the correct $\Theta\bigl(\frac{n}{\log n}\bigr)$. I'd be interested to see some. – Daniel Fischer Jan 20 '18 at 21:58
1 Answer
Strategy: express some known divergent series in terms of $\pi(x)$; then a too-small upper bound on $\pi(x)$ would force the series to converge, a contradiction.
Take $\sum\frac1p$. It diverges (Euler); more precisely, Mertens showed that $\sum_{p\leq x}\frac1p=\log\log x+O(1)$, and the (elegant) proof does not use Chebyshev's estimates, only summation by parts. The method below still gives some results when we only assume a weaker form of Mertens' theorem; see the bottom of this answer.
Summation by parts gives $$\sum_{p\leq x}\frac1p = \frac{\pi(x)}x+ \int_1^x\frac{\pi(t)}{t^2}dt$$ and hence an equivalent form of Mertens' estimate: $$\int_1^x\frac{\pi(t)}{t^2}dt=\log\log x+O(1)$$
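As a sanity check (not part of the proof), the identity can be verified numerically. The snippet below, written for this answer with a basic sieve, evaluates the integral exactly by exploiting the fact that $\pi(t)$ is a step function:

```python
# Sanity check of the partial-summation identity
#   sum_{p <= x} 1/p = pi(x)/x + int_1^x pi(t)/t^2 dt.
# Since pi(t) is a step function, the integral can be computed exactly.

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [i for i, b in enumerate(sieve) if b]

def both_sides(x):
    ps = primes_up_to(x)
    lhs = sum(1.0 / p for p in ps)
    # pi(t) = k on [p_k, p_{k+1}), and int_a^b dt/t^2 = 1/a - 1/b,
    # so integrate piece by piece (the last piece runs up to x).
    integral = 0.0
    for k, p in enumerate(ps, start=1):
        upper = ps[k] if k < len(ps) else x
        integral += k * (1.0 / p - 1.0 / upper)
    rhs = len(ps) / x + integral
    return lhs, rhs

lhs, rhs = both_sides(10 ** 5)
assert abs(lhs - rhs) < 1e-9  # the two sides agree to floating-point accuracy
```

The two sides agree exactly (up to rounding), since both reduce to the same telescoping sum.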
Idea: Use this to deduce that at least one $t\in[1..x]$ has to satisfy $\pi(t)\geq t^\epsilon$ for some quantifiable $\epsilon$.
But this only gives a bound of the form $\pi(x) \geq \pi(t) \geq t^\epsilon$ (since $\pi$ is nondecreasing and $t\leq x$). We want $t$ to be relatively large (in particular, $t=1$ would tell us nothing), which is why we study the integral from $y$ to $x$ instead, for suitable $y \leq x$.
Let $y \leq x$; then $$\int_y^x\pi(t)t^{-2}dt = \log\log x-\log\log y + O(1)$$ If $\pi(t) \leq t^\epsilon$ for all $t\in[y..x]$, the LHS is $\ll \frac1{1-\epsilon}$. So there exists $t\in[y..x]$ with $$\pi(t) \gg_c t^{1-c/(\log\log x-\log\log y)}$$ for any fixed $c<1$ and $\log\log x-\log\log y \gg 1$, i.e. $y$ is at most some fixed power of $x$.
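For completeness, here is the computation behind "$\ll \frac1{1-\epsilon}$": if $\pi(t)\leq t^\epsilon$ on $[y..x]$ with $0<\epsilon<1$, then, since $y\geq 1$,
$$\int_y^x\pi(t)t^{-2}dt \;\leq\; \int_y^x t^{\epsilon-2}dt \;=\; \frac{y^{\epsilon-1}-x^{\epsilon-1}}{1-\epsilon} \;\leq\; \frac1{1-\epsilon}.$$
Comparing this with $\log\log x-\log\log y+O(1)$ and taking $\epsilon = 1-c/(\log\log x-\log\log y)$ yields the stated bound.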
Now take $y=x^\delta$ with any $\delta <1/e$, so that $1+1/\log \delta>0$; say $\delta = 0.2$ (this is about the optimal value). This gives $$\pi(x) \geq \pi(t) \gg y^{1-c/(\log\log x-\log\log y)} = x^{\delta+c\delta/\log\delta} \geq x^{0.075}$$ for $c<1$ close enough to $1$.
We conclude that $\epsilon = 0.075$ works.
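A quick numerical check (written for this answer) of the exponent arithmetic, and of the, admittedly very weak, resulting bound for small $x$:

```python
import math

# Exponent arithmetic: with delta = 0.2 and c < 1,
# delta + c*delta/log(delta) exceeds 0.075 (log(0.2) is about -1.609).
delta, c = 0.2, 0.999
exponent = delta + c * delta / math.log(delta)
assert exponent > 0.075

# pi(x) by trial division (slow, but fine for small x).
def pi(n):
    def is_prime(m):
        return m >= 2 and all(m % d for d in range(2, int(m ** 0.5) + 1))
    return sum(1 for k in range(2, n + 1) if is_prime(k))

# The bound pi(x) >> x^0.075 is extremely weak; it already holds
# with implied constant 1 for these values.
for x in (10, 100, 1000, 10 ** 4):
    assert pi(x) >= x ** 0.075
```

Of course this proves nothing; it only illustrates how much room the bound leaves.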
Using a weaker form of Mertens' theorem. Assuming only $\sum_{p\leq x} \frac1p \asymp \log\log x$, we can do the same with $\log y = (\log x)^\delta$ for some $\delta < 1$ and obtain $$\pi(x) \geq \exp((\log x)^\epsilon)$$ for some $0<\epsilon <1$.
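To spell out that computation: taking $\log y=(\log x)^\delta$ with $\delta$ small enough that $\int_y^x\pi(t)t^{-2}dt\gg\log\log x$, the same argument produces some $t\in[y..x]$ with
$$\pi(x)\geq\pi(t)\gg y^{1-c/\log\log x}=\exp\Bigl((\log x)^\delta\bigl(1-\tfrac{c}{\log\log x}\bigr)\Bigr)\geq \exp\bigl((\log x)^{\epsilon}\bigr)$$
for a suitable fixed $c$, any fixed $\epsilon<\delta$, and $x$ large enough.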
However, I know of no proof of the nontrivial upper bound $\sum_{p\leq x}\frac1p\ll \log\log x$ (the lower bound is due to Euler) apart from Mertens' theorem.
