
Sorry if this question has already been asked, but it's a little difficult to look things up in Google if the statement of the problem is not very simple and involves symbols that Google doesn't recognize.

The question I have regards the zeta function. If $z_n$ is the sequence of non-trivial zeros of the zeta function with positive imaginary part, sorted by ascending imaginary part, what is the limit of $\Im(z_n)$ as $n$ goes to infinity?

Does this explode out to infinity or is it finite?

Asking for a friend (paper here). He has derived a new super simple equation whose solution is equivalent to the Riemann hypothesis.

2 Answers


The Riemann–von Mangoldt formula asserts that the number of non-trivial zeroes with imaginary part in $(0, T]$ is asymptotically

$$\frac{T}{2\pi} \log \frac{T}{2\pi} - \frac{T}{2\pi} + O(\log T)$$

from which it follows that $\text{Im}(z_n)$ grows something like $\frac{2 \pi n}{\log n} \left( 1 + \frac{\log \log n}{\log n} \right)$, but I haven't been too careful about that calculation.

Large tables of zeros are available to double-check this asymptotic against; for example, the millionth zero has imaginary part $\approx 600269$ whereas the asymptotic above gives $\approx 541230$, so it's a bit of an underestimate.
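If you want to redo that comparison yourself, here is a minimal Python sketch; the reference value $600269.677$ for the imaginary part of the millionth zero is assumed from the published tables rather than computed:

```python
import math

n = 10**6
tabulated = 600269.677  # imaginary part of the millionth zero (assumed from published tables)

# crude asymptotic: Im(z_n) ~ (2*pi*n / log n) * (1 + log log n / log n)
log_n = math.log(n)
crude = 2 * math.pi * n / log_n * (1 + math.log(log_n) / log_n)

print(f"crude estimate : {crude:12,.0f}")    # roughly 541,000, an underestimate
print(f"tabulated value: {tabulated:12,.0f}")
```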

Working a bit more carefully, write $\text{Im}(z_n) = \frac{2 \pi n}{\log n} \left( 1 + e_n \right)$, where $e_n \to 0$ (slowly). Then to match the asymptotic above we need

$$\frac{n}{\log n} (1 + e_n) \log \left( \frac{n}{\log n} (1 + e_n) \right) - \frac{n}{\log n} (1 + e_n) = n + O(\log n).$$

Dividing by $\frac{n}{\log n}$, expanding out, and canceling the dominant term from both sides gives, after some simplification,

$$e_n \log n + (1 + e_n) \log (1 + e_n) - (1 + e_n) \log \log n - (1 + e_n) = O \left( \frac{(\log n)^2}{n} \right).$$

In order for the LHS to have limit $0$ as $n \to \infty$ we see that we need $e_n \approx \frac{\log \log n + 1}{\log n}$. This is already a noticeable improvement; it improves the estimate of the imaginary part of the millionth zero to $\approx 574149$. To do better than this we'll estimate

$$\log (1 + e_n) = e_n + O(e_n^2)$$

(keeping in mind that $O(e_n^2)$ is $O \left( \left( \frac{\log \log n}{\log n} \right)^2 \right)$ which is quite a bit slower than $O \left( \frac{(\log n)^2}{n} \right)$ so this is not best possible), which means the LHS becomes, after some simplification,

$$\left( e_n \log n - \log \log n - 1 \right) - e_n \log \log n + O(e_n^2)$$

so we can improve our estimate some more to $e_n \approx \frac{\log \log n + 1}{\log n - \log \log n}$. This is again a noticeable improvement; now the estimate for the imaginary part of the millionth zero is $\approx 602157$. We have two digits of accuracy now! Altogether, then,

$$\boxed{ \text{Im}(z_n) \approx \frac{2 \pi n}{\log n} \left( 1 + \frac{\log \log n + 1}{\log n - \log \log n} \right) }$$

and with a little more effort one could give a big-$O$ description of the error in this approximation but I'll stop here.
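As a numerical sanity check (not part of the argument above), here is a short Python sketch evaluating the two refinements at $n = 10^6$; the tabulated value of the millionth zero is again an assumed reference value, not computed:

```python
import math

n = 10**6
tabulated = 600269.677          # millionth zero, assumed from published tables

log_n = math.log(n)
loglog_n = math.log(log_n)
base = 2 * math.pi * n / log_n  # leading term 2*pi*n / log n

# first refinement: e_n ~ (log log n + 1) / log n
first = base * (1 + (loglog_n + 1) / log_n)
# boxed refinement: e_n ~ (log log n + 1) / (log n - log log n)
boxed = base * (1 + (loglog_n + 1) / (log_n - loglog_n))

print(f"first refinement: {first:12,.0f}")   # roughly 574,000
print(f"boxed formula   : {boxed:12,.0f}")   # roughly 602,000
print(f"tabulated value : {tabulated:12,.0f}")
```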

Qiaochu Yuan

This just reports some (old) empirical results.

Many years ago, in my research group, the same question came up, and one of my Ph.D. students developed a simple empirical correlation $(R^2=0.999991)$ $$\log \left(\Im\left(\rho _{2^k}\right)\right)\sim a+b \,k^c$$

For $1 \leq k \leq 23$, this gave $$\begin{array}{cccc} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ a & 2.72774 & 0.02399 & \{2.67752,\,2.77795\} \\ b & 0.27581 & 0.00566 & \{0.26396,\,0.28767\} \\ c & 1.21848 & 0.00627 & \{1.20535,\,1.23161\} \\ \end{array}$$

from which the estimate of the imaginary part of the millionth zero is $ 595894$ instead of $600270$.

$$\left( \begin{array}{ccc} n & \text{estimate} & \Im\left(\rho _{10^n}\right) \\ 1 & 50.3377 & 49.7738 \\ 2 & 244.508 & 236.524 \\ 3 & 1436.66 & 1419.42 \\ 4 & 9672.79 & 9877.78 \\ 5 & 72559.8 & 74920.8 \\ 6 & 595894. & 600270. \\ 7 & 5292950 & 4992381 \end{array} \right)$$
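For reference, here is a sketch of how the table above can be reproduced; I am assuming the correlation is simply evaluated at the (generally non-integer) exponent $k=\log_2 n$ with the fitted constants quoted above, so small rounding differences are expected:

```python
import math

# fitted constants of the regression log(Im(rho_{2^k})) ~ a + b * k**c
a, b, c = 2.72774, 0.27581, 1.21848

def empirical_im(n: float) -> float:
    """Empirical estimate of Im(rho_n), assuming the fit is evaluated at k = log2(n)."""
    k = math.log2(n)
    return math.exp(a + b * k**c)

for m in range(1, 8):
    print(f"n = 10^{m}: estimate ~ {empirical_im(10**m):12,.1f}")
```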

Edit

Using @Qiaochu Yuan's answer, we can invert

$$\frac{T}{2\pi} \log \frac{T}{2\pi} - \frac{T}{2\pi} + O(\log T)$$ and get $$\Im\left(\rho _{n}\right)\sim \frac{2 \pi n}{W\left(\frac{n}{e}\right)},$$ where $W(\cdot)$ is the Lambert $W$ function.

Using its usual series expansion, $$\Im\left(\rho _{n}\right)\sim \frac{2 \pi n}{L_1-L_2+\frac{L_2} {L_1}+\frac{L_2(L_2-2)}{2L_1^2}+\cdots },$$ where $L_1=\log(n)-1$ and $L_2=\log(L_1)$. For $n=10^6$, this gives $\approx 600219$.
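A quick numerical check of both expressions (this sketch assumes scipy is available; scipy.special.lambertw evaluates the principal branch of $W$):

```python
import math
from scipy.special import lambertw

n = 10**6

# exact inversion: Im(rho_n) ~ 2*pi*n / W(n/e)
W_exact = lambertw(n / math.e).real
exact = 2 * math.pi * n / W_exact

# truncated series for W with L1 = log(n) - 1, L2 = log(L1)
L1 = math.log(n) - 1
L2 = math.log(L1)
W_series = L1 - L2 + L2 / L1 + L2 * (L2 - 2) / (2 * L1**2)
series = 2 * math.pi * n / W_series

print(f"2*pi*n / W(n/e)       ~ {exact:12,.1f}")
print(f"with truncated series ~ {series:12,.1f}")   # roughly 600,219
```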

If you look at the paper by G. Franca and A. LeClair, equation $(163)$ gives the sharp bounds $$\frac{2 \pi \left(n-\frac{7}{8}\right)}{W\left(\frac{n-\frac{7}{8}}{e}\right)} \leq \Im\left(\rho _{n}\right) \leq \frac{2 \pi \left(n-\frac{3}{8}\right)}{W\left(\frac{n-\frac{3}{8}}{e}\right)}.$$
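And here is a sketch evaluating those bounds for the millionth zero (same scipy-based Lambert $W$ as above); the tabulated value $\approx 600269.7$ should fall between them:

```python
import math
from scipy.special import lambertw

def lw_estimate(m: float) -> float:
    """Evaluate 2*pi*m / W(m/e), the expression appearing in the quoted bounds."""
    return 2 * math.pi * m / lambertw(m / math.e).real

n = 10**6
lower = lw_estimate(n - 7/8)
upper = lw_estimate(n - 3/8)

print(f"lower bound: {lower:,.3f}")
print(f"upper bound: {upper:,.3f}")
# compare with the tabulated value ~600269.677
```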