
In a problem from physics, I have to deal with this apparently simple function $$I_n=n! \int_1^\infty \frac {dx}{\prod_{i=0}^n (x+i)}$$ for $n\geq 1$. The result is, for sure, of the form $\log\left(\frac {p_n}{q_n}\right)$, where $p_n,q_n$ are whole numbers which become very large even for small values of $n$ (e.g. $p_5=67108864$, $q_5=61509375$).

For any particular $n$, the value of $I_n$ can be computed exactly (by partial fraction decomposition), but I need accurate (or, better, exact) results for large values of $n$.
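For illustration, the partial fraction decomposition $\frac{n!}{\prod_{i=0}^n(x+i)}=\sum_{k=0}^n\frac{(-1)^k\binom{n}{k}}{x+k}$ gives $I_n=\log\prod_{k=0}^n(1+k)^{(-1)^{k+1}\binom{n}{k}}$, so $p_n,q_n$ can be obtained with exact integer arithmetic. A minimal sketch in Python:

```python
from fractions import Fraction
from math import comb

def pq(n):
    # I_n = log(p_n / q_n) with p_n/q_n = prod_{k=0}^n (1+k)^((-1)^(k+1) C(n,k))
    r = Fraction(1)
    for k in range(n + 1):
        r *= Fraction(k + 1) ** ((-1) ** (k + 1) * comb(n, k))
    return r.numerator, r.denominator

print(pq(5))  # (67108864, 61509375)
```

(Only practical for small $n$: the exponents grow like $\binom{n}{\lfloor n/2\rfloor}$, so $p_n,q_n$ quickly acquire astronomically many digits.)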

In the past, someone proposed as a first approximation $$I_n\approx\frac{0.694}{n^{1.285}}$$ which I have been able to reproduce almost exactly by curve fitting the values for $1\leq n \leq 20$. But this is too inaccurate for my application: for example, the above formula gives $I_{100}\approx 0.001868$, whereas the exact value is $\approx 0.002011$.

Would someone have an idea for either an exact solution (probably a recurrence relation?) or a much better approximation?

Thanks in advance.

Edit

Thanks to heropup's exact solution and Winther's work on the asymptotics $$I_n = \frac{1}{n\log(n)}\left[1 - \frac{\gamma}{\log(n)} + \frac{\gamma^2 + \frac{\pi^2}{6}}{\log^2(n)} + \frac{\psi ^{(2)}(1)-\gamma ^3-\frac{\gamma \pi ^2}{2}}{\log^3(n)} + \cdots \right] $$ incredible progress has been made.

I give below a few numbers for illustration $$\left( \begin{array}{ccc} n & \text{exact} & \text{approx} \\ 100 & 0.00201125049 & 0.00198077851 \\ 200 & 0.00088234568 & 0.00087283768 \\ 300 & 0.00054857248 & 0.00054365188 \\ 400 & 0.00039259469 & 0.00038949113 \\ 500 & 0.00030329103 & 0.00030111383 \\ 600 & 0.00024583983 & 0.00024420737 \\ 700 & 0.00020596177 & 0.00020468064 \\ 800 & 0.00017675824 & 0.00017571890 \\ 900 & 0.00015450281 & 0.00015363806 \\ 1000 & 0.00013701199 & 0.00013627810 \end{array} \right)$$

Using the exact values for $100 \leq n \leq 1500$ (step $\Delta n=50$) and performing a linear regression based on the suggested model $$I_n\approx \frac{1}{n\log(n)} \sum_{i=0}^3\frac {a_i}{\log^i(n)}$$ leads to $$\begin{array}{cccc} \text{} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ a_0 & +0.99283 & 0.0005 & \{+0.992,+0.994\} \\ a_1 & -0.48997 & 0.0076 & \{-0.506,-0.474\} \\ a_2 & +1.82615 & 0.0418 & \{+1.740,+1.912\} \\ a_3 & -4.52490 & 0.0755 & \{-4.681,-4.369\} \\ \end{array}$$ giving a maximum absolute error $<10^{-8}$. This is already an incredibly good fit.
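The fitted model is cheap to evaluate; a sketch using the quoted estimates (rounded to the five digits shown above, so agreement is limited accordingly):

```python
from math import log

def I_fit(n):
    # regression coefficients a_0..a_3 as quoted above
    a = (0.99283, -0.48997, 1.82615, -4.52490)
    L = log(n)
    return sum(a[i] / L ** i for i in range(4)) / (n * L)

print(I_fit(100))  # ≈ 0.0020112, within ~1e-8 of the exact 0.00201125049
```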

  • Do I have this correct? For $n=0$, $I_{0}= \int_{1}^{\infty}\frac{dx}{x}=\ln\infty-\ln 1$? –  Jun 03 '16 at 08:03
  • @Bacon. $n\geq 1$. Thanks for pointing. I shall edit. – Claude Leibovici Jun 03 '16 at 08:04
  • Ah! That makes sense now, Thanks. –  Jun 03 '16 at 08:07
  • @Winther. I agree with you, for sure. This is the point where I am stuck ! Any idea will be more than welcome. Cheers. – Claude Leibovici Jun 03 '16 at 08:27
  • @Winther. Again, I agree with you. As you wrote, the problem is to simplify to .... something ! Thanks for helping. – Claude Leibovici Jun 03 '16 at 08:39
  • 1
    The sum form of the integral (as given in heropup's answer below) is treated in this answer. Unfortunately it only gives the first term in the asymptotic, $\frac{1}{n\log(n)}$, but the method used can with some work be extended to give more terms. Since the next term in the series is $\frac{1}{n(\log(n))^2}$ one probably needs several terms to get good accuracy for the cases you are interested in. – Winther Jun 03 '16 at 11:51
  • 3
    I had a go at trying to derive the asymptotic expansion. It was quite a mess. This is what I got so far $$I_n = \frac{1}{n\log(n)}\left[1 - \frac{\gamma}{\log(n)} + \frac{\gamma^2 + \frac{\pi^2}{6}}{\log^2(n)} + \frac{\psi ^{(2)}(1)-\gamma ^3-\frac{\gamma \pi ^2}{2}}{\log^3(n)} + \mathcal{O}\left(\frac{1}{\log^4(n)}\right)\right] + \mathcal{O}\left(\frac{1}{n^2\log(n)}\right)$$ This is accurate to a few percent at $n=100$ and down to $\sim 1%$ at $n=1000$ so still not good enough. – Winther Jun 03 '16 at 13:39
  • @Winther. Thank you very much. Be sure I really appreciate. What you did will be the start of further investigation on my side. I shall let you know. Thanks again. Cheers. – Claude Leibovici Jun 03 '16 at 16:04
  • @MrYouMath. Thanks for pointing the typo. It is $n$. Shame on me. Cheers. – Claude Leibovici Jun 03 '16 at 16:07
  • 1
    Did you see this question that's linked to in the answer by David Speyer that Winther linked to above? Your sum is the difference of two sums of the form discussed in that question:

    $$ \sum_{k=1}^n(-1)^{k+1}\binom nk\log(1+k)=\sum_{k=1}^{n+1}\binom{n+1}k(-1)^k\log k-\sum_{k=1}^n\binom nk(-1)^k\log k;. $$

    That doesn't yield a useful approximation because the expansion there only goes up to $O\left(\log^{-3}(n)\right)$, but perhaps you can find something useful in the answers and references given there.

    – joriki Jun 09 '16 at 01:22
  • 1
    Two more related questions: http://math.stackexchange.com/questions/114155, http://math.stackexchange.com/questions/1066144. – joriki Jun 09 '16 at 01:28
  • @joriki. Thanks for the links. – Claude Leibovici Jun 09 '16 at 02:34

1 Answer


It is relatively straightforward to show that for positive integers $n$, $$f_n(x) = B(x,n+1) = \frac{\Gamma(x)\Gamma(n+1)}{\Gamma(x+n+1)} = \frac{1}{x\binom{x+n}{n}} = \sum_{k=0}^n \frac{(-1)^k \binom{n}{k}}{x+k}.$$ Consequently, $$\int_{x=1}^\infty f_n(x) \, dx = \lim_{x \to \infty} \sum_{k=0}^n (-1)^k \binom{n}{k} \left(\log (x+k) - \log(1+k)\right).$$ The limit of the upper-endpoint terms is of course zero (write $\log(x+k)=\log x+\log(1+k/x)$ and use $\sum_{k=0}^n (-1)^k \binom{n}{k}=0$ for $n\geq 1$); this leaves us with $$\int_{x=1}^\infty f_n(x) \, dx = \sum_{k=0}^n (-1)^{k+1} \binom{n}{k} \log (1+k).$$ From here we could start to do things with the even and odd cases of $n$. I haven't the time to look at the large-$n$ asymptotics, but it should not be difficult.
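This closed form is easy to check numerically against the values quoted in the question; a sketch in Python:

```python
from math import comb, log

def I_closed(n):
    # I_n = sum_{k=0}^n (-1)^(k+1) C(n,k) log(1+k)
    return sum((-1) ** (k + 1) * comb(n, k) * log(1 + k) for k in range(n + 1))

print(I_closed(1))  # log 2 = 0.6931..., matching the integrand 1/(x(x+1))
print(abs(I_closed(5) - log(67108864 / 61509375)))  # ≈ 0 (machine precision)
```

For large $n$, this naive floating-point sum is destroyed by cancellation (the terms reach $\sim\binom{n}{\lfloor n/2\rfloor}$), so it must be evaluated in higher precision; that is one reason the asymptotic expansion discussed in the comments is attractive.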

heropup
  • This is, for sure, totally correct. The problem is computing time. Thanks for answering. If you have an idea about asymptotics, this would be great for getting a new and better approximation. Cheers. – Claude Leibovici Jun 03 '16 at 08:53