
I wonder whether it is widely taught, as a heuristic, that $\log N$ is just like $N^{1/\infty}$ (think of something like $N^{0.00000000001}$). In other words, is there any danger in thinking this way?

I think a good test case is deciding which grows faster, $\log\log N$ or $\log^2 N := (\log N)^2$. (You may conclude that the heuristic doesn't help much here.)

liuyao
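As a rough numerical sketch of the trade-off the question raises (an editorial illustration, not part of the thread; the exponent $0.001$ and the sample values of $N$ are arbitrary stand-ins for "$N^{1/\infty}$"):

```python
import math

# Compare log log N, (log N)^2 and a "tiny power" N^0.001 at a few sizes of N.
# All logarithms are natural; N is handled through its logarithm so that
# astronomically large values like 10**10000 stay representable.
for exp10 in (6, 100, 10000):
    ln_n = exp10 * math.log(10)           # ln N for N = 10**exp10
    loglog = math.log(ln_n)               # log log N
    log_sq = ln_n ** 2                    # (log N)^2
    tiny_power = math.exp(0.001 * ln_n)   # N^0.001, computed as exp(0.001 * ln N)
    print(f"N = 10^{exp10}:  loglog N = {loglog:.2f},  "
          f"(log N)^2 = {log_sq:.3g},  N^0.001 = {tiny_power:.3g}")
```

For every $N$ one could realistically compute with, $N^{0.001}$ is still smaller than $(\log N)^2$; with natural logarithms the crossover only happens around $N \approx 10^{8600}$. That gap between "asymptotically true" and "true at any practical scale" is one concrete danger of the heuristic.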
  • That is not widely taught (see the related question https://math.stackexchange.com/questions/2632349/whats-the-link-between-logarithms-and-the-zeroth-power-if-any ). What is taught is that $\log N$ is growing (unbounded) but slower than $N^{1/k}$ for all $k$, however large. Meanwhile $\log(\log N)$ grows even more slowly, and the iterated logarithm slower still. – Henry Mar 09 '18 at 16:10
  • @Henry : But is $\log N$ in some reasonable sense the most rapidly growing thing that grows more slowly than $N^\varepsilon$ for every $\varepsilon>0$? – Michael Hardy Mar 09 '18 at 16:43
  • @Henry, thanks for confirming. Do people use $N^\epsilon$ instead? As heuristics go, I may perhaps venture to think (even if not writing it down in black and white) that $\log(\log N)$ is like $N^{1/(2\infty)}$ and $\log^* N$ is like $N^{1/(N\infty)}$. Disclaimer: I haven't had much experience working with such expressions, so I wanted to know what variety of expressions with $\log$ are out there. – liuyao Mar 09 '18 at 16:57
  • @MichaelHardy It depends whether you regard things like $(\log N) \log(\log N)$ or $(\log N)^2$ as reasonable things. – Henry Mar 09 '18 at 18:11
  • It can be a useful heuristic to think of $\log(N)$ as some kind of "infinitesimal" power growth, like, perhaps, $1/\infty$ (whatever you want that to mean). But, as with all heuristics, while it can help you figure out what your final result should be, you still need to check whether you can reach it by rigorous methods. – C.W Mar 09 '18 at 18:28
  • @Henry : I suspect one could make a case that that way of looking at it is absolutely correct but silly. – Michael Hardy Mar 09 '18 at 18:35
  • @MichaelHardy You are probably correct, though it is a meaningful exercise to show that $\sum_{N=3}^\infty \frac1{N (\log N) \log(\log N)}$ is infinite and $\sum_{N=3}^\infty \frac1{N (\log N)^2}$ is finite, because the denominators grow at different speeds. – Henry Mar 09 '18 at 18:39
  • @MichaelHardy An intermediate thing I see show up sometimes is growth like $\exp(\sqrt{\log n})$ (or some other power of $\log n$). For example, the density of the largest known subset of $\{1,\dots,N\}$ without a $3$-term arithmetic progression decays like (the inverse of) this. I don't recall this rate ever showing up in calculus classes though. – Kevin P. Costello Mar 09 '18 at 18:44
  • $$ \begin{align} & \int_1^N u^{\varepsilon-1}\,du = O(N^\varepsilon) \text{ as } N\to\infty \text{ if } \varepsilon>0. \\ & \int_1^N u^{\varepsilon-1}\,du = O(\log N) \text{ as } N\to\infty \text{ if } \varepsilon=0. \end{align} $$ – Michael Hardy Mar 09 '18 at 18:46
  • $$ \int_1^N u^{\varepsilon-1}\,du = \frac{N^\varepsilon - 1}{\varepsilon} \to \log N = \int_1^N u^{0-1}\,du \text{ as } \varepsilon\to0. $$ – Michael Hardy Mar 09 '18 at 18:55
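To fill in the step behind Henry's series comparison above (a standard integral-test computation, added editorially rather than quoted from the thread):

$$ \int_3^\infty \frac{du}{u\,(\log u)\,\log(\log u)} = \Big[\log\bigl(\log(\log u)\bigr)\Big]_3^\infty = \infty, \qquad \int_3^\infty \frac{du}{u\,(\log u)^2} = \Big[-\frac{1}{\log u}\Big]_3^\infty = \frac{1}{\log 3} < \infty. $$

So the first series diverges and the second converges, even though their terms differ only by a factor of $\log N / \log(\log N)$.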

0 Answers