
I am reading a real analysis book which, in the chapter about sequences, first asks to prove that $\lim\limits_{n\to\infty} \frac{\log_a(n)}{n}=0$ for $a\in\mathbb{R}^+\setminus\{1\}$ and then asks to prove that $\lim\limits_{n\to\infty}\frac{\log_a(n)}{n^b}=0$ for $a\in\mathbb{R}^+\setminus\{1\}$ and $b\in\mathbb{R}^+$, deducing it from the first result.


I have proved the first result in the following way.

Let $a>1$. We first note that $\frac{\log_{a}(n)}{n}=\log_a(\sqrt[n]{n})\geq 0$ for all $n\geq 1$. Now, since $\lim\limits_{n\to\infty}\sqrt[n]{n}=1$, for every $\varepsilon>0$ we can pick some $\delta>0$ with $\delta<a^\varepsilon -1$, and there exists $N\in\mathbb{N}$ such that $1+\delta>\sqrt[n]{n}\geq 1$ for all $n>N$. Since $\log_a(1+\delta)<\varepsilon\Leftrightarrow 1+\delta<a^\varepsilon\Leftrightarrow \delta<a^\varepsilon -1,$ it follows that $\varepsilon>\log_a(1+\delta)>\log_a(\sqrt[n]{n})\geq\log_a(1)=0$ for all $n>N$. In summary, for every $\varepsilon>0$ there exists $N\in\mathbb{N}$ such that $|\log_a(\sqrt[n]{n})-0|<\varepsilon$ for all $n>N$; thus, by the definition of the limit of a sequence, $\lim\limits_{n\to\infty}\frac{\log_a(n)}{n}=0.$

If $0<a<1$ then $\frac{\log_a(n)}{n}=\frac{\log_{10}(n)}{n}\cdot\frac{1}{\log_{10}(a)}\xrightarrow[]{n\to\infty}0\cdot\frac{1}{\log_{10}(a)}=0.$
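
(Not part of the proof, just a quick numerical sanity check: a minimal Python sketch with the arbitrary sample bases $a=2$ and $a=1/2$, showing the sequence $\frac{\log_a(n)}{n}$ shrinking toward $0$.)

```python
import math

# Numerical sanity check (not a proof): log_a(n)/n should approach 0,
# for a sample base a > 1 and a sample base 0 < a < 1.
for a in (2.0, 0.5):                      # arbitrary sample bases
    for n in (10, 10**3, 10**6, 10**9):
        print(f"a={a}, n={n:>10}: log_a(n)/n = {math.log(n, a) / n:.3e}")
```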


I have managed to prove the second result for $b\geq 1$ as follows.

Let $a\in\mathbb{R}^+\setminus\{1\}$ and $b\in\mathbb{R}^+$. We have already proved the claim for $b=1$, so it remains to study the cases $b>1$ and $0<b<1$. If $b>1$ then $\frac{\log_a(n)}{n^b}=\frac{\log_a(n)}{n}\cdot\frac{1}{n^{b-1}}\xrightarrow[]{n\to\infty}0\cdot 0=0$, since $\lim\limits_{n\to\infty}\frac{\log_a(n)}{n}=0$ and $\lim\limits_{n\to\infty}\frac{1}{n^\gamma}=0$ for $\gamma>0$.
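
(Again only a sanity check, not part of the argument: a short Python sketch with the arbitrary sample values $a=2$, $b=2$, illustrating the factorisation $\frac{\log_a(n)}{n^b}=\frac{\log_a(n)}{n}\cdot\frac{1}{n^{b-1}}$.)

```python
import math

# Check numerically that log_a(n)/n^b equals (log_a(n)/n) * (1/n^(b-1))
# and that both tend to 0, for the sample values a = 2, b = 2.
a, b = 2.0, 2.0
for n in (10, 10**3, 10**6):
    lhs = math.log(n, a) / n**b
    rhs = (math.log(n, a) / n) * (1 / n**(b - 1))
    print(f"n={n:>7}: {lhs:.3e}  {rhs:.3e}")
```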


I am finding it difficult to handle the case $0<b<1$ using only the tools given in the corresponding chapter of the book, namely knowledge of the standard limits, the squeeze theorem and the algebra of limits of sequences. By trying the same trick I used in the proof of the case $b>1$ I get $\lim\limits_{n\to\infty}\frac{\log_a(n)}{n^b}=\lim\limits_{n\to\infty}\left(\frac{\log_a(n)}{n}\cdot n^{1-b}\right)=0\cdot\infty$, an indeterminate form. So I would be grateful if someone could give me a hint about how to prove this fact using only the first limit and the tools about limits of sequences that I have mentioned. Thanks.

lorenzo
    Hint: $ \frac{\log_a(n)}{n^b}=\frac1b \frac{\log_a(n^b)}{n^b}.$ – Anne Bauval Sep 05 '23 at 19:45
  • @AnneBauval I had thought about that, but I didn't know how to rigorously justify the change of variables $\lim\limits_{n\to\infty}\frac{1}{b}\cdot\frac{\log_a(n^b)}{n^b}\overset{m:=n^b}{=}\lim\limits_{m\to\infty}\frac{1}{b}\cdot\frac{\log_a(m)}{m}=\frac{1}{b}\cdot 0=0$. – lorenzo Sep 05 '23 at 19:48
  • 1
    There is indeed a small problem because $n^b$ is not an integer. You should first use your hypothesis about the limit of the sequence $\frac{\log(n)}n$ to derive the (same) limit for the function $\frac{\log(x)}x.$ Hint: for $x>e,$ this function is monotonic. – Anne Bauval Sep 05 '23 at 19:53
  • @AnneBauval I had also thought about using the limit of the corresponding function, but... limits of functions are defined only in the next chapter of the book, so I cannot use them here. – lorenzo Sep 05 '23 at 19:57
  • 1
    Then, argue directly that (for $n$ large enough) $0<\frac{\log_a(n^b)}{n^b}\le\frac{\log_a(\lfloor n^b\rfloor)}{\lfloor n^b\rfloor}$ – Anne Bauval Sep 05 '23 at 20:01
  • This was if $a>1,$ but easy to modify if $a<1.$ – Anne Bauval Sep 05 '23 at 20:34
  • 1
    Apply $\log_ax=\log x/\log a$ to get rid of the parameter $a.$ – Ryszard Szwarc Sep 05 '23 at 21:05
  • @AnneBauval I think I get it now, thank you very much. The only thing that remains unclear to me is how to prove that $\frac{\log(n)}{n}$ is monotone decreasing for $n>e$ without using derivatives. Anyway, if you would write your comments as an answer, I would gladly accept it. – lorenzo Sep 05 '23 at 21:24
  • Ok, I shall edit an answer. Why do you want to avoid derivatives? Do you allow $\log(1+u)\le u$ instead? – Anne Bauval Sep 06 '23 at 07:20
  • @AnneBauval Because I am trying to use only the tools in the corresponding chapter to prove the result (and derivatives appear approximately only one hundred pages later in the book). Anyway, that is not a big deal. I can use that $\log(1+u)\leq u$ for a small enough $u$; I believe I have proved this in the first part of my question above. Many thanks for your help. – lorenzo Sep 06 '23 at 08:10
  • You are welcome! I do not see where you proved it, even only for $u$ small. Anyway, I think your first part is a bit "cheating": it relies on $\lim_{n\to\infty}\sqrt[n]n=1,$ i.e. $\lim_{n\to\infty}e^{(\log n)/n}=1,$ which (by continuity of $\log$ and $\exp$) is directly equivalent to $\lim_{n\to\infty}(\log n)/n=0.$ – Anne Bauval Sep 06 '23 at 09:15
  • 1
    @AnneBauval That $\lim\limits_{n\to\infty} \sqrt[n]{n}=1$ can be proved by AM-GM inequality applied to the numbers $a_1=\dots=a_{n-2}=1, a_{n-1}=a_n=\sqrt{n}$: $\sqrt[n]{n}=\left(\prod\limits_{k=1}^{n}a_k\right)^{\frac{1}{n}}<\frac{1}{n}\sum\limits_{k=1}^{n}a_k=1-\frac{2}{n}+\frac{2}{\sqrt{n}}<1+\frac{2}{\sqrt{n}}$ so $\sqrt[n]{n}<1+\varepsilon$ for $n>\frac{4}{\varepsilon^2}$. – lorenzo Sep 06 '23 at 09:19
  • Fine! I retract my criticism, then. – Anne Bauval Sep 06 '23 at 09:23

1 Answer


As suggested by @RyszardSzwarc in a comment, we can first get rid of the parameter $a$ by applying $\log_a x=\log x/\log a.$

What we must prove now ($\forall b>0$) is $\lim_{n\to\infty}\frac{\log n}{n^b}=0,$ knowing that $\lim_{n\to\infty}\frac{\log n}n=0.$ Equivalently, since $\log n=\frac1b\log(n^b),$ we must prove that $\lim_{n\to\infty}\frac{\log\left(n^b\right)}{n^b}=0.$

For this, we shall use that the function $x\mapsto\frac{\log x}x$ is decreasing for $x\ge e$ (see a proof of it at the end of this answer), and that as $n$ tends to $+\infty,$ so does $n^b,$ hence also its integer part $\lfloor n^b\rfloor.$

For $n$ large enough, $n^b>1$ and $\lfloor n^b\rfloor\ge e,$ and therefore $$0<\frac{\log\left(n^b\right)}{n^b}\le\frac{\log\lfloor n^b\rfloor}{\lfloor n^b\rfloor}.$$ The right-hand side is of the form $\frac{\log m}m$ with integer indices $m=\lfloor n^b\rfloor\to+\infty,$ so it converges to $0$ by assumption, and by the squeeze theorem, the conclusion follows.
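
(As a numerical illustration only, not needed for the proof: a small Python sketch with the arbitrary choice $b=\tfrac12$ and the natural logarithm, showing both sides of the inequality shrinking toward $0$.)

```python
import math

# Illustrate 0 < log(n^b)/n^b <= log(floor(n^b))/floor(n^b) for the sample value b = 1/2,
# once floor(n^b) >= e; both sides should tend to 0.
b = 0.5
for n in (10, 10**3, 10**6, 10**9):
    x = n**b
    m = math.floor(x)
    print(f"n={n:>10}: {math.log(x) / x:.3e} <= {math.log(m) / m:.3e}")
```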

To finish, let us prove, as promised, that $x\mapsto\frac{\log x}x$ is decreasing for $x\ge e.$ The simplest way would be to notice that its derivative is negative, but if you want to avoid derivatives, here is a proof relying "only" on the fact that $\forall u\ge0,\quad\log(1+u)=\int_1^{1+u}\frac{dt}t\le\int_1^{1+u}dt=u:$

If $e\le x\le y,$ write $y=\left(1+u\right)x.$ Then, $$\frac{\log y}{\log x}=1+\frac{\log(1+u)}{\log x}\le1+\log(1+u)\le1+u=\frac yx,$$ hence $\frac{\log y}y\le\frac{\log x}x.$
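
(For what it is worth, one last small Python sketch, only a spot check and not part of the argument: the claimed monotonicity of $x\mapsto\frac{\log x}x$ on a few sample points $\ge e$.)

```python
import math

# Spot-check that x -> log(x)/x is decreasing for x >= e on a few sample points.
xs = [math.e, 3, 5, 10, 100, 10**4]
values = [math.log(x) / x for x in xs]
print(values)
print(all(values[i] >= values[i + 1] for i in range(len(values) - 1)))  # expect True
```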

Anne Bauval