1

We have just started covering limits of sequences, and I've stumbled upon this limit in our university's exercises:

$$\lim_{n\rightarrow \infty} {\sqrt[n]{\ln(n)}}$$

I've considered solving it using the fact that $\lim_{n\rightarrow \infty} {\sqrt[n]{a}}=1$ for $a>0$. And since we're dealing with natural numbers, the expression $\ln(n)$ should be $>0$ for every $n$ except $n=1$, right?

So is it correct to assume that $\lim_{n\rightarrow \infty} {\sqrt[n]{\ln(n)}}=1$ using this thought process?

Gary

5 Answers

1

To solve this limit, it's helpful to employ L'Hôpital's Rule, but in a form applicable to sequences, because of the indeterminate form that arises: we take the logarithm of the sequence, regard it as a function of a real variable, and then apply the rule:

  1. First, recognize that directly applying L'Hôpital's Rule to the original sequence isn't straightforward. We need to manipulate the expression into a form that allows us to apply the rule.

  2. Convert the limit into an exponent of $e$ to facilitate the use of L'Hôpital's Rule: $$ \lim _{n \rightarrow \infty} \sqrt[n]{\ln (n)} = \lim _{n \rightarrow \infty} e^{\frac{\ln(\ln(n))}{n}} $$ This step involves understanding that $\sqrt[n]{x} = x^{\frac{1}{n}} = e^{\frac{\ln(x)}{n}}$.

  3. Now, consider the exponent separately: $$ \lim _{n \rightarrow \infty} \frac{\ln(\ln(n))}{n} $$ This limit appears to be of the form $\frac{\infty}{\infty}$ as $n \rightarrow \infty$, allowing us to apply L'Hôpital's Rule.

  4. Apply L'Hôpital's Rule by differentiating the numerator and the denominator separately:

    • The derivative of $\ln(\ln(n))$ with respect to $n$ is $\frac{1}{n\ln(n)}$.
    • The derivative of $n$ with respect to $n$ is $1$.
  5. The new limit to evaluate is: $$ \lim _{n \rightarrow \infty} \frac{1}{n\ln(n)} $$

  6. This new limit clearly approaches $0$ as $n$ approaches infinity, because the numerator is constant while the denominator $n\ln(n)$ grows without bound.

  7. Since the exponent of $e$ goes to $0$, the original limit becomes: $$ e^0 = 1 $$

Therefore, the evaluation of your limit is indeed $1$. Your initial intuition gave the right value, but the fact $\lim_{n \rightarrow \infty} \sqrt[n]{a}=1$ holds for a fixed constant $a>0$, whereas here $\ln(n)$ grows with $n$; the argument above, applied to the appropriately transformed expression, supplies the missing justification.
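The steps above can be sanity-checked numerically (an illustration, not a proof): the exponent $\frac{\ln(\ln(n))}{n}$ should shrink toward $0$, and the $n$-th root toward $1$:

```python
import math

# Illustrate steps 2-7: the exponent ln(ln(n))/n shrinks toward 0,
# so ln(n)**(1/n) = exp(ln(ln(n))/n) approaches 1 as n grows.
for n in [10, 10**3, 10**6]:
    exponent = math.log(math.log(n)) / n
    root = math.log(n) ** (1 / n)
    print(n, exponent, root)
```

At $n=10^6$ the exponent is already of order $10^{-6}$, so the root differs from $1$ only in the sixth decimal place.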

1

The method of L'Hôpital's rule is introduced above by Zuko. I can state another method, which still requires some basic knowledge of calculus. I would personally seldom use L'Hôpital's rule on the limit of a sequence, since it is properly a discrete limit rather than a continuous one, though the result is the same.

The calculus result we need is $$1-\dfrac{1}{n}=\dfrac{n-1}{n}\le\ln(n)\le n-1$$ Both inequalities are instances of the standard bounds $1-\dfrac{1}{x}\le\ln(x)\le x-1$ for $x>0$, which are easy to prove by considering the difference of the two sides and checking its monotonicity.

With this fact, we have $$\sqrt[n]{1-\dfrac{1}{n}}\le\sqrt[n]{\ln(n)}\le\sqrt[n]{n-1}\qquad(n\ge 2)$$ As $n\to+\infty$, the RHS tends to one (this can be done by the binomial theorem, just as for $\sqrt[n]{n}\to 1$). For the LHS, note that $\tfrac12\le 1-\tfrac1n<1$ for $n\ge 2$, so $\sqrt[n]{1/2}\le\sqrt[n]{1-\tfrac1n}<1$; since $\sqrt[n]{1/2}\to 1$, the LHS tends to one as well. Then by the sandwich theorem, the required limit is $1$ as $n\to+\infty$.
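The sandwich can also be checked numerically over a range of $n$ (a quick illustration, not part of the proof):

```python
import math

# Check the sandwich (1 - 1/n)^(1/n) <= ln(n)^(1/n) <= (n - 1)^(1/n)
# numerically for a spread of n (n >= 2 so that ln(n) > 0).
for n in range(2, 10**5, 997):
    lo = (1 - 1 / n) ** (1 / n)
    mid = math.log(n) ** (1 / n)
    hi = (n - 1) ** (1 / n)
    assert lo <= mid <= hi
```

Both outer terms visibly squeeze toward $1$, dragging $\sqrt[n]{\ln(n)}$ with them.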

Angae MT
1

Since we have $\dfrac{\ln(n)}n\to 0$

then for any given $\varepsilon>0$ there exists $n_0$ such that $n\ge n_0$ then $\dfrac{\ln(n)}{n}<\varepsilon$

Therefore, for $n\ge\max(n_0,3)$ (so that $\ln(n)\ge 1$), $$1\le\ln(n)\le n\varepsilon\le 1+n\varepsilon\le(1+\varepsilon)^n$$ where the last step is by binomial expansion.

Taking the $n$-th root, you get $\quad 1\le\sqrt[n]{\ln(n)}\le 1+\varepsilon$, and since $\varepsilon$ was arbitrary, the limit is $1$.
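As a concrete illustration of this $\varepsilon$-argument (taking, say, $\varepsilon=0.01$; the choice is arbitrary), one can search for a suitable $n_0$ and verify the resulting bound:

```python
import math

# For eps = 0.01, find the first n0 >= 3 with ln(n0)/n0 < eps.
# (ln(n)/n is decreasing for n >= 3, so the bound persists beyond n0.)
eps = 0.01
n0 = 3
while math.log(n0) / n0 >= eps:
    n0 += 1
# Verify 1 <= ln(n)**(1/n) <= 1 + eps on a stretch past n0.
for n in range(n0, n0 + 1000):
    assert 1 <= math.log(n) ** (1 / n) <= 1 + eps
print(n0)
```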

zwim
1

This paper provides a series of upper and lower bounds for the logarithmic function. They are given in the form of Padé approximants and rational approximations (have a look at Table $3$ on page $9$ of the linked paper).

Let us use the second one $$\frac{3 (x-1) (x+1)}{1+4x+x^2} < \log(x) < \frac{(x-1) (x+5)}{2 (1+2 x)}$$ Take logarithms and Taylor expand

$$\log (3)-\frac{4}{x}+O\left(\frac{1}{x^2}\right)< \log(\log(x)) <\log (x)-2 \log (2)+\frac{7}{2 x}+O\left(\frac{1}{x^2}\right)$$ Divide by $x$, exponentiate using $$\sqrt[x]{\log(x)}=e^{\frac{\log(\log(x))} x }$$ and continue with Taylor series $$\color{blue}{1+\frac{\log (3)}{x}+O\left(\frac{1}{x^2}\right)<\sqrt[x]{\log(x)}<1+\frac{\log (x)-2 \log (2)}{x}+O\left(\frac{1}{x^2}\right)}$$

Now, use the squeeze theorem with $x \to \infty$.

Using the seventh set of bounds, we should obtain $$\color{blue}{1+\frac{\log\left(\frac{49}{10}\right)}{x}+O\left(\frac{1}{x^2}\right)<\sqrt[x]{\log(x)}< 1+\frac{\log (x)-\log (36)}{x}+O\left(\frac{1}{x^2}\right)}$$
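The second pair of bounds quoted above can be checked numerically for sample values of $x>1$ (illustrative only; the linked paper supplies the proofs):

```python
import math

# Check 3(x-1)(x+1)/(1+4x+x^2) < log(x) < (x-1)(x+5)/(2(1+2x)) for x > 1.
for x in [1.5, 2.0, 10.0, 100.0, 1e6]:
    lower = 3 * (x - 1) * (x + 1) / (1 + 4 * x + x * x)
    upper = (x - 1) * (x + 5) / (2 * (1 + 2 * x))
    assert lower < math.log(x) < upper
```

Note how the lower bound saturates near $3$ for large $x$, matching the $\log(3)$ constant in the expansion above.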

0

$$\begin{gathered} L = \mathop {\lim }\limits_{n \to \infty } \sqrt[n]{{\ln \left( n \right)}} \Rightarrow \ln \left( L \right) = \mathop {\lim }\limits_{n \to \infty } \frac{{\ln \left( {\ln \left( n \right)} \right)}}{n} \hfill \\ {\text{Now, consider the limit}}:\mathop {\lim }\limits_{n \to \infty } \frac{{\ln \left( {\ln \left( {n + 1} \right)} \right) - \ln \left( {\ln \left( n \right)} \right)}}{{n + 1 - n}} = \mathop {\lim }\limits_{n \to \infty } \left( {\ln \left( {\ln \left( {n + 1} \right)} \right) - \ln \left( {\ln \left( n \right)} \right)} \right) \hfill \\ = \mathop {\lim }\limits_{n \to \infty } \left( {\ln \left( {\frac{{\ln \left( {n + 1} \right)}}{{\ln \left( n \right)}}} \right)} \right){\text{, and the limit}}:\mathop {\lim }\limits_{n \to \infty } \frac{{\ln \left( {n + 1} \right)}}{{\ln \left( n \right)}} = 1 \hfill \\ \Rightarrow \mathop {\lim }\limits_{n \to \infty } \frac{{\ln \left( {\ln \left( {n + 1} \right)} \right) - \ln \left( {\ln \left( n \right)} \right)}}{{n + 1 - n}} = \ln \left( 1 \right) = 0 \hfill \\ \Rightarrow \mathop {\lim }\limits_{n \to \infty } \frac{{\ln \left( {\ln \left( n \right)} \right)}}{n} = 0{\text{ by using Stolz-Cesaro theorem}} \Rightarrow L = {e^0} = 1 \hfill \\ \end{gathered}$$
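The Stolz-Cesàro step can be illustrated numerically (a sketch, not a proof): the difference $\ln(\ln(n+1))-\ln(\ln(n))$ tends to $0$, and $\dfrac{\ln(\ln(n))}{n}$ follows suit:

```python
import math

# Stolz-Cesaro illustration: the consecutive difference of
# a(n) = ln(ln(n)) tends to 0, and so does a(n)/n.
def a(n):
    return math.log(math.log(n))

for n in [10, 10**4, 10**8]:
    diff = a(n + 1) - a(n)
    print(n, diff, a(n) / n)
```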

OnTheWay