5

An interesting problem: why is it that, as $k$ approaches infinity, $$\frac{\log_{e} k}{k}$$ approaches $0$?

Another problem: why does $(1+n/x)^x$ approach $e^n$ as $x$ approaches infinity?

misi
  • 997
  • 1
    Because polynomial growth is faster than logarithmic growth... – The Chaz 2.0 Jul 21 '11 at 18:59
  • @The Chaz – is there any reference I could look at? –  Jul 21 '11 at 19:02
  • http://en.wikipedia.org/wiki/Big_O_notation , although I'm sure others have better resources! – The Chaz 2.0 Jul 21 '11 at 19:07
  • 4
    For your second question: that depends on how you define the exponential $e$! One can define $e^n$ as that limit, in which case the answer is "by definition"; or one can define the exponential in a different way, in which case the equality would be derived. But without knowing how you define the exponential, it is impossible to give a good answer to your second question. – Arturo Magidin Jul 21 '11 at 19:11
  • The logarithm (the base does not matter) grows slowly compared with other elementary (increasing) functions (notice also that $\log(k)/\sqrt{k}$ tends to zero). Informally, the logarithm almost always loses; conversely, the exponential almost always wins. – leonbloy Jul 21 '11 at 19:31
  • The first question has already been dealt with here on MSE at this location: http://math.stackexchange.com/questions/55468/how-to-prove-that-exponential-grows-faster-than-polynomial/55492#55492 – Mike Jones Aug 17 '11 at 21:18

7 Answers

6

First.

You can also understand it from a graph.

[graph comparing $f(k)=k$ and $g(k)=\ln k$]

let

$$f(k)=k,\ g(k)=\ln k$$

then

$$f'(k)=1,\ g'(k)=\frac{1}{k}$$

If $k>1$,

$$f'(k)>g'(k),\ \lim_{k\to \infty}f'(k)=1>\lim_{k\to \infty}g'(k)=0$$

notice that

$$f(1)>g(1)$$

so $f(k)>g(k)$ for all $k\ge 1$; moreover, since $f(k)\to\infty$ and $\dfrac{g'(k)}{f'(k)}\to 0$, L'Hôpital's rule gives

$$\lim_{k\to\infty}\frac{g(k)}{f(k)}=\lim_{k\to\infty}\frac{g'(k)}{f'(k)}=0$$
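As a quick sanity check (a minimal Python sketch; the sample values of $k$ are chosen arbitrarily), the ratio is already tiny for moderately large $k$:

```python
import math

# Evaluate ln(k)/k at increasingly large k; the ratio visibly tends to 0.
for k in [10, 10**2, 10**4, 10**8, 10**16]:
    print(k, math.log(k) / k)
```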

Second.

notice that

$$e=\lim_{x\to\infty}\left (1+\frac{1}{x}\right )^x$$

Since $n$ is a fixed constant, replacing $x$ by $x/n$ (which still tends to $\infty$) gives

$$e=\lim_{x\to\infty}\left (1+\frac{n}{x}\right )^{\frac{x}{n}}$$

so

$$e^n=\lim_{x\to\infty}\left (1+\frac{n}{x}\right )^x$$

The important point is that the inner term and the exponent are reciprocals of each other:

$$e=\lim_{x\to\infty}\left (1+\frac{\bigstar}{x}\right )^\frac{x}{\bigstar}$$

$$\frac{\bigstar}{x}\times\frac{x}{\bigstar}=1$$
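A rough numerical illustration of this (a Python sketch; the choice $n=3$ is arbitrary) showing how $(1+n/x)^x$ settles on $e^n$:

```python
import math

# Compare (1 + n/x)^x with e^n for a fixed n and growing x; the gap shrinks as x grows.
n = 3
target = math.exp(n)
for x in [10, 10**2, 10**4, 10**6]:
    approx = (1 + n / x) ** x
    print(x, approx, abs(approx - target))
```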

Xiang
  • 677
4

(For the first question.)

For any fixed $x > 0$, it follows from the binomial formula $$ (1 + x)^k = {k \choose 0}x^0 + {k \choose 1}x^1 + {k \choose 2}x^2 + \cdots + {k \choose k-1}x^{k - 1} + {k \choose k}x^k $$ that $$ (1 + x)^k > \frac{{k(k - 1)}}{2}x^2 > k, $$ for all $k$ greater than some positive integer $K$. Hence also $$ \ln (1 + x)^k > \ln k \;\; \forall k > K $$ and, in turn, $$ k \ln(1+x) > \ln k \;\; \forall k > K. $$ Thus, $$ 0 < \frac{{\ln k}}{k} < \ln (1 + x) \;\; \forall k > K. $$ Noting that $\varepsilon := \ln(1+x)$ is an arbitrary positive number, by definition of limit we have $$ \mathop {\lim }\limits_{k \to \infty } \frac{{\ln k}}{k} = 0. $$
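If it helps, here is a small numerical sketch (Python; the choice $x = 0.1$ is arbitrary) of the bound $\ln k / k < \ln(1+x)$ kicking in once $k$ is large enough:

```python
import math

# The argument gives ln(k)/k < ln(1 + x) for every fixed x > 0, once k is large enough.
# Illustrate with the arbitrary choice x = 0.1.
x = 0.1
eps = math.log(1 + x)
K = next(k for k in range(2, 10**6) if math.log(k) / k < eps)
print("ln(1 + x) =", eps)
print("the bound holds from k =", K)   # ln(k)/k is decreasing for k >= 3, so it stays below eps
for k in [K, 10 * K, 100 * K]:
    print(k, math.log(k) / k)
```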

EDIT:

Noting that $$ \ln n^{1/n} = \frac{{\ln n}}{n}, $$ any proof of $$ \mathop {\lim }\limits_{n \to \infty } n^{1/n} = 1 $$ can serve as a proof of $$ \mathop {\lim }\limits_{n \to \infty } \frac{{\ln n}}{n} = 0. $$ Here you can find an elegant proof of $\mathop {\lim }\nolimits_{n \to \infty } n^{1/n} = 1$, using the AM-GM inequality.

Shai Covo
  • 24,077
3
  1. Application of l'Hôpital's rule: $$\lim_{k\rightarrow \infty }\frac{\log k}{k}=\frac{\lim_{k\rightarrow \infty }\left( \frac{d}{dk}\log k\right) }{\lim_{k\rightarrow \infty }\left( \frac{d}{dk}k\right) }=\frac{\lim_{k\rightarrow \infty }\frac{1}{k}}{\lim_{k\rightarrow \infty }1}=\frac{0}{1}=0.$$ Alternatively, one can change variables using the substitution recommended by Theo Buehler: $k=e^l\rightarrow\infty$ as $l$ tends to $\infty$. The limit can then be evaluated by observing that for $k\ge 1$ one has $e^l\ge\dfrac{l^2}{2}$, so $$0\le\frac{\log k}{k}=\frac{l}{e^l}\le\frac{2l}{l^2}=\frac{2}{l}.$$ Applying limits, one gets $\lim_{k\rightarrow \infty }\frac{\log k}{k}=0$ by the squeeze theorem. However, I am not able to show the inequality $e^l\ge l^2/2$ without the Taylor series for the exponential (see Steven Stadnicki's comment). A quick numerical check of this squeeze is sketched after this list.

  2. By definition of $e=\lim_{k\rightarrow \infty }\left( 1+\frac{1}{k}\right) ^{k}$,

$$\begin{eqnarray*} \lim_{x\rightarrow \infty }\left( 1+\frac{n}{x}\right) ^{x} &=&\lim_{x\rightarrow \infty }\left( 1+\frac{1}{\frac{x}{n}}\right) ^{x} \\ &=&\lim_{x\rightarrow \infty }\left( \left( 1+\frac{1}{\frac{x}{n}}\right) ^{\frac{x}{n}}\right) ^{n} \\ &=&\left( \lim_{x\rightarrow \infty }\left( 1+\frac{1}{\frac{x}{n}}\right) ^{\frac{x}{n}}\right) ^{n} \\ &=&(e)^{n}=e^{n}. \end{eqnarray*}$$
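Here is the promised numerical check of the squeeze in part 1 (a Python sketch; the values of $l$ are chosen arbitrarily):

```python
import math

# Squeeze from part 1: with k = e^l (so l >= 0 when k >= 1),
#   0 <= ln(k)/k = l/e^l <= 2/l,
# and the right-hand side tends to 0.
for l in [1, 5, 10, 50, 100]:
    ratio = l / math.exp(l)
    print(l, ratio, 2 / l, ratio <= 2 / l)
```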

  • 2
    L'Hôpital seems a bit of an overkill for 1. I'd substitute $k = e^{l}$ and use that $e^{l} \geq l^2 / 2$, for example. – t.b. Jul 21 '11 at 19:33
  • @Theo Buehler: Yes, it is. I will use your substitution. – Américo Tavares Jul 21 '11 at 19:39
  • Doesn't one still need to show the exponential inequality? That's obvious if you know the Taylor series for the exponential, but the phrasing of the question seems to suggest that it's below that level... – Steven Stadnicki Jul 21 '11 at 20:37
  • @Steven Stadnicki: You are right. I am not able to show it without the Taylor series for the exponential. – Américo Tavares Jul 22 '11 at 09:22
3

Another straightforward approach is through integrals. Since $x^{-1} \lt x^{t-1}$ for any $t\gt 0$ and $x\gt 1$, $\int_1^k x^{-1}\mathrm{d}x\lt \int_1^k x^{t-1}\mathrm{d}x$; $\log(k)\lt {k^t-1\over t}$. But choosing $t={1\over2}$ here gives $\log(k)\lt 2(k^{1/2}-1)$ and ${\log(k)\over k}\lt 2(k^{-1/2}-k^{-1})$, and the latter obviously goes to $0$ as $k\rightarrow\infty$. Note that you can easily adapt this to show that $\lim_{k\rightarrow\infty}(\log k/ k^\epsilon) = 0$ for any $\epsilon\gt 0$.
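A quick numerical check of this (a Python sketch; the sample values of $k$ are arbitrary), verifying the bound $\log k < 2(\sqrt{k}-1)$ and the resulting estimate:

```python
import math

# Check log(k) < 2*(sqrt(k) - 1) and the resulting estimate
# log(k)/k < 2*(k**-0.5 - 1/k); the right-hand side clearly goes to 0.
for k in [2, 10, 10**3, 10**6, 10**9]:
    print(k, math.log(k) < 2 * (math.sqrt(k) - 1),
          math.log(k) / k, 2 * (k ** -0.5 - 1 / k))
```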

1

Since you get the indeterminate form $\infty / \infty$ as $k\to\infty$, you can use l'Hôpital's rule:

$$\lim_{k\to\infty} \frac{\ln k}{k} = \lim_{k\to\infty} \frac{1/k}{1} = 0$$

Jeff P
  • 189
1

Every time $k$ gets multiplied by $e$, $\log_e k$ increases by $1$. So $\frac{\log_e k}{k}$ is replaced by $$ \frac{1+\log_e k}{ek} $$ and this is less than half of $\frac{\log_e k}{k}$ once $k > e^{2/(e-2)} \approx 16.2$, i.e. for every $k \ge 17$.

So every time $k$ gets multiplied by $e$, the expression gets cut down to less than half what it was before. Hence it must approach 0.
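A small Python sketch of this (starting from $k=17$, per the estimate above), showing the ratio being cut down by more than half at each step:

```python
import math

# Each time k is multiplied by e, check that ln(k)/k drops to less than half,
# starting from k = 17 (the threshold mentioned above).
k = 17.0
ratio = math.log(k) / k
for _ in range(6):
    k *= math.e
    new_ratio = math.log(k) / k
    print(round(k, 1), new_ratio, new_ratio < ratio / 2)
    ratio = new_ratio
```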

0

For the second question, first note that $(1+n/x)^x = \exp(x \log(1+n/x))$. Second, observe that the graph of the natural logarithm function $\log(x)$ is tangent to $x-1$ at $x=1$, so that $\log(1+\epsilon) = \epsilon + \mathcal O(\epsilon^2)$. Thus

$$\left(1+\frac n x\right)^x = \exp\left(x \log\left(1+\frac n x\right)\right) = \exp\left(x\left(\frac n x + \mathcal O\left(\frac 1 {x^2}\right)\right)\right) = \exp\left(n + \mathcal O\left(\frac 1 x\right)\right),$$

which tends to $\exp(n)$ as $x$ tends to $\infty$.

Not quite coincidentally, the approximation $\log(1+\epsilon) \approx \epsilon$ for $|\epsilon| \ll 1$ and the other techniques demonstrated above come in handy in all sorts of calculations dealing with probabilities of unlikely events and their complements. It's also useful to note that the approximation gives an upper bound: $\log(1+\epsilon) \le \epsilon$, with equality only when $\epsilon = 0$.
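For what it's worth, a small Python sketch (the sample values of $\epsilon$ are arbitrary) showing $\log(1+\epsilon)\le\epsilon$ with the gap shrinking like $\epsilon^2/2$:

```python
import math

# Check log(1 + eps) <= eps, with the gap shrinking like eps^2/2 as eps -> 0.
for eps in [0.5, 0.1, 0.01, 0.001]:
    val = math.log1p(eps)            # log(1 + eps), accurate for small eps
    print(eps, val, eps - val, (eps - val) / eps**2)
```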