
To prove $\lim\limits_{n\to\infty} \left(1+\frac{x}{n}\right)^{n}$ exists, we prove that the sequence $$f_n=\left(1+\frac{x}{n}\right)^n$$ is monotonically increasing and bounded above.

Proof Attempt:


We begin by showing $f_n=\left(1+\frac{x}{n}\right)^n$ is monotonically increasing by looking at the ratio of consecutive terms: \begin{align*} \frac{f_{n+1}}{f_n} &=\frac{\left(1+\frac{x}{n+1}\right)^{n+1}}{\left(1+\frac{x}{n}\right)^{n}} \tag{Definition of $f_n$} \\ &=\frac{\left(1+\frac{x}{n+1}\right)^{n+1}\left(1+\frac{x}{n}\right)}{\left(1+\frac{x}{n}\right)^{n}\left(1+\frac{x}{n}\right)} \tag{Multiplication by $\frac{\left(1+\frac{x}{n}\right)}{\left(1+\frac{x}{n}\right)}$} \\ &=\frac{\left(1+\frac{x}{n+1}\right)^{n+1}}{\left(1+\frac{x}{n}\right)^{n+1}}\left(1+\frac{x}{n}\right) \tag{Simplify $a^n\cdot a = a^{n+1}$} \\ &=\left(\frac{1+\frac{x}{n+1}}{1+\frac{x}{n}}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Simplify $\frac{a^{n+1}}{b^{n+1}}=\left(\frac{a}{b}\right)^{n+1}$} \\ &=\left(\frac{\frac{n+1+x}{n+1}}{\frac{n+x}{n}}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Common denominators} \\ &=\left(\frac{n+1+x}{n+1}\cdot \frac{n}{n+x}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Simplify $\frac{\frac{a}{b}}{\frac{c}{d}}=\frac{a}{b}\cdot \frac{d}{c}$} \\ &=\left(\frac{n^2+n+nx}{(n+1)(n+x)}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Distribute $(n+1+x)n$} \\ &=\left(\frac{n^2+n+nx+x-x}{(n+1)(n+x)}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Add and subtract $x$} \\ &=\left(\frac{(n+1)(n+x)-x}{(n+1)(n+x)}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Factor $n^2+n+nx+x$} \\ &=\left(1+\frac{-x}{(n+1)(n+x)}\right)^{n+1}\left(1+\frac{x}{n}\right) \tag{Simplify $\frac{a+b}{c}=\frac{a}{c}+\frac{b}{c}$} \\ &\ge\left(1+\frac{-x}{(n+x)}\right)\left(1+\frac{x}{n}\right) \tag{Bernoulli: $(1+t)^{n+1} \ge 1+(n+1)t$} \\ &=\left(\frac{n}{n+x}\right)\left(\frac{n+x}{n}\right) \tag{Common denominators} \\ &=1 \tag{Simplify $\frac{a}{b} \cdot \frac{b}{a}=1$} \end{align*} Since $\frac{f_{n+1}}{f_n}\ge 1$, we have $f_{n+1}\ge f_n$, which shows the sequence $f_n$ is monotonically increasing for all $n \in \mathbb{N}$.
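
(As a quick numerical sanity check of this ratio, not part of the proof: the small Python snippet below evaluates $f_{n+1}/f_n$ for a few sample values of $x$ and $n$; the function name and sample values are just my own choices.)

```python
# Sanity check: the ratio f_{n+1}/f_n for a few sample values of x and n.
def f(x, n):
    return (1 + x / n) ** n

for x in [0.5, 2.0, -0.5]:
    for n in [1, 2, 5, 10, 100]:
        ratio = f(x, n + 1) / f(x, n)
        print(f"x = {x:5.1f}, n = {n:4d}, f_(n+1)/f_n = {ratio:.6f}")
# For these samples the ratio is always >= 1, consistent with the claim above.
```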

Next, we show $f_n=\left(1+\frac{x}{n}\right)^n$ is bounded above. Note that \begin{align*} f_n &=\left(1+\frac{x}{n}\right)^n \tag{Definition of $f_n$} \\ &=\sum_{k=0}^n \binom{n}{k} (1)^{n-k} \left(\frac{x}{n}\right)^{k} \tag{Binomial Theorem} \\ &=1+\frac{n}{1!}\left(\frac{x}{n}\right)+\frac{n(n-1)}{2!}\left(\frac{x}{n}\right)^2+\frac{n(n-1)(n-2)}{3!}\left(\frac{x}{n}\right)^3+\cdots+\frac{n!}{n!}\left(\frac{x}{n}\right)^n \\ &=1+\frac{\frac{n}{n}}{1!}x+\frac{\frac{n(n-1)}{n^2}}{2!}x^2+\frac{\frac{n(n-1)(n-2)}{n^3}}{3!}x^3+\cdots+\frac{\frac{n!}{n^n}}{n!}x^n \tag{Simplify}\\ &=1+\frac{1}{1!}x+\frac{\left(1-\frac{1}{n}\right)}{2!}x^2+\frac{\left(1-\frac{1}{n}\right)\left(1-\frac{2}{n}\right)}{3!}x^3+\cdots+\frac{\left(1-\frac{1}{n}\right)\left(1-\frac{2}{n}\right)\cdots \left(1-\frac{n-1}{n}\right)}{n!}x^n \\ & \le 1+\frac{1}{1!}x+\frac{1}{2!}x^2+\frac{1}{3!}x^3+\cdots+\frac{1}{n!}x^n \tag{$1-\frac{k}{n}<1$} \\ & = \sum_{k=0}^n \frac{1}{k!} x^k \tag{Sigma notation}\\ & \to \sum_{k=0}^\infty \frac{1}{k!}x^k \tag{as $n \to \infty$} \\ & = \sum_{k=0}^K \frac{1}{k!} x^k + \sum_{k=K+1}^\infty \frac{1}{k!}x^k \tag{$\exists K$, $k>K$ implies $k! \ge (2x)^k$}\\ & \le \sum_{k=0}^K \frac{1}{k!} x^k + \sum_{k=K+1}^\infty \frac{1}{(2x)^k} x^k \tag{$k! \ge (2x)^k$ implies $\frac{1}{k!} \le \frac{1}{(2x)^k}$}\\ & = \sum_{k=0}^K \frac{1}{k!} x^k + \sum_{k=K+1}^\infty \frac{1}{2^k} \tag{$\frac{1}{(2x)^k}x^k=\frac{1}{2^k x^k}x^k = \frac{1}{2^k}$}\\ &= \sum_{k=0}^K \frac{1}{k!} x^k + \frac{1}{2^K} \tag{Geometric series evaluation} \end{align*} which is finite. Thus, the sequence $f_n$ is bounded above. Since it is both monotonically increasing and bounded above, it is convergent by the Monotone Convergence Theorem.
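
(Again purely as a numerical sanity check, assuming Python: the snippet below compares $f_n$ with the partial sum $\sum_{k=0}^n \frac{x^k}{k!}$ used as the bound, for a sample positive $x$; the names are my own.)

```python
# Sanity check: f_n versus the partial sum used as the upper bound (x > 0).
from math import factorial

def f(x, n):
    return (1 + x / n) ** n

def partial_sum(x, n):
    # sum_{k=0}^{n} x^k / k!
    return sum(x ** k / factorial(k) for k in range(n + 1))

x = 1.5  # sample positive x
for n in [1, 2, 5, 10, 50]:
    print(n, f(x, n), partial_sum(x, n), f(x, n) <= partial_sum(x, n))
```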


Is my proof correct? I am suspicious of the step that says "$\rightarrow \sum_{k=0}^\infty \frac{1}{k!}x^k$ (as $n \to \infty$)", and I would like to avoid taking another limit in the middle of the boundedness proof.

I also compared my proof to the following references and saw something worrisome:

All of the above proofs either assumed $x>0$ or considered cases where $x>0$ and $x<0$ separately, but I do not know why. In fact, the third reference considers $\left(1-\frac{x}{n}\right)^{-n}$ for $x>0$ (I think this is a typo and should read $x<0$), but I am not sure why the negative exponent is needed (we are talking about a negative value of $x$, not negative $n$.)

I could only find one proof that did not consider different cases on the sign of $x$:

  • Bonus reference 4, which uses absolute values, but I am not sure why these are necessary either.

I would like to verify my proof and ask 3 questions:

  1. Why is it necessary to consider cases $x>0$ and $x<0$ separately? Did any step in my proof implicitly assume that $x>0$? If so, which one?

  2. Is there any way to avoid taking a limit in the middle of the boundedness proof?

  3. Substituting $n=1$ in my boundedness proof shows $1+x \le \sum_{k=0}^n \frac{1}{k!}x^k$. Does this imply $1+x \le \lim\limits_{n\to \infty} \left(1+\frac{x}{n}\right)^n$, since $f_n$ is an increasing function of $n$? Can this be seen explicitly, or would that require a separate proof?

Thank you.

EthanAlvaree
  • There is no typo in reference 3 about $(1-(x/n))^{-n}$. Here $x>0$ and it is related to the expression in your question in a helpful way. You can also try to study the alternative approach mentioned in my blog post (reference 3) which works by first dealing with $x=1$. – Paramanand Singh Aug 25 '21 at 02:05
  • Also note that if $n>|x|$ then Bernoulli inequality shows $(1+(x/n))^n\geq 1+x$ and then the limit is also $\geq 1+x$. – Paramanand Singh Aug 25 '21 at 02:06
  • Thanks, Paramanand. Your blog has been super helpful, and I wanted to reach out to you privately but couldn't find a contact email. I have read the blog several times, and while I understand the case for $x>0$, I am having trouble understanding the case for $x<0$. Could you please explain more about why we need to consider $\left(1-\frac{x}{n}\right)^{-n}$? Also, what step in my proof implicitly assumes positive $x$, or does my proof hold for all $x \in \mathbb{R}$ without the need for cases? – EthanAlvaree Aug 25 '21 at 02:14
  • Well, the proof that the sequence increases does not depend on the sign of $x$. You have got it right in your question. But for boundedness the proof breaks down when you get to the series $\sum_{k=0}^n x^k/k!$. That step needs $x\geq 0$. For $x<0$ the idea is much simpler. If $x<0$ and $n>|x|$ we have $0<(1+(x/n))<1$ and hence $(1+(x/n))^n<1$, and we get the boundedness easily for $x<0$. – Paramanand Singh Aug 25 '21 at 02:22
  • Also remember that for $x<0$ the increasing nature and boundedness are valid only when $n>|x|$ so you may say that the sequence starts to behave nicely after a certain point and not directly from $n=1$. – Paramanand Singh Aug 25 '21 at 02:25
  • If you want to discuss this in more detail you may use chatrooms. Contact via email is not of much help because we don't have mathjax in mail. – Paramanand Singh Aug 25 '21 at 02:29
  • Use https://chat.stackexchange.com/rooms/128918/room-for-discussion-with-ethanalvaree if needed. – Paramanand Singh Aug 25 '21 at 02:40
  • Can you explain what the error is in the step $\sum_{k=0}^n \frac{1}{k!} x^k$ when $x<0$? – EthanAlvaree Aug 25 '21 at 19:47
  • The $(1+\frac{x}{n})<1$ when $x<0$ argument makes perfect sense to me. It is a very simple argument. So why is it necessary to consider $\left(1-\frac{x}{n}\right)^{-n}$ as in reference 3? I do not follow this argument. – EthanAlvaree Aug 25 '21 at 19:49
  • Please feel free to post an answer to this question. I am opposed to using a chatroom, since chats have a tendency to be deleted over time. I would like a permanent reference. – EthanAlvaree Aug 25 '21 at 19:51
  • In your proof, the issue is how Bernoulli's inequality is applied. For any positive integer $n$, $(1+x)^n\geq 1+nx$ if $x\geq-1$. One can see that by an induction argument. If $x<-1$ things may go wrong: $x=-4$ and $n=3$, for example, yields $(1+x)^n=-27<-11=1+nx$, so the inequality fails. – Mittens Aug 26 '21 at 02:27

2 Answers


Let's observe that $$\left(1+\frac{x}{n}\right)^n=1+x+\sum_{k=2}^n\frac{x^k}{k!}\left(1-\frac{1}{n}\right)\dots\left(1-\frac{k-1}{n}\right)=\sum_{k=0}^{n}a_k\frac{x^k}{k!}\tag{1}$$ where $$a_0=a_1=1,a_k=\left(1-\frac{1}{n}\right)\dots\left(1-\frac{k-1}{n}\right),k=2,\dots,n$$ Note that the coefficients $a_k$ are positive and do not exceed $1$. If $x>0$ then we can use the implication $$a_k\leq 1\implies a_kx^k\leq x^k\tag{2}$$ to bound the sum in $(1)$ with $\sum_{k=0}^n x^k/k!$. But if $x<0$ the implication $(2)$ does not hold for odd values of $k$ (rather the inequality gets reversed), and hence this comparison no longer yields an upper bound.
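
For one concrete instance of the reversal (the numbers are just illustrative), take $a_k=\tfrac12$, $x=-1$ and $k=3$: then $$a_kx^k=\tfrac12\cdot(-1)^3=-\tfrac12>-1=x^k,$$ so for odd $k$ and $x<0$ the inequality in $(2)$ points the wrong way.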

Next I explain the argument used in reference 3 (my blog). Let $$F(x, n) =\left(1+\frac{x}{n}\right)^n,\quad G(x,n)=\left(1-\frac{x}{n}\right)^{-n}\tag{3}$$ so that we have $$F(-x, n)\, G(x, n) =1\tag{4}$$ Let $x>0$; then we have already established (as in my blog or in your question) that $F(x, n)$ is increasing (as a function of $n$) and bounded above, and hence the limit $\lim_{n\to\infty} F(x, n)$ exists.

To handle negative values of $x$ I treat the expression $F(-x, n)$ with $x>0$. A better approach would have been to assume $x<0$, write $x=-y$ with $y>0$, and focus on $F(-y, n)$, but I have reused the symbol $x$ instead of introducing another symbol $y$. Thus I keep $x>0$ and handle both $F(x, n)$ and $F(-x, n)$.

Now, as in reference 3, we have $x>0$ and $n>x$, so that we can apply the general binomial theorem to write an infinite power series (in $x$) for $G(x, n)$ as $$G(x, n) =\sum_{k=0}^{\infty}b_k\frac{x^k}{k!}\tag{5}$$ where $$b_0=b_1=1,\quad b_k=\left(1+\frac{1}{n}\right)\dots\left(1+\frac{k-1}{n}\right),\quad k>1$$ Note that $b_k\geq 1$ and that each $b_k$ decreases as $n$ increases; thus $G(x, n)$ is decreasing in $n$ and bounded below by $\sum_{k=0}^{\infty}x^k/k!$, hence bounded below by $1+x$. Being decreasing and bounded below, $G(x, n)$ tends to a limit which is not less than $1+x$ (so that the limit is positive). Then $F(-x, n) =1/G(x,n)$ also tends to a positive limit.
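
(A small numerical illustration of $(3)$, $(4)$ and the decreasing behaviour of $G$, assuming Python; it is not part of the argument.)

```python
# Check: F(-x, n) * G(x, n) = 1, and G(x, n) decreases as n grows (for n > x).
def F(x, n):
    return (1 + x / n) ** n

def G(x, n):
    return (1 - x / n) ** (-n)

x = 1.5
for n in [2, 5, 10, 100, 1000]:
    print(n, F(-x, n) * G(x, n), G(x, n))
# The product is identically 1, and G(x, n) decreases toward exp(1.5) = 4.48...
```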

Your own approach is smarter because it shows the increasing nature of $F(x, n)$ for all $x$ using the Bernoulli inequality, whereas I had to introduce $G(x, n)$ to deal with some cases.

  • Hi Paramanand, I understand that the implication $$a_k \le 1 \implies a_k x^k \le x^k$$ holds only if $x >0$. But would it be correct to say $$a_k \le 1 \implies a_k x^k \le |x^k|$$ and bound the series using absolute values? I think that is the approach of this proof: https://proofwiki.org/wiki/Exponential_Function_is_Well-Defined/Real/Proof_2 – EthanAlvaree Aug 30 '21 at 21:28
  • Your application of Bernoulli's inequality is correct as long as $y_n=-\frac{x}{(n+x)(n+1)}\geq-1$. In reference (2), which you mention in your post, Mark Viola assumes $x> -1$, in which case $y_n>-1$ for all $n\geq1$, and so the sequence $f_n$ (in your notation) is monotone increasing for all $n\in\mathbb{N}$. However, he also points out that for any $x$, one can consider the first $n_0\in\mathbb{N}$ such that $n_0+x>0$. Then $y_n>-1$ for all $n\geq n_0$, and so the sequence $f_n$ is monotone increasing for all $n\geq n_0$. This shows that the limit exists for all $x\in\mathbb{R}$.

  • An alternative approach shows that it is enough to establish the existence of the limit $e=\lim_n(1+\tfrac1n)^n$ in order to show that $\lim_n(1+\tfrac{a}{n})^n$ exists and equals $e^a$ for all $a\in\mathbb{R}$.

Indeed, from the existence of $e=\lim_n(1+\tfrac1n)^n$, we have that
$\lim_{x\rightarrow\infty}(1+\tfrac{1}{x})^x$ (where $x$ ranges over the positive reals) exists and has limit $e$. To see this, notice that $n_x:=\lfloor x\rfloor\leq x\leq \lfloor x\rfloor +1 = n_x+1$, hence $$\big(1+\tfrac{1}{n_x+1}\big)^{n_x}\leq\big(1+\tfrac1x\big)^{n_x}\leq\big(1+\tfrac1x\big)^x\leq\big(1+\tfrac{1}{n_x}\big)^x\leq\big(1+\tfrac{1}{n_x}\big)^{n_x+1} $$ As $n_x\xrightarrow{x\rightarrow\infty}\infty$, one gets that $$ \lim_{n_x\rightarrow\infty}\big(1+\tfrac{1}{n_x+1}\big)^{n_x}=\lim_{n_x\rightarrow\infty}\frac{\big(1+\tfrac{1}{n_x+1}\big)^{n_x+1}}{1+\tfrac{1}{n_x+1}}=e $$ and $$ \lim_{n_x\rightarrow\infty}\big(1+\tfrac{1}{n_x}\big)^{n_x+1}=\lim_{n_x\rightarrow\infty}\big(1+\tfrac{1}{n_x}\big)^{n_x}\big(1+\tfrac{1}{n_x}\big)=e $$ Therefore $\lim_{x\rightarrow\infty}\big(1+\tfrac1x\big)^x=e$. As a consequence, for $a>0$, we have $$\lim_{n\rightarrow\infty}\big(1+\tfrac{a}{n}\big)^n=\lim_{n\rightarrow\infty}\left(\big(1+\tfrac{a}{n}\big)^{\tfrac{n}{a}}\right)^a=e^a $$ by continuity of $t\mapsto t^a$ at $t=e$. Similarly, for $x<0$, let $y=-x$ so that $y>0$. Then $$\big(1+\tfrac{1}{x}\big)^x=\big(1-\tfrac1y\big)^{-y}=\left(\frac{1}{1-\frac1y}\right)^y=\left(1+\tfrac{\tfrac1y}{1-\tfrac1y}\right)^y=\left(1+\tfrac{1}{y-1}\right)^{y-1}\big(1+\tfrac{1}{y-1}\big)$$ Thus, $$\lim_{x\rightarrow-\infty}\big(1+\tfrac1x\big)^x=\lim_{y\rightarrow\infty}\left(1+\tfrac{1}{y-1}\right)^{y-1}\big(1+\tfrac{1}{y-1}\big)=e$$ As a consequence, for $a>0$, $$\lim_{n\rightarrow\infty}\big(1-\tfrac{a}{n}\big)^n=\lim_{n\rightarrow\infty}\left(\big(1-\tfrac{a}{n}\big)^{-\tfrac{n}{a}}\right)^{-a}=e^{-a}$$
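
(A quick numerical check of the two consequences above, assuming Python.)

```python
# Check: (1 + a/n)^n -> e^a and (1 - a/n)^n -> e^(-a) as n grows.
from math import exp

a = 2.0
for n in [10, 100, 10_000, 1_000_000]:
    print(n, (1 + a / n) ** n, (1 - a / n) ** n)
print("targets:", exp(a), exp(-a))
```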

  • Notice that all of this requires a definition of real powers, that is, a rigorous definition of $a^x$ for $a>0$ and $x\in\mathbb{R}$. This is of course done by starting with rational powers, showing monotonicity, and then using axioms such as that of the supremum or its equivalents. Other modern approaches are based on integration. One defines $\ln(x)=\int^x_1\frac{1}{t}\,dt$ for $x>0$ and shows that $\ln$ is a continuous, monotone increasing function (furthermore, it is differentiable by the Fundamental Theorem of Calculus) taking values in all of $\mathbb{R}$, with the property that $\ln(ab)=\ln(a)+\ln(b)$ for all $a,b>0$. Then the exponential function is defined as the inverse of $\ln$. Theorems such as the inverse function theorem imply differentiability and the property that $\exp(a+b)=\exp(a)\exp(b)$. The number $e$ is then $\exp(1)$.
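
(A rough numerical sketch of the integral-based construction, assuming Python: $\ln$ is approximated by a midpoint Riemann sum of $1/t$, $\exp$ is obtained by inverting it with bisection, and $e=\exp(1)$. This only illustrates the definitions; it is not a rigorous development.)

```python
# ln(x) approximated by a midpoint Riemann sum of 1/t over [1, x].
def ln(x, steps=10_000):
    h = (x - 1) / steps
    return sum(h / (1 + (i + 0.5) * h) for i in range(steps))

# exp(y) obtained by inverting ln with bisection (ln is increasing).
def exp_(y, lo=0.5, hi=10.0, iters=60):
    for _ in range(iters):
        mid = (lo + hi) / 2
        if ln(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(exp_(1.0))  # approximately e = 2.71828...
```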
Mittens
  • Thank you, Oliver! Do you have any theorem-proof-style references for the real powers approach? I think this is the approach I would like to follow (I would like to avoid the integration approach that defines $\ln(x)$ first). Thanks again! – EthanAlvaree Aug 30 '21 at 21:26
  • There are many sources. In Rudin, W., Principles of Mathematical Analysis, McGraw Hill, 3rd edition, 1974, p. 10, they construct $1/n$ roots of any positive real number through the use of the supremum, and in Exercise 6 of Chapter 1 they show how to construct the real powers of real numbers. The approach there is based on Landau, E.G.H., Foundations of Analysis, Chelsea Publishing Co., 1951. A slightly different approach can be found in Kudriatzev, Kurs Matematicheskogo Analyza, Tom 1, Moscow, 1981, where they deduce the existence of fractional powers through continuity and inversion. – Mittens Aug 30 '21 at 22:07