
The usual proof of the convergence of a geometric series with ratio $C$, $|C|\in [0,1)$, makes use of the formula $$\sum_{0\leq k \leq n} C^k = \frac{1-C^{n+1}}{1-C}.$$

I'm looking for alternative ways to prove it. The motivation is that someone who has never seen this formula might have a hard time proving that the geometric series converges, unless there are other, perhaps more insightful, ways to prove it.

Carla_
  • You can test convergence with the ratio test. – Giorgos Giapitzakis Jul 02 '21 at 20:14
  • @GiorgosGiapitzakis How do you prove the ratio test without first proving convergence of geometric series? – Carla_ Jul 02 '21 at 20:15
  • You're right. I don't think you can prove it without the geometric series – Giorgos Giapitzakis Jul 02 '21 at 20:17
  • It's a tricky brief. Obviously using ratio or root test would be completely circular. Establishing an upper bound (and using monotone convergence theorem) is tricky, because the sum tends to $\infty$ as $C \to 1^-$. Using one of the comparison tests requires another (presumably more intuitive) series to work with, and they don't get much more intuitive than geometric series by the given argument. Your best hope would be to compare the geometric series with some kind of telescoping series based on $C$, but note that the first term would have to tend to $\infty$ as $C \to 1^-$. – Theo Bendit Jul 02 '21 at 20:23
  • One could argue that if $s_n$ is the $n$th partial sum, then $s_{n+1} = 1 + Cs_n,$ from which it is clear that the only possible limit is $1/(1 - C).$ Then it is also natural to write $$ s_{n+1} - \frac1{1 - C} = C\left(s_n - \frac1{1 - C}\right). $$ This leads to the formula you're trying to avoid, but at least provides a way to 'discover' it. Also, one could avoid that formula, and argue from the least upper bound axiom that the strictly increasing sequence of negative numbers $s_n - 1/(1 - C)$ has upper bound $0$ (because supposing it $< 0$ leads to a contradiction) therefore limit $0.$ – Calum Gilhooley Jul 02 '21 at 20:41
  • (That should be "least upper bound $0$", of course.) – Calum Gilhooley Jul 02 '21 at 20:50
  • You could prove $\sum_{n=0}^{\infty}|C|^n$ converges using the integral test where $C\in (-1,1)$ is constant. Then use the fact that an absolutely convergent series is convergent. – Matthew H. Jul 02 '21 at 22:04
  • Beautiful question. It is actually amazing that almost all of convergence theory depends on the geometric sum – lalala Jul 03 '21 at 08:19

5 Answers


As @TheoBendit suggested, comparison with a telescoping series can work:

For example, the classic telescoping series $$ \sum_{1\le n\le N} {1\over n(n+1)} \;=\; \sum_{1\le n\le N} \Big({1\over n}-{1\over n+1}\Big) \;=\; 1 - {1\over N+1}. $$ The tails go to zero, so by whatever criterion we like, this converges.

On the other hand, for any $|C|<1$ and large-enough $n$, $|C^n|\le {1\over n(n+1)}$. So, disregarding finitely many terms, the (convergent) telescoping series dominates the geometric series, which therefore converges by comparison.
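As a quick numerical sanity check of the comparison (an illustration, not part of the proof; the sample ratio $C=0.9$ is arbitrary), one can locate the index past which the geometric term sits below the telescoping term:

```python
# Numerical sanity check (not part of the proof): for a sample ratio C with
# |C| < 1, the geometric term C^n is eventually dominated by the telescoping
# term 1/(n(n+1)).
C = 0.9

# Find the first index n at which C^n <= 1/(n(n+1)).
n = 1
while C**n > 1.0 / (n * (n + 1)):
    n += 1

# From that index on, the inequality persists (spot-check a long range),
# so all but finitely many geometric terms lie below the telescoping terms.
assert all(C**k <= 1.0 / (k * (k + 1)) for k in range(n, n + 2000))
```

The exponential decay of $C^n$ eventually beats the polynomial decay of $1/(n(n+1))$, though the crossover index grows as $C \to 1^-$.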

paul garrett

For $C \in [0,1)\,$ the sequence of partial sums is monotonically increasing and bounded above, thus convergent.

  • Increasing: $\;s_n = \sum_{k=0}^n C^k = s_{n-1} +C^n \ge s_{n-1}\,$.

  • Bounded: $\;s_n= 1 + C\,s_{n-1}\,$, so if $s_{n-1} \le L$ then $s_n \le 1 + C\,L$. The latter implies $s_n \le L$ when $1 + C\,L \le L \iff L \ge \frac{1}{1-C}\,$, so by induction $\frac{1}{1-C}$ is an upper bound.

When $C \in (-1,0)$ the partial sum $\sum_{0\leq k \leq n} C^k = \sum_{0\leq 2k \leq n} \left(C^2\right)^k + C\sum_{0\leq 2k + 1 \leq n} \left(C^2\right)^k$ where $C^2 \in (0,1)$ so the problem reduces to the first case.
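Both parts of this argument are easy to check numerically (an illustration only; the sample ratios $0.7$ and $-0.7$ and the truncation lengths are arbitrary):

```python
# Quick numerical illustration (not a proof): for 0 <= C < 1 the partial sums
# increase and never exceed 1/(1-C); for C in (-1,0) the even/odd split
# reduces the series to the ratio C^2 in (0,1), as in the answer above.
C = 0.7
bound = 1.0 / (1.0 - C)
s = 0.0
prev = -1.0
for k in range(2000):
    s += C**k
    assert prev <= s <= bound + 1e-12  # increasing and bounded by 1/(1-C)
    prev = s

# Even/odd decomposition of the partial sum for a negative ratio.
C = -0.7
n = 21
lhs = sum(C**k for k in range(n + 1))
even = sum((C * C)**k for k in range(n // 2 + 1))            # terms with 2k <= n
odd = C * sum((C * C)**k for k in range((n - 1) // 2 + 1))   # terms with 2k+1 <= n
assert abs(lhs - (even + odd)) < 1e-12
```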

dxiv

If you can show convergence for a single $C$ (e.g. $C=1/2$, using Zeno's argument), then for any $|C|<1$ you can choose $n$ so that $|C|^n<1/2$ and break the series into $n$ subseries, one for each residue of the index mod $n$. Each of these subseries is geometric with ratio $C^n$, so it converges by comparison with the $1/2$ series. There are finitely many of them, so by interleaving them, the whole series converges.
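The splitting step can be sketched numerically (an illustration under arbitrary choices: ratio $C=0.95$ and a finite truncation $N$ stand in for the infinite series):

```python
# Sketch of the interleaving idea: choose n with C^n < 1/2, split the terms
# C^k into n subseries by k mod n; each subseries is geometric with ratio
# C^n < 1/2, and the finite interleaving recovers the original series.
C = 0.95

n = 1
while C**n >= 0.5:
    n += 1
assert C**n < 0.5  # each subseries compares against the known 1/2 case

N = 2000  # truncation for the numerical check
total = sum(C**k for k in range(N))
by_residue = [sum(C**k for k in range(r, N, n)) for r in range(n)]
assert abs(total - sum(by_residue)) < 1e-9  # interleaving recovers the sum
```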

Eric

Natural Derivation of the Closed Form

The first way of handling geometric series I saw, was to derive the closed form for the finite sum. Discovering this derivation seems pretty reasonable if one stares at the geometric sum long enough: $$ \begin{align} S&=1+a+a^2+\dots+a^{n-1}\tag{1a}\\ aS&=\phantom{1+{}}a+a^2+\dots+a^{n-1}+a^n\tag{1b}\\ (1-a)S&=1\phantom{{}+a+a^2+\dots+a^{n-1}}-a^n\tag{1c}\\ S&=\left.\left(1-a^n\right)\middle/(1-a)\right.\tag{1d} \end{align} $$ Explanation:
$\text{(1a)}$: write out the summation
$\text{(1b)}$: multiply by $a$
$\text{(1c)}$: subtract $\text{(1b)}$ from $\text{(1a)}$
$\text{(1d)}$: divide by $1-a$
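The derivation above can be verified in exact rational arithmetic (a sanity check only; the sample value $a=3/5$ is arbitrary):

```python
# Checking the derived closed form S = (1 - a^n)/(1 - a) against a direct sum,
# in exact rational arithmetic so the identity holds with no rounding.
from fractions import Fraction

a = Fraction(3, 5)
for n in range(1, 30):
    S = sum(a**k for k in range(n))     # S = 1 + a + ... + a^(n-1)
    assert (1 - a) * S == 1 - a**n      # step (1c)
    assert S == (1 - a**n) / (1 - a)    # step (1d)
```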


Euler-Maclaurin Sum Formula

The Euler-Maclaurin Sum Formula can also be applied. If we ignore the error term, the Euler-Maclaurin Sum Formula becomes $\frac{D}{1-e^{-D}}D^{-1}f$, where $D^{-1}$ is indefinite integration (which introduces the constant that appears in the formula). This is discussed a bit in this answer.

Let $a\gt0$ and $a\ne1$. Note that $D^{-1}a^x=\frac{a^x}{\log(a)}+C$. For non-negative $k\in\mathbb{Z}$, $D^ka^x=\log(a)^ka^x$. $$ \begin{align} \frac{D}{1-e^{-D}}D^{-1}a^x &=\frac{D}{1-e^{-D}}\left(\frac{a^x}{\log(a)}+C\right)\tag{2a}\\[3pt] &=\frac{\log(a)}{1-\frac1a}\frac{a^x}{\log(a)}+C\tag{2b}\\ &=\frac{a^{x+1}-1}{a-1}\tag{2c} \end{align} $$ Explanation:
$\text{(2a)}$: apply $D^{-1}$
$\text{(2b)}$: each application of $D$ simply introduces a factor of $\log(a)$
$\phantom{\text{(2b):}}$ thus, an application of $f(D)$ simply multiplies by $f(\log(a))$
$\text{(2c)}$: simplify and set $C=-\frac1{a-1}$ to match $\sum\limits_{k=0}^xa^k$ at $x=0$
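As a sanity check (the sample values of $a$ are arbitrary), the result $\text{(2c)}$ can be compared against the partial sum directly in exact arithmetic:

```python
# Sanity check, in exact rational arithmetic, that the result (2c),
# (a^(x+1) - 1)/(a - 1), agrees with the partial sum sum_{k=0}^x a^k
# at non-negative integer x.
from fractions import Fraction

for a in (Fraction(7, 4), Fraction(1, 3)):  # a > 0, a != 1
    for x in range(25):
        assert (a**(x + 1) - 1) / (a - 1) == sum(a**k for k in range(x + 1))
```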

For $a\in\left(e^{-2\pi},e^{2\pi}\right)\setminus\{1\}$, the series given by the Euler-Maclaurin Sum Formula actually converges, rather than merely giving an asymptotic expansion.

I am not proposing the Euler-Maclaurin Sum Formula as a way to initially approach the Geometric Sum Formula. I present it more as a point of interest: a demonstration of analyzing the series as one might other, more complicated, series.

robjohn

If $c = \dfrac1{1+b}$ where $b > 0$, then, since for $k \ge 2$ we have $(1+b)^k \ge 1+bk+b^2k(k-1)/2 \gt b^2k(k-1)/2$ (readily proved by induction), it follows that $c^k = \dfrac1{(1+b)^k} \le \dfrac{2}{b^2k(k-1)}$, so

$\begin{array}\\ \sum_{k=2}^n c^k &\le \sum_{k=2}^n\dfrac{2}{b^2k(k-1)}\\ &= \dfrac{2}{b^2}\sum_{k=2}^n\dfrac1{k(k-1)}\\ &= \dfrac{2}{b^2}\sum_{k=2}^n(\dfrac1{k-1}-\dfrac1{k})\\ &= \dfrac{2}{b^2}(1-\dfrac1{n})\\ &< \dfrac{2}{b^2}\\ &= \dfrac{2}{(\frac1{c}-1)^2}\\ &= \dfrac{2c^2}{(1-c)^2}\\ &\text{so}\\ \sum_{k=0}^n c^k &< 1+c+\dfrac{2c^2}{(1-c)^2}\\ &= \dfrac{c^3 + c^2 - c + 1}{(1-c)^2}\\ \end{array} $
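The final bound is easy to test numerically (an illustration, not part of the proof; the sample ratios are arbitrary):

```python
# Numerical check of the final bound: for several sample ratios 0 < c < 1,
# every partial sum stays below (c^3 + c^2 - c + 1)/(1 - c)^2.
for c in (0.1, 0.5, 0.9, 0.99):
    bound = (c**3 + c**2 - c + 1) / (1 - c)**2
    s = 0.0
    for k in range(5000):
        s += c**k
        assert s < bound  # partial sums never reach the bound
```

The bound strictly exceeds the limit $1/(1-c)$, since their difference works out to $(c^3+c^2)/(1-c)^2 > 0$.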

marty cohen