
I often hear in calculus classes that, for a constant $a$, $\lim_{x \to 0} a/x = \infty$. But on closer inspection of this limit, I'm not so sure. Yes, of course, dividing by zero is not defined. But if $\lim_{x \to 0} a/x = \infty$, does that also mean that

$$\lim_{n \to \infty} \sum_{i=0}^{n} 0 = a$$

While I have not taken real analysis, I understand that calculus is founded on using limits to describe the infinite. I suppose I'm a little shaky on what we are and are not allowed to say when describing vertically asymptotic behavior. Or perhaps, by the rules of limits, does $\lim_{n \to \infty} \sum_{i=0}^{n} 0$ actually equal a constant?

Dutonic
  • $\sum_{0}^n 0 = 0$ for all $n$ so the limit is also $0$. – geetha290krm Oct 08 '22 at 23:10
  • I am still confused as to how you even arrived at the idea that $\lim_{x\to 0}\frac a x$ diverging could imply that $\lim_{n\to\infty}\sum_{i=1}^n 0^i = a$. – Seeker Oct 08 '22 at 23:41
  • "$\lim_{x \to 0} a/x = \infty$" is wrong. If you think you heard it in Calculus course, change class immediately (or pay more attention). If $a>0$, $\lim_{x \to 0^+} a/x = \infty$ and $\lim_{x \to 0^-} a/x = -\infty$, so $\lim_{x \to 0} a/x$ does not exist. – Taladris Oct 09 '22 at 00:02
  • @Taladris I believe they are referring to the limit of the sequence $(\frac a x)_{n=1}^\infty$ and not of the function $f(x)=\frac a x$. And as far as I know, there is no notion of left or right limits of sequences. – Seeker Oct 09 '22 at 00:12
  • @Seeker: I guess you mean $(\frac a n)_{n=1}^\infty$. I don't know. In that case, my comment "Pay more attention" is still valid... – Taladris Oct 09 '22 at 00:18
  • @Taladris Yeah I meant to type $(\frac a n)_{n=1}^\infty$. And I agree that OP should pay more attention to the topics being taught. – Seeker Oct 09 '22 at 00:29
  • @Seeker You guys are all right. This was a stupid question. What I should have asked was "Is a sum of 0+0+0+0... an infinite number of times = 0, and can we prove it?" – Dutonic Oct 09 '22 at 00:37

2 Answers


Continuing from your last comment. Note that the infinite series $\sum_{i=1}^\infty 0^i$ converges to a real number $L$ if the sequence formed by taking the partial sums of the series converges to $L$. As pointed out in the comments, the partial sums $s_n=\sum_{i=1}^n 0^i$ form the sequence $(s_n)_{n=1}^\infty=0, 0, 0,\dots$ and we know that $\lim_{n\to\infty}s_n=0$. Thus, the infinite series $$\sum_{i=1}^\infty 0^i=0 + 0 + 0+\cdots $$ converges to the real number $0$.
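To make the convergence fully explicit, here is a minimal $\varepsilon$-$N$ check (just unwinding the standard definition of sequence convergence): since every partial sum equals $0$, for any $\varepsilon>0$ we may take $N=1$, and then

$$|s_n - 0| = \left|\sum_{i=1}^{n} 0^i\right| = 0 < \varepsilon \quad\text{for all } n \ge N,$$

so $\lim_{n\to\infty} s_n = 0$ by the definition of the limit of a sequence.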

Seeker

As any standard text in calculus will tell you, a limit such as $\displaystyle\lim_{x\to a}f(x)=L$ expresses that, as $x$ gets closer to $a$, the function value $f(x)$ should also become correspondingly closer to the limit $L$. Overlooking the technical details of how that idea is made into a rigorous, sensible proposition, that's all there is to it.
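For the curious, one standard way that idea is made precise is the $\varepsilon$-$\delta$ formulation:

$$\lim_{x\to a}f(x)=L \iff \forall\varepsilon>0\ \exists\delta>0:\ 0<|x-a|<\delta \implies |f(x)-L|<\varepsilon.$$

Nothing below depends on these details, but this is the "technical detail" being glossed over.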

So what would $\displaystyle\lim_{x\to \infty}f(x)=L$ or $\displaystyle\lim_{x\to a}f(x)=\infty$ mean? As above, they should mean that, as "$x$ gets closer to $\infty$, yada yada", or "yada yada, then $f(x)$ gets closer to $\infty$". But is any real number closer to $\infty$ than any other? No: by definition, they are all infinitely far from it, whether they are $1$ or $10^{10^{10^{10}}}$.

In truth, whenever we use the symbol "$\infty$" in a formula involving a limit, we are committing an abuse of notation that conveniently conforms to what we would write for finite numbers. That is, limits in which $x$ or $f(x)$ are "approaching infinity" are not really getting "closer" to it but simply getting boundlessly larger. They are not con-verging to a number but di-verging away. Your formula $\displaystyle \lim_{x\to 0^{+}} \frac{a}{x}=``\infty"$ (which we may otherwise just write as "does not exist") $^{(1)}$ simply means that the expression in the limit gets boundlessly larger as $x$ is brought to $0$ (from the positive side). See MSE Q127689.
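For comparison, the precise statement hiding behind that abused notation (for $a>0$) makes no mention of a point called "$\infty$" at all:

$$\lim_{x\to 0^{+}}\frac{a}{x}=\infty \iff \forall M>0\ \exists\delta>0:\ 0<x<\delta \implies \frac{a}{x}>M,$$

which holds here with $\delta=a/M$. It only asserts that the values eventually exceed every finite bound $M$.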

Infinity is not a number in the conventional sense. You could (perfectly legitimately) bend the rules of your mathematical theory to allow infinity to be considered a number (as in the extended reals), but in so doing, identities as fundamental as $a-a=0$ are no longer necessarily true, and you therefore incur limitations on the arithmetic operations you are able to carry out. See MSE Q60766. Explicitly, the extended real numbers are not a field. The standard workaround to this is to simply prohibit $\infty$ from being used as a number in the theory, in much the same way as we prohibit division by zero. That is, we work strictly in the real numbers (read, finite numbers). So, your implied paradoxical proof of
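For instance, in the extended reals one keeps $\infty+x=\infty$ for every real $x$, but then $\infty-\infty$ cannot be consistently defined: it "should" equal both $0$ (from $a-a=0$) and $1$ (from $(\infty+1)-\infty$). The product $0\cdot\infty$ and the quotient $\infty/\infty$ are left undefined for the same reason.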

$$\begin{align}\frac{a}{0}\,&``="\,\infty\,``="\,\left(1+1+\ldots+1\right) \\ a\,&``="\,0\cdot\left(1+1+\ldots+1\right)\,``="\,\left(0+0+\ldots+0\right) \end{align}$$

simply would not hold, because we prohibit treating $\infty$ as an ordinary number (and also prohibit division by zero) for the very purpose of avoiding contradictions like this. See MSE Q36289. Your argument was also flawed in that limits cannot be so casually rearranged: $\lim \frac{a}{b}=\lim c$ does not necessarily imply $\lim a=\lim (c\cdot b)$. Explicitly, you cannot in general move terms out of the limit operator.
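A quick way to see this: for any two positive constants $a\neq b$,

$$\lim_{x\to 0^{+}}\frac{a}{x}=\infty=\lim_{x\to 0^{+}}\frac{b}{x},$$

so if "multiplying both sides by $x$" were a valid move inside limits, we could conclude $a=\infty\cdot 0=b$, which is absurd. The form $\infty\cdot 0$ is indeterminate precisely because no single value can be consistently assigned to it.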

In the case of your sum of zeros, this is essentially Zeno's paradox, which has a great body of work devoted to it. A good place to start studying would be SEP: Zeno’s Paradoxes. The bottom line is that we would evaluate any sum of zeros as zero, $\displaystyle \sum 0 = 0$, whether it has finitely many of them or infinitely many. In order to sum a set of terms to a nonzero constant (as in an integral), the terms must be (mostly) nonzero.
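The contrast with an integral is instructive: in a Riemann sum the summands shrink toward zero while their number grows, and the two effects can balance, whereas a sum of exact zeros never accumulates anything. For instance,

$$\lim_{n\to\infty}\sum_{i=1}^{n}\frac{1}{n}=\lim_{n\to\infty}n\cdot\frac{1}{n}=1, \qquad\text{but}\qquad \lim_{n\to\infty}\sum_{i=1}^{n}0=\lim_{n\to\infty}0=0.$$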

$^{(1)}$ You ought to have written $0^+$ in your formula, since you are strictly approaching $0$ from the positive side; for $a>0$, the limit from $0^-$ would have given you $-\infty$.

Jam