
I've just learned that you can sometimes represent a function as an infinite geometric series, provided the independent variable is restricted to an interval of convergence on which the absolute value of the common ratio is less than 1.

Apparently, you can also antidifferentiate that geometric series term by term to obtain a power series for an antiderivative of the original function.

For example, let's say you start with $f(x)=\frac{3x^2}{1+x^3}$ and come up with the geometric series $\sum_{n=0}^\infty 3x^2 (-x^3)^n$, whose common ratio is $-x^3$.
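To see the claim concretely, here is a minimal numerical sketch (function names are my own) comparing partial sums of the geometric series against $f(x)$ at a point inside $|x|<1$:

```python
def f(x):
    # the original function f(x) = 3x^2 / (1 + x^3)
    return 3 * x**2 / (1 + x**3)

def geometric_partial_sum(x, terms):
    # partial sum of  sum_{n=0}^{terms-1} 3 x^2 (-x^3)^n
    return sum(3 * x**2 * (-x**3) ** n for n in range(terms))

x = 0.5  # inside the interval of convergence |x| < 1
print(f(x), geometric_partial_sum(x, 50))  # the two values agree closely
```

Since the common ratio at $x=0.5$ is $-0.125$, the partial sums converge very quickly.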

Since $f(x)$ is the derivative of $\ln(1+x^3)$, it makes sense that antidifferentiating the series term by term would give an infinite-series expression for $\ln(1+x^3)$.

However, I'm wondering what happens to the radius of convergence we had for the geometric series.

Do the same rules of convergence apply to power series as they do to geometric series?

In this case, antidifferentiating term by term, we get: $$x^3 -\frac{1}{2}x^6+\frac{1}{3}x^9-\frac{1}{4}x^{12}+\dots$$
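A quick numerical check (a sketch with my own function names) that this antidifferentiated series really does converge to $\ln(1+x^3)$ for $|x|<1$:

```python
import math

def log_series_partial(x, terms):
    # partial sum of  sum_{n=1}^{terms} (-1)^(n+1) x^(3n) / n
    # i.e. x^3 - x^6/2 + x^9/3 - x^12/4 + ...
    return sum((-1) ** (n + 1) * x ** (3 * n) / n for n in range(1, terms + 1))

x = 0.8  # inside |x| < 1
print(log_series_partial(x, 200), math.log(1 + x**3))  # values agree closely
```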

There is no common ratio between the terms (that I can see, at least), so does that mean the radius of convergence is no longer a consideration?

What happens to the radius of convergence and why?

Also, if the radius of convergence is $|x| < 1$, does that mean that the power series only gives us the function over that interval?

  • The radius of convergence stays the same when you differentiate or antidifferentiate. See http://math.stackexchange.com/a/344951/60500 and https://proofwiki.org/wiki/Radius_of_Convergence_of_Derivative_of_Complex_Power_Series , for example. – Steve Kass Mar 20 '16 at 00:04
  • The radius of convergence does not change. But there may be endpoint convergence where the original series did not. – André Nicolas Mar 20 '16 at 00:04
  • To answer your side question at the end, yes, the radius of convergence determines where the series converges. If the series converges to the function, it only converges there for the known radius of convergence. You may be interested in studying uniform convergence of sequences and series of functions for more general theorems and their applications to power series. – Nap D. Lover Mar 20 '16 at 00:09
  • @AndréNicolas, is that because if the new series is alternating, the ratio can be equal to 1 and still have the series converge? For example, the harmonic series doesn't converge but the alternating version of the HS does (at least I think...)? – jeremy radcliff Mar 20 '16 at 00:18
  • Alternating is not really the main reason. Consider the series $\sum_1^\infty \frac{x^{n-1}}{n}$. This diverges at $x=1$, but when we integrate we get $C+\sum \frac{x^n}{n^2}$, which converges at $x=1$. – André Nicolas Mar 20 '16 at 00:23
  • @AndréNicolas, thanks for the explanation, those series are really tricky... – jeremy radcliff Mar 20 '16 at 00:27
  • @FriedrichPhilipp, do you mean because I forgot the $+C$? Technically it could be after the "..." in the sum, I think. If not, could you point out where? I just checked on WA and got the same result. – jeremy radcliff Mar 20 '16 at 00:28
  • @jeremyradcliff No. It's just because I am stupid. It is completely right. I deleted my comment. Sorry for any confusion. – Friedrich Philipp Mar 20 '16 at 00:31
  • @FriedrichPhilipp, No worries, at even odds you'd make plenty of money systematically betting on their being some kind of algebra mistake in my posts on SE :) – jeremy radcliff Mar 20 '16 at 00:33
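André Nicolas's endpoint example from the comments can be checked numerically. A minimal sketch (function names are my own): the original series $\sum x^{n-1}/n$ becomes the harmonic series at $x=1$ and diverges, while the integrated series $\sum x^n/n^2$ converges there to $\pi^2/6$.

```python
import math

def original_partial(x, N):
    # partial sum of  sum_{n=1}^{N} x^(n-1) / n ; at x = 1 this is
    # the harmonic series, which grows like ln(N) without bound
    return sum(x ** (n - 1) / n for n in range(1, N + 1))

def integrated_partial(x, N):
    # partial sum of  sum_{n=1}^{N} x^n / n^2 ; at x = 1 this
    # converges to pi^2 / 6
    return sum(x ** n / n ** 2 for n in range(1, N + 1))

print(original_partial(1.0, 10_000))    # keeps growing as N increases
print(integrated_partial(1.0, 10_000))  # close to pi^2 / 6
```

So integrating left the radius of convergence at 1 but changed the behavior at the endpoint $x=1$ from divergent to convergent.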

1 Answer


If you antidifferentiate a series, $$ \sum_{j=0}^\infty a_j z^j \to \sum_{j=1}^\infty \frac{a_{j-1}}{j}z^j, $$ then by the Cauchy–Hadamard theorem the reciprocal of the new radius of convergence is $$ \limsup_{j \to \infty} \left| \frac{a_{j-1}}{j} \right|^{1/j} = \limsup_{j \to \infty} |a_j|^{1/j}, $$ since $j^{1/j} \to 1$ and shifting the index by one does not change the $\limsup$. So the radius of convergence is unchanged.
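A quick numerical illustration of the key fact used here, that $j^{1/j} \to 1$ as $j \to \infty$ (so dividing the coefficients by $j$ does not affect the $\limsup$):

```python
# j**(1/j) approaches 1 as j grows, which is why antidifferentiating
# (dividing each coefficient by its index) leaves the radius unchanged
for j in (10, 1_000, 1_000_000):
    print(j, j ** (1 / j))  # the second column drifts toward 1
```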

Henricus V.