The general idea
It is a well-known fact that the Stieltjes transform $s_\mu$ of a probability measure $\mu$ on $\mathbb R$ is analytic on $\mathbb C\setminus\text{supp}(\mu)$. So to prove that $\text{supp}(\mu)\subseteq[-C,C]$ (for some minimal $C$), I could compute the radius of convergence of a power series representation of $s_\mu(1/\cdot)$ around $0$ and take $C$ to be its reciprocal. (This corresponds to the coordinate change $z\mapsto\frac{1}{z}$.)
Example
Let $\mu = \delta_C$, so that $s_\mu(z):=\int_\mathbb{R}\frac{d\mu(x)}{z-x}=\frac{1}{z-C}$. Then $$s_\mu(1/z)=\frac{1}{1/z-C}=z\frac{1}{1-Cz}=z\sum_{n=0}^\infty (Cz)^n,$$ which has radius of convergence $1/C$, so we see that $\text{supp}(\mu)\subseteq[-C,C]$, as expected.
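As a quick numeric sanity check (my own sketch, not part of the argument), the radius of convergence $1/C$ can be estimated from the coefficients of $s_\mu(1/z)$ via the root test; the concrete values of `C` and the cutoff `N` below are arbitrary choices:

```python
# Estimate the radius of convergence of s_mu(1/z) for mu = delta_C.
# From the expansion s_mu(1/z) = z * sum_n (C z)^n, the coefficient of
# z^(n+1) is C^n.

C = 3.0
N = 200

# c[n] is the coefficient of z^n in s_mu(1/z)
c = [0.0] + [C**n for n in range(N)]

# Root test: 1/R = limsup_n |c_n|^(1/n)
root_est = abs(c[N]) ** (1.0 / N)
R_est = 1.0 / root_est

print(R_est, 1.0 / C)  # the estimate approaches 1/C as N grows
```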
The question
Suppose $s_\mu(1/z) = f(z)+g(z)$, where $f(z)=\sum_{n=0}^\infty a_n z^n$ and $g(z)=\sum_{n=0}^\infty b_n z^n$, and set $h(z)=\sum_{n=0}^\infty (a_n+b_n)z^n$; denote the respective radii of convergence by $R_f$, $R_g$, and $R_{f+g}$. Is there an example where $\min\{R_f,R_g\}\leq 1/C<R_{f+g}$ for the minimal $C$ such that $\text{supp}(\mu)\subseteq[-C,C]$? In other words: if two singularities cancel out, can we always pretend they weren't there in the first place?
My thoughts so far
Clearly $\min\{R_f,R_g\}<R_{f+g}=1/C$ can occur: e.g. write $\delta_C$ in a redundant way, $\delta_C=\delta_C+\delta_K-\delta_K$, to get $f(z)=z\sum_{n=0}^\infty (C^n+K^n)z^n$, $g(z)=-z\sum_{n=0}^\infty K^nz^n$, and $h(z)=z\sum_{n=0}^\infty C^nz^n$. Choosing $K>C$ gives $R_f=R_g=1/K<1/C=R_{f+g}$, where $1/C$ is the true bound by the calculation above.
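This decomposition can be checked numerically with the root test (my own sketch; the values of `C`, `K`, and `N` are arbitrary, and integers are used so that the cancellation $a_n+b_n=C^n$ is exact):

```python
# Redundant decomposition delta_C = delta_C + delta_K - delta_K with K > C:
# the individual series see the fake singularity at 1/K, while the summed
# coefficients only see the true one at 1/C.

C, K = 2, 5  # exact integer arithmetic avoids floating-point cancellation
N = 300

a = [C**n + K**n for n in range(N)]    # coefficients of f (up to the leading z)
b = [-(K**n) for n in range(N)]        # coefficients of g
s = [an + bn for an, bn in zip(a, b)]  # coefficients of h = f + g; equals C^n

def radius(coeffs, n=N - 1):
    """Root-test estimate of the radius: R ~ 1 / |c_n|^(1/n)."""
    return 1.0 / (abs(coeffs[n]) ** (1.0 / n))

print(radius(a), radius(b))  # both approach 1/K
print(radius(s))             # approaches 1/C
```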
Rephrasing the question
I guess my question is essentially how one should define the radius of convergence of the sum of two functions that can be represented by power series.
- One option is the one obtained by a formal† calculation: $R_{f+g}$ as above, but I'm not sure whether that always works out.
- The more conservative option: $\min\{R_f,R_g\}$, in which case we may lose out on a lot.
†Formal, because in general we need $f+g$ to be defined in an open disk around $0$ to be able to define a power series of $f+g$ analytically. This may fail: take, for example, some $f$ that has singularities at $z=1/m$ for all $m\in\mathbb N$. Also, in general (see Interchange finite and infinite sum) we don't have $\sum_n a_nz^n+\sum_n b_nz^n=\sum_n(a_n+b_n)z^n$ for $|z|\geq \min\{R_f,R_g\}$, since at least one of the series on the left diverges there.
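To make the distinction concrete (my own sketch, with arbitrary values $C=2$, $K=5$): at a point $1/K<|z|<1/C$, the power series of $f$ and $g$ from the decomposition above both diverge, yet their closed forms still sum to $z/(1-Cz)$ there, because the singularity at $1/K$ cancels.

```python
# Closed forms behind the decomposition delta_C + delta_K - delta_K:
#   f(z) = z/(1-Cz) + z/(1-Kz),  g(z) = -z/(1-Kz),  h(z) = z/(1-Cz)

C, K = 2.0, 5.0
z = 0.3  # 1/K = 0.2 < |z| < 0.5 = 1/C: the series of f and g diverge here

f = z / (1 - C * z) + z / (1 - K * z)
g = -z / (1 - K * z)
h = z / (1 - C * z)

print(f + g, h)  # agree up to rounding: the fake singularity has cancelled
```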
Note that I am aware of questions like Radius of convergence of a sum of power series, but there is a significant difference from what I'm asking.