
Suppose we have a function $f(s)$ of the form:

$$ f(s) = \sum_{i=0}^{\infty} p_i s^i $$

We also know that:

$$ f(1) = 1 $$

and

$$ p_i \ge 0 \quad \text{for all } i \ge 0 $$

Assuming we can calculate $f(s)$ for any $s$, is it possible, with all the information above, to recover $p_n$ for any $n$?

(Actually, $p_i$ is the probability $P[Z=i]$, where $Z$ is a random variable.)

1 Answer

This is the discrete version of the moment problem, or the infinite version of a Vandermonde system. One approach: $p_0=f(0)$, $p_1=\left.\dfrac{\mathrm df(s)}{\mathrm ds}\right|_{s=0}$, and the higher coefficients come from higher derivatives at $0$, with a factorial correction: $p_n = \dfrac{1}{n!}\left.\dfrac{\mathrm d^n f(s)}{\mathrm ds^n}\right|_{s=0}$. Of course, this is rather unstable numerically.
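A minimal sketch of this idea, using the Poisson PGF $f(s)=e^{\lambda(s-1)}$ purely as an illustration (the choice of $\lambda$ and the step size $h$ are assumptions, not part of the question):

```python
import math

# Illustrative PGF: Poisson(lam), f(s) = exp(lam*(s-1)),
# whose known coefficients are p_i = exp(-lam) * lam**i / i!
lam = 2.0
f = lambda s: math.exp(lam * (s - 1.0))

p0 = f(0.0)                     # p_0 = f(0), exact up to round-off
h = 1e-6
p1 = (f(h) - f(0.0)) / h        # forward difference approximating f'(0)

exact_p0 = math.exp(-lam)
exact_p1 = lam * math.exp(-lam)
print(p0 - exact_p0)            # essentially zero
print(p1 - exact_p1)            # small, but limited by the O(h) truncation error
```

The same pattern extends to $p_n$ via $n$-th finite differences divided by $n!$, but the cancellation grows rapidly with $n$, which is exactly the instability mentioned above.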

Ross Millikan
  • The thing is, as I understand the problem, what I can get is only $f(s)$, not $f'(s)$. In fact, I'm trying to construct a random variable with the same distribution as $Z$, but without knowing $p_0, p_1, p_2, \ldots$ that seems impossible. That's why I'm trying to get all the $p_i$'s. – NonalcoholicBeer Oct 01 '11 at 23:42
  • You can take a numeric derivative. Take $s$ smaller and smaller and check $\frac{f(s)-f(0)}{s}$. That is where the instability comes from: you are subtracting two nearly equal quantities. – Ross Millikan Oct 02 '11 at 00:03
  • @ablmf: you can do Richardson extrapolation in conjunction with Ross's "take $s$ smaller and smaller" stratagem. It doesn't completely cure the numerical instability, but you might manage to squeeze out a few more digits of accuracy as long as you don't shrink $s$ too much. – J. M. ain't a mathematician Oct 02 '11 at 00:17
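The Richardson idea from the last comment can be sketched as follows. A forward difference has error $D(h) = f'(0) + c_1 h + c_2 h^2 + \cdots$, so the combination $2D(h/2) - D(h)$ cancels the $O(h)$ term. The Poisson PGF and step size below are illustrative assumptions:

```python
import math

lam = 2.0
f = lambda s: math.exp(lam * (s - 1.0))   # illustrative Poisson PGF

def fwd(h):
    """Forward-difference estimate D(h) of f'(0), with O(h) error."""
    return (f(h) - f(0.0)) / h

h = 1e-3
d1 = fwd(h)                      # plain estimate, error ~ c1*h
d2 = 2.0 * fwd(h / 2) - fwd(h)   # one Richardson step, error ~ c2*h^2 / 2

exact = lam * math.exp(-lam)     # true f'(0) = p_1 * 1! for this PGF
print(abs(d1 - exact))           # noticeably larger
print(abs(d2 - exact))           # a few extra digits of accuracy
```

As the comment warns, repeating this with ever-smaller $h$ eventually loses to round-off in $f(h)-f(0)$, so there is a sweet spot for the step size.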