Suppose I have a function $f \in \mathbb{Q}(x, y)$ that does not have a pole at the origin (in particular, I can write it as a ratio of two polynomials in $\mathbb{Z}[x, y]$ whose denominator does not vanish at the origin).
Suppose I know arbitrarily many exact terms of the Taylor expansion of $f$ at the origin, and that I have a (relatively small) bound on the degrees of both the numerator and the denominator. What is the best way to guess $f$ from this data?
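To be concrete about the input data: it could be generated as below. This is just a sketch using sympy; the example $f$ and the helper name `taylor_coeffs` are my own inventions. The trick of substituting $(x, y) \mapsto (tx, ty)$ reduces the bivariate expansion to a univariate series in $t$, graded by total degree.

```python
from sympy import symbols, Poly

x, y, t = symbols('x y t')

# Example function, made up for illustration; denominator is 1 at the origin.
f = (1 + 2*x + x*y) / (1 - x - y**2)

def taylor_coeffs(f, order):
    """Return {(i, j): coefficient of x^i y^j in the Taylor expansion of f
    at the origin}, for all monomials of total degree < order."""
    # Grade by total degree: expand f(t*x, t*y) as a series in t at t = 0.
    g = f.subs({x: t*x, y: t*y}).series(t, 0, order).removeO().subs(t, 1)
    p = Poly(g.expand(), x, y)
    return dict(zip(p.monoms(), p.coeffs()))

coeffs = taylor_coeffs(f, 3)
```

For this $f$ the expansion starts $1 + 3x + 3x^2 + xy + y^2 + \cdots$, so for instance `coeffs[(1, 0)]` is $3$.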
What I, as a pure mathematician who knows nothing about regressions and numerical methods, would try is this: restrict to a sufficiently small neighborhood of zero (to stay away from poles), approximate $f$ there by a truncation of its power series, and then use some sort of rational-function regression to guess $f$.
I see two problems with this. First, if I restrict to too small a neighborhood of the origin, $f$ might look too flat there and interpolation would be impossible. Second, the series truncation is itself a rational function (in fact a polynomial), so my algorithm would probably just guess that $f$ is the input approximation or some truncation of it. However, I hope the second problem is solved by the hypothesis that I have a bound on the degrees of the polynomials in my fraction: if the neighborhood I restricted to is not too small, then by adding enough terms I get something that cannot be well approximated by a polynomial of low degree, so the fit will be forced to use a nontrivial denominator.
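To make the degree-bound idea concrete, here is a minimal univariate sketch of the exact linear-system (Padé-style) formulation, which avoids regression entirely. The function name `pade` and the normalization $q(0) = 1$ are my own choices. The point is that matching $p/q$ against the known series $T$ modulo $x^{d_p + d_q + 1}$ means $q \cdot T \equiv p$, which is *linear* in the unknown coefficients of $p$ and $q$, and can be solved exactly over $\mathbb{Q}$:

```python
from fractions import Fraction

def pade(coeffs, dp, dq):
    """Given exact Taylor coefficients c[0..dp+dq] of f at 0, look for p/q
    with deg p <= dp, deg q <= dq, q(0) = 1, such that
        q(x) * (c[0] + c[1] x + ...) == p(x)   (mod x^(dp+dq+1)).
    Returns (p_coeffs, q_coeffs) as Fractions, or None if the system is
    singular."""
    n = dp + dq + 1
    c = [Fraction(v) for v in coeffs[:n]]
    nun = (dp + 1) + dq              # unknowns: p_0..p_dp, q_1..q_dq
    A = [[Fraction(0)] * nun for _ in range(n)]
    b = [Fraction(0)] * n
    for k in range(n):
        # Coefficient of x^k in q*T - p = 0, with q_0 = 1 moved to the RHS:
        #   p_k - sum_{j=1}^{min(k,dq)} q_j * c_{k-j} = c_k
        if k <= dp:
            A[k][k] = Fraction(1)
        for j in range(1, min(k, dq) + 1):
            A[k][dp + j] = -c[k - j]
        b[k] = c[k]
    # Exact Gauss-Jordan elimination over the rationals.
    for col in range(nun):
        piv = next((r for r in range(col, n) if A[r][col] != 0), None)
        if piv is None:
            return None
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        inv = A[col][col]
        A[col] = [a / inv for a in A[col]]
        b[col] /= inv
        for r in range(n):
            if r != col and A[r][col] != 0:
                factor = A[r][col]
                A[r] = [a - factor * p for a, p in zip(A[r], A[col])]
                b[r] -= factor * b[col]
    return b[:dp + 1], [Fraction(1)] + b[dp + 1:]
```

For example, $(1 + 2x)/(1 - x)$ has series $1 + 3x + 3x^2 + \cdots$, and `pade([1, 3, 3], 1, 1)` recovers it from just three terms, returning `([1, 2], [1, -1])`. The bivariate case leads to the same kind of linear system, only with monomials $x^i y^j$ in place of powers of $x$.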
So I have two more precise questions: what would more experienced people try instead? And, if my idea is doable, what is a good way to run a regression for a multivariate rational function with integer coefficients?