Short version, yes.
Long version, it's complicated. There is an interesting link between probability distributions and orthogonal polynomials. For instance, consider the Hermite polynomials, $H_0, H_1, \ldots$. These are orthogonal with respect to the weight function $e^{-x^2/2}$ on the support $(-\infty,\infty)$. In other words, $\int_{-\infty}^{\infty}H_i(x)H_j(x)e^{-x^2/2}dx = \sqrt{2\pi}i!\delta_{ij}$.
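This orthogonality relation is easy to verify numerically. Here is a quick sketch using NumPy's probabilists' Hermite module, `numpy.polynomial.hermite_e`, whose convention matches the weight $e^{-x^2/2}$ above:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Gauss-HermiteE quadrature: integrates f(x) * exp(-x^2/2) exactly
# for polynomials f up to degree 2*deg - 1.
x, w = He.hermegauss(20)

def herm(n, x):
    """Evaluate the probabilists' Hermite polynomial He_n at x."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return He.hermeval(x, c)

for i in range(5):
    for j in range(5):
        inner = np.sum(w * herm(i, x) * herm(j, x))
        expected = math.sqrt(2 * math.pi) * math.factorial(i) if i == j else 0.0
        assert abs(inner - expected) < 1e-8 * max(1.0, abs(expected))
```

The off-diagonal inner products vanish and the diagonal ones come out to $\sqrt{2\pi}\,i!$, as claimed.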
The weight function and the $\sqrt{2\pi}$ factor in the normalization should ring a bell: up to a constant, they are exactly the density of a standard normal distribution.
Norbert Wiener recognized that the two are intricately linked and developed what is called the Gauss-Hermite Polynomial Chaos. In short, we can write a random variable $k$ with essentially any distribution as an infinite series in a standard normal (Gaussian) random variable $\zeta$, using the Hermite polynomials as basis functions: $$ k = \sum_{i=0}^{\infty}k_i H_i(\zeta).$$
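As a concrete (and classical) example, take $k = e^{\zeta}$, a lognormal random variable; this choice is mine, purely for illustration. Projecting onto the basis gives $k_i = \mathbb{E}[k\,H_i(\zeta)]/\mathbb{E}[H_i(\zeta)^2]$, and the Hermite generating function yields the closed form $k_i = \sqrt{e}/i!$. A numerical sketch:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

x, w = He.hermegauss(40)

def herm(n, x):
    """Probabilists' Hermite polynomial He_n at x."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return He.hermeval(x, c)

# Projection: k_i = E[e^zeta He_i(zeta)] / E[He_i(zeta)^2]
#                 = (integral of e^x He_i(x) e^{-x^2/2} dx) / (sqrt(2*pi) * i!)
coeffs = []
for i in range(6):
    num = np.sum(w * np.exp(x) * herm(i, x))
    coeffs.append(num / (math.sqrt(2 * math.pi) * math.factorial(i)))

# Closed form from the generating function exp(xt - t^2/2) = sum He_n(x) t^n/n!
for i, k_i in enumerate(coeffs):
    assert abs(k_i - math.sqrt(math.e) / math.factorial(i)) < 1e-6
```

The coefficients decay like $1/i!$, which is why a short truncation of the series already captures the lognormal very well.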
It turns out you can generalize this to other continuous distributions, and even to discrete ones.
There is something called the Askey Scheme, which is essentially a family tree relating hypergeometric orthogonal polynomials. Many of the weighting functions for these polynomials are distribution functions for probability distributions, meaning we can perform a Wiener-Askey Polynomial Chaos expansion for any random variable about another random variable of almost any distribution of our choosing. See also: http://www.dam.brown.edu/scicomp/media/report_files/BrownSC-2003-07.pdf
The hypergeometric family linked to the discrete hypergeometric distribution is the Hahn family of polynomials. The continuous analogue is uncreatively called the "Continuous Hahn" family. This leads to the following answer:
The weight function of the Continuous Hahn polynomials gives you the continuous analogue of the discrete hypergeometric distribution; in fact, it is most likely identical up to a normalization constant. These weight functions are quite complicated, however, so the result is not especially intuitive.
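For reference (as I recall it from the Koekoek-Swarttouw compendium of hypergeometric orthogonal polynomials, so check the parameter constraints there), the Continuous Hahn polynomials are orthogonal on $(-\infty,\infty)$ with respect to

$$ w(x) = \Gamma(a+ix)\,\Gamma(b+ix)\,\Gamma(\bar{a}-ix)\,\Gamma(\bar{b}-ix) = \left|\Gamma(a+ix)\,\Gamma(b+ix)\right|^2, \qquad \operatorname{Re}(a),\ \operatorname{Re}(b) > 0, $$

which, suitably normalized, is the density in question.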
The polynomial chaos expansion is quite interesting and powerful. If you look at the series, it is analogous to a Taylor expansion of a random variable. Importantly, it allows you to write a random variable with a possibly unknown or complicated distribution in terms of any distribution of your choosing; as with any series expansion, the challenge is then to compute the deterministic coefficients. This becomes quite potent when you have, say, a dynamical system parameterized by a random parameter: instead of using complicated stochastic models or time-consuming Monte Carlo analysis, you can frame the problem as computing a handful of deterministic coefficients, and then generate statistical moments and/or Monte Carlo samples from them cheaply.
Interestingly, the first coefficient, $k_0$, always equals the distribution mean, since $H_0 = 1$ and all higher-order basis polynomials have zero mean under the weight.
For example, suppose a model maps a random input $X$ to an output $Y = H(X)$. Expanding both in the same basis, truncated at order $P$: $$ X = \sum_{i=0}^P X_i \Phi_i(\zeta), \qquad Y = \sum_{i=0}^P Y_i \Phi_i(\zeta),$$ so that $$ Y = H(X) = H\left(\sum_{i=0}^P X_i\Phi_i(\zeta)\right).$$
Since it is "very easy" to compute the deterministic coefficients $X_i$, all you need to do is compute the $Y_i$ using the Galerkin method, i.e. by projecting onto each basis polynomial. Also, $E[Y] = Y_0$.
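To make the Galerkin step concrete, here is a sketch for a hypothetical map $H(X) = X^2$ with $X = \mu + \sigma\zeta$ (so $X_0 = \mu$, $X_1 = \sigma$, Hermite basis). Projecting gives $Y_i = \mathbb{E}[H(X)\,H_i(\zeta)]/\mathbb{E}[H_i(\zeta)^2]$, and indeed $E[Y] = Y_0 = \mu^2 + \sigma^2$:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

x, w = He.hermegauss(20)

def herm(n, x):
    """Probabilists' Hermite polynomial He_n at x."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return He.hermeval(x, c)

mu, sigma = 2.0, 0.5
X = mu + sigma * x   # input expansion X = mu*He_0 + sigma*He_1, at quadrature nodes
Y = X**2             # hypothetical model H(X) = X^2, at quadrature nodes

# Galerkin projection: Y_i = E[H(X) He_i] / E[He_i^2]
Y_coeffs = [np.sum(w * Y * herm(i, x)) / (math.sqrt(2 * math.pi) * math.factorial(i))
            for i in range(4)]

# Exact by hand: X^2 = (mu^2 + sigma^2) He_0 + 2*mu*sigma He_1 + sigma^2 He_2,
# using zeta^2 = He_2(zeta) + 1.
assert abs(Y_coeffs[0] - (mu**2 + sigma**2)) < 1e-10   # E[Y] = Y_0
assert abs(Y_coeffs[1] - 2 * mu * sigma) < 1e-10
assert abs(Y_coeffs[2] - sigma**2) < 1e-10
assert abs(Y_coeffs[3]) < 1e-10
```

Three deterministic coefficients capture the output distribution exactly here; for a general $H$ you truncate and the projection integrals become quadratures just like the one above.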
– Emily Aug 07 '12 at 01:36

$\sum_{i=m_i}^m \binom{p_i}{i}\binom{p-p_i}{m-i}/\binom{p}{m} \approx \int_{m_i-1/2}^{m+1/2} \binom{p_i}{x}\binom{p-p_i}{m-x}/\binom{p}{m} dx $ where the binomial coefficients are evaluated with the help of the $\Gamma$ function as $\binom{a}{b} = \frac{\Gamma(a+1)}{\Gamma(b+1)\Gamma(a-b+1)}$
– linello Jul 10 '14 at 18:55
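The comment's approximation is easy to check numerically. Below is a sketch with arbitrarily chosen parameters ($p = 50$, $p_i = 20$, $m = 10$, lower limit $m_i = 3$; these are my own, not from the comment), using `math.lgamma` for the $\Gamma$-extended binomial coefficients and `scipy.integrate.quad` for the integral:

```python
import math
from scipy.integrate import quad

def binom_gamma(a, x):
    """Binomial coefficient C(a, x) extended to real x via the Gamma function."""
    return math.exp(math.lgamma(a + 1) - math.lgamma(x + 1) - math.lgamma(a - x + 1))

# Arbitrary illustrative parameters (not from the original comment).
p, p_i, m, m_i = 50, 20, 10, 3

# Exact hypergeometric tail sum.
tail_sum = sum(binom_gamma(p_i, i) * binom_gamma(p - p_i, m - i)
               for i in range(m_i, m + 1)) / binom_gamma(p, m)

# Continuity-corrected integral approximation from the comment.
integrand = lambda x: binom_gamma(p_i, x) * binom_gamma(p - p_i, m - x) / binom_gamma(p, m)
tail_int, _ = quad(integrand, m_i - 0.5, m + 0.5)

print(tail_sum, tail_int)  # the two values should be close
```

The half-integer limits act as a continuity correction, so each unit interval of the integral approximates one term of the sum.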