The identity I want help proving is the following (given $m$ probabilities $p_1,\dots,p_m$ with $\sum_j p_j = 1$): $$ \int\limits_0^\infty t \sum\limits_j \left(\prod\limits_{k \neq j}(1-e^{-p_k t}) \right)e^{-p_jt}p_j \, dt = \sum\limits_j\frac{1}{p_j} - \sum\limits_{i<j}\frac {1}{p_i+p_j} + \dots +(-1)^{m-1} \frac{1}{p_1+\dots+p_m}$$
For background and motivation, see below.
In Example 5.17 of Sheldon Ross's *Introduction to Probability Models*, the coupon collector's problem is tackled for the general case where the probability of drawing coupon $j$ is $p_j$ and, of course, $\sum\limits_j p_j = 1$. Ross defines $X_j$ as the first time a coupon of type $j$ is observed, where coupons of type $j$ arrive according to a Poisson process with rate $p_j$. We're interested in the time it takes to collect all coupons, $X$. So we get:
$$X = \max_{1\leq j \leq m}X_j$$
Further, each $X_j$ is the first arrival time of a rate-$p_j$ Poisson process, so $X_j \sim \text{Exp}(p_j)$; moreover, the $X_j$ are independent (discussion on that here). We therefore get:
$$F_X(t) = P(X\le t) = P(X_j\le t \; \forall \; j) = \prod\limits_{j=1}^{m}(1-e^{-p_j t})\tag{1}$$
Now, Ross uses the identity $E(X) = \int\limits_0^\infty S_X(t)\,dt$, where $S_X(t) = 1 - F_X(t)$ is the survival function, to get:
$$E(X) = \int\limits_{0}^{\infty}\left(1-\prod\limits_{j=1}^{m}(1-e^{-p_j t})\right) dt = \sum\limits_j\frac{1}{p_j} - \sum\limits_{i<j}\frac {1}{p_i+p_j} + \dots +(-1)^{m-1} \frac{1}{p_1+\dots+p_m}\tag{2}$$
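(For reference, (2) comes from expanding the product by inclusion–exclusion and integrating term by term:
$$1-\prod\limits_{j=1}^{m}(1-e^{-p_j t}) = \sum\limits_{\emptyset\neq S\subseteq\{1,\dots,m\}}(-1)^{|S|+1}\,e^{-t\sum_{j\in S}p_j},$$
and since $\int\limits_0^\infty e^{-st}\,dt = \frac{1}{s}$ for $s>0$, each nonempty subset $S$ contributes $(-1)^{|S|+1}\big/\sum_{j\in S}p_j$.)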
Now, I want to get this same result using the old-fashioned definition of the expected value. For this, I differentiate equation (1) to get the PDF of $X$. First, let's take the logarithm of both sides.
$$\log(F_X(t)) = \sum\limits_j \log(1-e^{-p_j t})$$
Now differentiate with respect to $t$.
$$\frac{f_X(t)}{F_X(t)} = \sum\limits_j \frac{p_j e^{-p_j t}}{1-e^{-p_j t}}$$
Multiplying both sides by $F_X(t) = \prod\limits_k (1-e^{-p_k t})$, the factor $1-e^{-p_j t}$ cancels in each term, finally yielding:
$$f_X(t) = \sum\limits_j \left(\prod\limits_{k \neq j}(1-e^{-p_k t}) \right)e^{-p_jt}p_j$$
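As a sanity check, here is a quick SymPy computation (my own sketch, not from the book; the symbol names are arbitrary) confirming that this expression matches $F_X'(t)$ for $m=3$:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
ps = sp.symbols('p1 p2 p3', positive=True)

# CDF from equation (1)
F = sp.prod([1 - sp.exp(-pj * t) for pj in ps])

# Claimed PDF: sum_j p_j e^{-p_j t} prod_{k != j} (1 - e^{-p_k t})
f = sum(pj * sp.exp(-pj * t)
        * sp.prod([1 - sp.exp(-pk * t) for pk in ps if pk != pj])
        for pj in ps)

# Prints 0 if f = dF/dt
print(sp.simplify(sp.diff(F, t) - f))
```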
Using this, we get an alternate expression for the expectation:
$$E(X) = \int\limits_0^\infty t f_X(t) dt = \int\limits_0^\infty t \sum\limits_j \left(\prod\limits_{k \neq j}(1-e^{-p_k t}) \right)e^{-p_jt}p_j dt$$
This should lead to the same expression as in equation (2). However, I don't know where to start. Why do I want to take this alternate route? Because I hope to find an expression for the variance as well, and for that I need $E(X^2)$. I thought I'd first tackle the easier $E(X)$, for which we know there is a nice expression.
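For what it's worth, a numerical check (again my own sketch; the particular probabilities are arbitrary) shows the two sides do agree, so an analytic proof should exist:

```python
from itertools import combinations

import numpy as np
from scipy.integrate import quad

p = np.array([0.5, 0.3, 0.2])  # any probabilities summing to 1

def pdf(t):
    """f_X(t) = sum_j p_j e^{-p_j t} prod_{k != j} (1 - e^{-p_k t})."""
    return sum(
        pj * np.exp(-pj * t)
        * np.prod([1 - np.exp(-pk * t) for k, pk in enumerate(p) if k != j])
        for j, pj in enumerate(p)
    )

# Left-hand side: E(X) = int_0^inf t f_X(t) dt, by numerical quadrature
lhs, _ = quad(lambda t: t * pdf(t), 0, np.inf)

# Right-hand side: inclusion-exclusion sum over nonempty index subsets S
rhs = sum(
    (-1) ** (len(S) + 1) / p[list(S)].sum()
    for r in range(1, len(p) + 1)
    for S in combinations(range(len(p)), r)
)

print(lhs, rhs)  # both come out to about 6.6548 for these p's
```

The two numbers match to quadrature precision; what I'm missing is the analytic argument.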