I am reading Casella & Berger's Statistical Inference and trying to solve some of the exercises. Looking at the solution to Exercise 2.20, the author uses a method I don't understand, even introducing a derivative inside the infinite sum. Can you please explain the steps that lead to $\frac1p$?
$$\mathbb{E}X = \sum_{k=1}^\infty k(1-p)^{k-1}p = -p\sum_{k=1}^\infty\frac{d}{dp}(1-p)^k = -p\frac{d}{dp}\left[\sum_{k=0}^\infty\left(1-p\right)^k-1\right]=-p\frac{d}{dp}\left[\frac{1}{p}-1\right]=\frac{1}{p}$$
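I did at least convince myself numerically that the final identity holds, by computing a large partial sum of $\sum_{k\ge1} k(1-p)^{k-1}p$ and comparing it to $1/p$ (the function name and cutoff `K` here are just my own choices for the check):

```python
def geometric_mean_partial_sum(p, K=10_000):
    """Partial sum of k*(1-p)^(k-1)*p for k = 1..K, which should approach 1/p."""
    return sum(k * (1 - p) ** (k - 1) * p for k in range(1, K + 1))

for p in (0.2, 0.5, 0.9):
    print(p, geometric_mean_partial_sum(p), 1 / p)
```

For each $p$ the partial sum agrees with $1/p$ to many decimal places, so the result itself is not in doubt; it is the derivative manipulation in the middle that I would like explained.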
Wikipedia and other websites have similar proofs, usually for deriving the expectation of a geometric distribution.