For positive integers $s$ and $n$ (restricting the generality to keep things simple), define $$H_s(n)=\sum_{k=1}^{n}\frac{1}{k^s},\qquad G_s(n)=\sum_{k=1}^{n}\binom{n}{k}\frac{(-1)^{k-1}}{k^s}.$$
The former is well known; the latter is what I'm wondering about. I have encountered it in a number of contexts. For instance, if $X_1,\ldots,X_n$ are independent random variables, each exponentially distributed with parameter $1$, then $s!\,G_s(n)$ is the $s$-th moment of $\max\{X_1,\ldots,X_n\}$.
I was looking for an expression of $G_s(n)$ in terms of $H_s(n)$. Here are the first few cases: \begin{align} G_1(n) &= H_1(n), \\ G_2(n) &= (H_2(n)+H_1^2(n))/2, \\ G_3(n) &= (2H_3(n)+3H_2(n)H_1(n)+H_1^3(n))/6, \\ G_4(n) &= (6H_4(n)+8H_3(n)H_1(n)+3H_2^2(n)+6H_2(n)H_1^2(n)+H_1^4(n))/24 \end{align} (the first one appears in the linked article and in some questions on this site).
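The four identities above can be checked in exact rational arithmetic; here is a quick sketch (the helper names `H` and `G` are my own, mirroring the definitions):

```python
# Sanity check of the four displayed identities in exact arithmetic.
from fractions import Fraction
from math import comb

def H(s, n):
    # H_s(n) = sum_{k=1}^n 1/k^s
    return sum(Fraction(1, k**s) for k in range(1, n + 1))

def G(s, n):
    # G_s(n) = sum_{k=1}^n C(n,k) (-1)^(k-1) / k^s
    return sum(Fraction(comb(n, k) * (-1)**(k - 1), k**s) for k in range(1, n + 1))

for n in range(1, 8):
    assert G(1, n) == H(1, n)
    assert G(2, n) == (H(2, n) + H(1, n)**2) / 2
    assert G(3, n) == (2*H(3, n) + 3*H(2, n)*H(1, n) + H(1, n)**3) / 6
    assert G(4, n) == (6*H(4, n) + 8*H(3, n)*H(1, n) + 3*H(2, n)**2
                       + 6*H(2, n)*H(1, n)**2 + H(1, n)**4) / 24
```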
The general formula appears to be
$$G_s(n)=\sum_{\substack{a_1,\ldots,a_s\geqslant 0\\ 1\cdot a_1+\ldots+s\cdot a_s=s}}\prod_{k=1}^{s}\frac{H_k^{a_k}(n)}{k^{a_k}\cdot a_k!}.$$
What is the best way to prove it? Can it be derived from $$\sum_{s=0}^{\infty}G_s(n)t^s=\prod_{k=1}^{n}\Big(1-\frac{t}{k}\Big)^{-1}\tag{$*$}\label{helpeq}$$ which is easy to prove (here $G_0(n)=1$)?
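For what it's worth, $\eqref{helpeq}$ can itself be sanity-checked by comparing Taylor coefficients of the truncated product with $G_s(n)$; a minimal sketch (the truncation order `S` and helper names are arbitrary choices of mine):

```python
# Compare coefficients of prod_{k=1}^n 1/(1 - t/k) with G_s(n), exactly.
from fractions import Fraction
from math import comb

S = 8  # truncation order for the formal power series

def G(s, n):
    return sum(Fraction(comb(n, k) * (-1)**(k - 1), k**s) for k in range(1, n + 1))

def series_product(n):
    """Coefficients of prod_{k=1}^n 1/(1 - t/k), truncated at t^S."""
    coeffs = [Fraction(1)] + [Fraction(0)] * S
    for k in range(1, n + 1):
        # Multiply by 1/(1 - t/k) = sum_{m>=0} (t/k)^m, truncated at t^S.
        geometric = [Fraction(1, k**m) for m in range(S + 1)]
        coeffs = [sum(coeffs[i] * geometric[m - i] for i in range(m + 1))
                  for m in range(S + 1)]
    return coeffs

for n in range(1, 6):
    coeffs = series_product(n)
    assert coeffs[0] == 1                              # G_0(n) = 1
    assert all(coeffs[s] == G(s, n) for s in range(1, S + 1))
```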