This may be very late, but I would like to give my two cents on this question.
Suppose $\mu$ is a probability measure on $((0,\infty),\mathscr{B}((0,\infty)))$ and let $F(x):=\mu\big((0,x]\big)$. The Integrated Hazard Function $Q$ of $\mu$ is defined as
$$
Q(t)=\int_{(0,t]}\frac{1}{1-F(x-)}\mu(dx).
$$
The function $S(t):=1-F(t)$ is a right--continuous monotone nonincreasing function. $Q$ is a right--continuous monotone nondecreasing function whose associated (Lebesgue-Stieltjes) measure $\mu_Q\ll\mu$ satisfies
$$
\begin{align}
\mu_{Q}(\{x\})&=\Delta Q(x)=\frac{\Delta F(x)}{S(x-)}\\
\mu_{Q_c}(dx)&=\frac{1}{S(x-)}\mu_{F_c}(dx)\\
S(x-)\mu_Q(dx)&=\mu(dx)=\mu_F(dx).
\end{align}
$$
where $F_c$ and $Q_c$ are the continuous parts of $F$ and $Q$, respectively. Then,
$Q$ and $F$ have the same points of discontinuity $\{x_j:j\in I\}$, and since $S(t)=1-F(t)=1-\int_{(0,t]}\mu(dx)$,
$$
\begin{align}
S(t)=S(0)-\int_{(0,t]}S(x-)\mu_Q(dx)\tag{1}\label{one}
\end{align}
$$
We will show that $S$ is the unique solution to $\eqref{one}$ that is bounded on every bounded interval, and that
$$
S(t)=\exp\big(-Q_c(t)\big)\prod_{0<x_j\leq t}
(1-\Delta Q(x_j)).
$$
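Before turning to the proof, here is a quick numerical sanity check of this product formula (a sketch with a toy measure of my own choosing; none of the names below come from the question): take $\mu=\tfrac12\,\mathrm{Exp}(1)+\tfrac12\,\delta_1$, so the only jump of $F$ is at $x=1$.

```python
import math

# Toy measure (my choice, not from the post): mu = 0.5*Exp(1) + 0.5*delta_1,
# i.e. half an exponential law plus half a point mass at x = 1.

def F(x):                      # distribution function F(x) = mu((0, x])
    return 0.5 * (1 - math.exp(-x)) + (0.5 if x >= 1 else 0.0)

def S(x):                      # survival function S = 1 - F
    return 1.0 - F(x)

def S_left(x):                 # left limit S(x-)
    return 1.0 - (0.5 * (1 - math.exp(-x)) + (0.5 if x > 1 else 0.0))

def Qc(t, n=200_000):          # continuous part of Q, by midpoint quadrature
    h = t / n
    return sum(0.5 * math.exp(-(i + 0.5) * h) / S_left((i + 0.5) * h)
               for i in range(n)) * h

def S_formula(t):              # exp(-Q_c(t)) * (1 - Delta Q(1)) once t >= 1
    jump_factor = 1 - 0.5 / S_left(1.0) if t >= 1 else 1.0
    return math.exp(-Qc(t)) * jump_factor

for t in (0.5, 2.0):
    print(t, S(t), S_formula(t))
```

The two computed values of the survival function agree to quadrature accuracy both before the jump ($t=0.5$) and after it ($t=2$).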
The proof of this will be a consequence of the following theorem:
Theorem: Let $F$ be a right--continuous monotone nondecreasing function on $[0,\infty)$ and let $\mu_F$ be the unique measure on $(0,\infty)$ such that $\mu_F\big((a,b]\big)=F(b)-F(a)$. Let $\{x_j:j\in\mathbb{N}\}$ be the sequence of all discontinuities of $F$. If $v\in\mathcal{L}^{loc}_1(\mu_F)$ then, for any number $H_0\geq0$, the function
$$
\begin{align}
H(t)=H_0\exp\Big(\int_{(0,t]}v(x)\mu_{F_c}(dx)\Big)\prod_{0<x_j\leq t}(1+v(x_j)\Delta F(x_j))\tag{2}\label{expo-form}
\end{align}
$$
is the unique solution in $t\geq0$ of the integral equation
\begin{align}
\label{integro-exp}
H(t)=H(0)+\int_{(0,t]}H(x-)v(x)\mu_F(dx)
\end{align}
satisfying $\|H\mathbb{1}_{(0,t]}\|_u<\infty$ for all $t>0$.
The formula quoted in the OP is $\eqref{one}$, and existence and uniqueness are obtained by the Theorem above with $v\equiv-1$.
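As a concrete illustration of the theorem (again a toy choice of mine, not from the OP): take $F(x)=x$, so $\mu_F$ is Lebesgue measure, $F_c=F$, and there are no jumps, and take $v(x)=\cos x$ with $H_0=1$. Formula $\eqref{expo-form}$ then gives $H(t)=e^{\sin t}$, and the integral equation can be checked by quadrature:

```python
import math

# With F(x) = x (Lebesgue measure, purely continuous) and v(x) = cos(x),
# formula (2) gives H(t) = exp(sin t); check H(t) = 1 + int_(0,t] H(x) v(x) dx.
def H(t):
    return math.exp(math.sin(t))

def rhs(t, n=100_000):         # right-hand side, midpoint quadrature
    h = t / n
    s = sum(H((i + 0.5) * h) * math.cos((i + 0.5) * h) for i in range(n)) * h
    return 1.0 + s

for t in (1.0, 3.0):
    print(t, H(t), rhs(t))
```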
Since formula $\eqref{expo-form}$ appears often in applications to survival analysis and reliability theory, I think it is worthwhile to present a proof. This will be based entirely on Lebesgue integration by parts.
Preliminary notation:
For any real valued function $F$ on an interval $I$, denote by $\mu_F$ the Lebesgue-Stieltjes measure generated by $F$, so $\mu_F\big((a,b]\big)=F(b)-F(a)$ for all $[a,b]\subset I$.
Recall that for any real valued functions $F$, $G$ of locally finite variation on some interval $I$
$$
\int_{(a,b]} F(t)\,\mu_G(dt)=F(b)G(b)-F(a)G(a)-\int_{(a,b]}G(t-)\,\mu_F(dt)
$$
for all $[a,b]\subset I$. This formula may be denoted as
$$
d(FG)=F\,dG+ G_-\,dF
$$
where $G_-(t):=G(t-)$ and $dF(x):=\mu_F(dx)$, that is $dF\big((a,b]\big)=F(b)-F(a)$.
If $G$ is a continuous function of locally finite variation, then
$$ dG^n = n\,G^{n-1}\,dG$$
This can be shown by induction. For $n=1$ it is trivially valid. For $n\geq1$, using that $G_-=G$ since $G$ is continuous,
$$ d(G^{n+1})=d(G^n\,G)=G\,dG^n + G^n\,dG=nG^n\,dG+ G^n\,dG=(n+1) G^n\,dG$$
From this, we obtain the well known exponential formula for continuous measures:
$$\begin{align}
d e^G(t) = e^{G(t)}\,dG(t):= e^{G(t)}\,\mu_G(dt)\tag{3}\label{exp-for1}
\end{align}
$$
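A throwaway numerical check of $\eqref{exp-for1}$ with the continuous $G(t)=t^2$ (my own toy example): the claim amounts to $\int_{(0,t]}e^{G(x)}\,\mu_G(dx)=e^{G(t)}-e^{G(0)}$.

```python
import math

# Check int_(0,t] e^{G(x)} dG(x) = e^{G(t)} - e^{G(0)} for G(t) = t^2.
# Since G is absolutely continuous here, dG(x) = G'(x) dx = 2x dx.
def check(t, n=100_000):
    h = t / n
    integral = sum(math.exp(((i + 0.5) * h) ** 2) * 2 * ((i + 0.5) * h)
                   for i in range(n)) * h
    return integral, math.exp(t * t) - 1.0

print(check(1.0))
```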
A technical result:
Lemma:
Suppose $G$ is right--continuous nondecreasing in the interval $[0,T)$ $(0<T\leq\infty)$. Then, for any $n\in\mathbb{N}$
$$
\int_{(0,t]}G^{n-1}(s-)\mu_G(ds)\leq \frac{G^n(t)-G^n(0)}{n}\leq\int_{(0,t]}G^{n-1}(s)\mu_G(ds)
$$
for all $0<t<T$. (In differential notation, $nG^{n-1}_-dG\leq dG^n\leq nG^{n-1}dG$.)
Here is a short proof:
For $n\in\mathbb{N}$, $G^n$ is right--continuous and nondecreasing, and so the associated Lebesgue--Stieltjes measure $\mu_{G^n}$ is nonnegative. Repeated application of integration by parts gives
$$
\begin{align}
dG^n &= G^{n-1}_-\,dG + G\,dG^{n-1}=G^{n-1}_-\,dG + G (G^{n-2}_-\,dG + G\,dG^{n-2})\\
&= (G^{n-1}_-+GG^{n-2}_- +\ldots + G^{n-1})\,dG
\end{align}
$$
in differential notation. As $G(s-)\leq G(s)$ for all $0<s<T$, we conclude that
$$
n G^{n-1}_-\,dG \leq dG^n\leq n G^{n-1}\,dG
$$
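To get a feel for the Lemma, here is a toy computation (my own example) with the step function $G(t)=\lfloor t\rfloor$ on $(0,3]$ and $n=2$; the three quantities come out $3\leq 4.5\leq 6$.

```python
import math

# Toy check of the lemma: G(t) = floor(t) on (0, 3], so mu_G puts mass 1
# at the jump points 1, 2, 3.  Take n = 2.
G = math.floor
jumps = [1, 2, 3]
n, t = 2, 3

left  = sum(G(x - 1e-9) ** (n - 1) for x in jumps)   # int G(s-)^{n-1} mu_G(ds)
mid   = (G(t) ** n - G(0) ** n) / n                  # (G^n(t) - G^n(0)) / n
right = sum(G(x) ** (n - 1) for x in jumps)          # int G(s)^{n-1} mu_G(ds)
print(left, mid, right)   # 3 4.5 6
```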
Proof of main Theorem:
As $v\in \mathcal{L}^{loc}_1(\mu_F)$, we also have $v\in\mathcal{L}^{loc}_1(\mu_{F_I})$, where $F_I:=F-F_c$ is the pure--jump part of $F$, and so
$$
\|v\mathbb{1}_{(0,t]}\|_{\mathcal{L}_1(\mu_{F_I})}=\sum_{0<x_j\leq t}|v(x_j)|\Delta F(x_j)<\infty.
$$
Consequently $H$ is bounded on each compact subinterval of $[0,\infty)$.
Let
$$
\begin{align}
G_1(t)&=H_0\prod_{0<x_j\leq t}(1+v(x_j)\Delta F(x_j))\\
G_2(t)&=\exp\Big(\int_{(0,t]}v(x)\mu_{F_c}(dx)\Big).
\end{align}
$$
$G_1$ is a right--continuous pure--jump function of locally bounded variation which changes only at the points $x_j$; moreover,
$$
\begin{align}
\Delta G_1(x_j)=G_1(x_j)-G_1(x_j-)&=G_1(x_j-)\big(1+v(x_j)\Delta F(x_j)\big)-G_1(x_j-)\\
&= G_1(x_j-)v(x_j)\Delta F(x_j).
\end{align}
$$
$G_2$ is a continuous monotone nondecreasing function and
$$
\begin{align}
\mu_{G_2}(dx)&=\exp\Big(\int_{(0,x]}v(y)\mu_{F_c}(dy)\Big)v(x)\mu_{F_c}(dx)\\
&= G_2(x)v(x)\mu_{F_c}(dx).
\end{align}
$$
Applying the integration by parts formula to $H(t)=G_1(t)G_2(t)$ gives
$$
\begin{align}
H(t)-H(0)&=\int_{(0,t]}G_1(x-)\mu_{G_2}(dx)+\int_{(0,t]}G_2(x)\mu_{G_1}(dx)\\
&= \int_{(0,t]}G_1(x-)G_2(x)v(x)\mu_{F_c}(dx)+
\sum_{0<x_j\leq t}G_2(x_j)G_1(x_j-)v(x_j)\Delta F(x_j)\\
&= \int_{(0,t]}H(x-)v(x)\mu_{F_c}(dx)+\int_{(0,t]}H(x-)v(x)\mu_{F_I}(dx)\\
&=\int_{(0,t]}H(x-)v(x)\mu_F(dx).
\end{align}
$$
It remains to prove uniqueness. Suppose $H_1$ and $H_2$ are two such solutions and set $D:=H_1-H_2$. Fix $t>0$, let $M:=\|D\mathbb{1}_{(0,t]}\|_u$, and define $\Lambda(s):=\int_{(0,s]}|v(x)|\mu_F(dx)$. Then,
$$
|D(t)|\leq \int_{(0,t]}|D(x-)||v(x)|\mu_F(dx)\leq M\int_{(0,t]}|v(x)|\mu_{F}(dx)
= M\Lambda(t).
$$
As $\Lambda$ is nondecreasing and right continuous, $|D(x-)| \leq M\Lambda(x-)$. By the technical Lemma above
$$
\begin{align}
|D(t)|&\leq M\int_{(0,t]}\Lambda(x-) |v(x)|\mu_F(dx) = M\int_{(0,t]}\Lambda(x-)\mu_\Lambda(dx)\leq \frac{M}{2} \Lambda^2(t).
\end{align}
$$
Continuing by induction we obtain
$|D(t)|\leq \frac{M}{n!}\Lambda^n(t)$ for all $n\in\mathbb{N}$. Letting $n\rightarrow\infty$ gives $D(t)=0$; as $t>0$ was arbitrary, $H_1=H_2$.
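Just to see how quickly the bound $M\Lambda^n(t)/n!$ collapses, a throwaway computation with arbitrary illustrative values $M=1$, $\Lambda(t)=10$:

```python
import math

# The iterated bound |D(t)| <= M * Lambda(t)^n / n! tends to 0 for any
# fixed value of Lambda(t); the values M = 1, Lambda(t) = 10 are arbitrary.
M, Lam = 1.0, 10.0
for n in (1, 10, 50, 100):
    print(n, M * Lam ** n / math.factorial(n))
```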