We claim that the desired probability is
$$ \boxed{ 1 - e^{-\gamma} \approx 0.438541 }, $$
where $\gamma$ is the Euler–Mascheroni constant.
Step 1. Let $Z_1, Z_2, \ldots$ be i.i.d. $\text{Uniform}(0, 1)$-variables. Then the sum in question can be realized as
$$ S = \sum_{k=1}^{\infty} Z_1 \cdots Z_k. \tag{*} $$
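As a quick numerical sanity check (not part of the proof), one can truncate the series $\text{(*)}$ after $n$ terms and simulate; the discarded tail has mean $2^{-n}$, so the truncation error is negligible. A minimal NumPy sketch (the helper name `sample_S` and the parameters are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_S(n_samples, n_terms=60):
    """Sample S by truncating (*) after n_terms terms (the tail has mean 2**-n_terms)."""
    partial = np.ones(n_samples)   # running product Z_1 * ... * Z_k
    total = np.zeros(n_samples)    # running sum of the partial products
    for _ in range(n_terms):
        partial *= rng.random(n_samples)
        total += partial
    return total

S = sample_S(10**6)
print("Monte Carlo estimate of P(S > 1):", (S > 1).mean())    # ~ 0.4385
print("1 - exp(-gamma)                 :", 1 - np.exp(-np.euler_gamma))
```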
From the representation $\text{(*)}$, it is easy to check that
$$ S \stackrel{d}= U(1 + S), \tag{1} $$
where $U$ is a $\text{Uniform}(0, 1)$-variable independent of $S$ and $\stackrel{d}=$ stands for equality in distribution. So, if $F(\cdot)$ denotes the CDF of $S$, then for $x > 0$,
\begin{align*}
F(x)
= \mathbf{P}(S \leq x)
= \int_{0}^{1} \mathbf{P}(u(1+S) \leq x) \, \mathrm{d}u
\stackrel{(u=x/t)}= x \int_{x}^{\infty} t^{-2} F(t - 1) \, \mathrm{d}t. \tag{2}
\end{align*}
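As an aside, $\text{(1)}$ is easy to test empirically: samples of $S$ and of $U(1+S)$ should have approximately the same CDF. A small sketch, with the truncated-series sampler repeated inline so the block runs on its own:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_S(n, n_terms=60):
    # same truncated-series sampler as in the first snippet
    partial, total = np.ones(n), np.zeros(n)
    for _ in range(n_terms):
        partial *= rng.random(n)
        total += partial
    return total

n = 10**6
S = sample_S(n)
T = rng.random(n) * (1 + sample_S(n))   # U*(1+S) with U independent of S
for x in (0.5, 1.0, 2.0, 3.0):
    print(f"x = {x}:  P(S <= x) ~ {(S <= x).mean():.4f},  P(U(1+S) <= x) ~ {(T <= x).mean():.4f}")
```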
Step 2. Equation $\text{(2)}$ shows that $F$ is absolutely continuous. Moreover, for $x \in (0, 1]$ we have $F(t - 1) = 0$ on $[x, 1]$, so
$$ F(x) = x \int_{1}^{\infty} t^{-2} F(t - 1) \, \mathrm{d}t = F(1)x, $$
where the last equality follows by comparing with the case $x = 1$. Hence the density $f(x) = F'(x)$ satisfies $f(x) = F(1)$ for $x$ between $0$ and $1$.
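Numerically, the ratio $F(x)/x$ is indeed essentially constant on $(0, 1]$; a tiny check along the lines of the sampler above (the truncation parameters are again mine):

```python
import numpy as np

rng = np.random.default_rng(2)
partial, total = np.ones(10**6), np.zeros(10**6)
for _ in range(60):                 # truncated-series sampler for S, as before
    partial *= rng.random(10**6)
    total += partial
for x in (0.2, 0.4, 0.6, 0.8, 1.0):
    print(f"F({x}) / {x} ~ {(total <= x).mean() / x:.4f}")   # all ~ F(1) ~ 0.5615
```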
Step 3. Let $\mu_n = \mathbf{E}[S^n]$ be the $n$th moment of $S$, and let $M(\xi) = \mathbf{E}[e^{\xi S}] = \sum_{n=0}^{\infty} \frac{\mu_n}{n!} \xi^n$ denote the moment generating function of $S$. Then
\begin{align*}
\mu_n
= \mathbf{E}[S^n]
= \mathbf{E}[U^n (1 + S)^n]
= \mathbf{E}[U^n] \, \mathbf{E}[(1 + S)^n]
= \frac{1}{n+1} \sum_{k=0}^{n} \binom{n}{k} \mu_k.
\end{align*}
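Solving this for $\mu_n$ gives $\mu_n = \frac{1}{n}\sum_{k=0}^{n-1}\binom{n}{k}\mu_k$, which determines the moments one by one: $\mu_1 = 1$, $\mu_2 = 3/2$, $\mu_3 = 17/6$, and so on. A short sketch that computes them exactly and compares with Monte Carlo estimates (helper names and parameters are mine):

```python
from fractions import Fraction
from math import comb
import numpy as np

def exact_moments(n_max):
    # mu_n = (1/n) * sum_{k<n} C(n,k) mu_k, from the recursion above
    mu = [Fraction(1)]
    for n in range(1, n_max + 1):
        mu.append(sum(comb(n, k) * mu[k] for k in range(n)) / Fraction(n))
    return mu

mu = exact_moments(4)
print([str(m) for m in mu])                      # ['1', '1', '3/2', '17/6', '19/3']

rng = np.random.default_rng(3)                   # Monte Carlo comparison
partial, total = np.ones(10**6), np.zeros(10**6)
for _ in range(60):                              # truncated-series sampler, as before
    partial *= rng.random(10**6)
    total += partial
print([round(float(np.mean(total**n)), 3) for n in range(1, 5)])
print([round(float(m), 3) for m in mu[1:]])
```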
From this recursion (multiply both sides by $(n+1)\xi^{n}/n!$ and sum over $n \geq 0$), we find that $M(\cdot)$ solves the differential equation
$$ (\xi M(\xi))' = e^{\xi} M(\xi). $$
Together with the initial condition $M(0) = 1$, this yields
$$ M(\xi) = \exp \left( \int_{0}^{1} \frac{e^{\xi t} - 1}{t} \, \mathrm{d}t \right). \tag{3} $$
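A numerical cross-check of $\text{(3)}$, assuming SciPy is available: evaluate the right-hand side by quadrature and compare with the moment series $\sum_n \mu_n \xi^n / n!$ built from the recursion above (function names and the truncation level are mine):

```python
import numpy as np
from math import comb, factorial
from scipy.integrate import quad

def M_closed_form(xi):
    # exp( int_0^1 (e^{xi t} - 1)/t dt ); the integrand tends to xi as t -> 0
    val, _ = quad(lambda t: np.expm1(xi * t) / t if t > 0 else xi, 0.0, 1.0)
    return np.exp(val)

def M_moment_series(xi, n_max=60):
    mu = [1.0]                                   # moments from the recursion
    for n in range(1, n_max + 1):
        mu.append(sum(comb(n, k) * mu[k] for k in range(n)) / n)
    return sum(mu[n] * xi**n / factorial(n) for n in range(n_max + 1))

for xi in (-2.0, -0.5, 0.5, 2.0):
    print(xi, M_closed_form(xi), M_moment_series(xi))
```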
Step 4. Finally, the main claim will follow once we show that $\mathbf{P}(S \leq 1) = e^{-\gamma}$. Indeed,
\begin{align*}
\mathbf{P}(S \leq 1)
= F(1)
&= f(0^+) \tag{by Step 2} \\
&= \lim_{r \to \infty} \int_{0}^{\infty} r e^{-rt} f(t) \, \mathrm{d}t \\
&= \lim_{r \to \infty} r \mathbf{E}[e^{-rS}] \\
&= \lim_{r \to \infty} r \exp \biggl( \int_{0}^{1} \frac{e^{-r t} - 1}{t} \, \mathrm{d}t \biggr) \tag{by (3)} \\
&= \lim_{r \to \infty} \exp \biggl( \int_{0}^{r} \frac{e^{-s} - \mathbf{1}_{[0,1]}(s)}{s} \, \mathrm{d}s \biggr) \tag{$s=rt$} \\
&= \exp \biggl( \int_{0}^{\infty} \frac{e^{-s} - \mathbf{1}_{[0,1]}(s)}{s} \, \mathrm{d}s \biggr).
\end{align*}
It is not hard to show that the integral in the last line equals $-\gamma$ (this is a classical integral representation of the Euler–Mascheroni constant), and therefore the desired conclusion follows. $\square$
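For completeness, both the value of this integral and the limit $r \, \mathbf{E}[e^{-rS}] \to e^{-\gamma}$ are easy to confirm numerically (a sketch assuming SciPy):

```python
import numpy as np
from scipy.integrate import quad

# int_0^1 (e^{-s} - 1)/s ds + int_1^inf e^{-s}/s ds  should equal  -gamma
part1, _ = quad(lambda s: np.expm1(-s) / s if s > 0 else -1.0, 0.0, 1.0)
part2, _ = quad(lambda s: np.exp(-s) / s, 1.0, np.inf)
print(part1 + part2, -np.euler_gamma)            # both ~ -0.5772

# r * M(-r) = r * exp( int_0^1 (e^{-rt} - 1)/t dt )  should tend to exp(-gamma)
for r in (10.0, 100.0, 1000.0):
    expo, _ = quad(lambda t: np.expm1(-r * t) / t if t > 0 else -r, 0.0, 1.0, limit=200)
    print(r, r * np.exp(expo), np.exp(-np.euler_gamma))
```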
Extra Observations. Just for fun, here is an alternative description of the distribution of $S$. Let $\nu$ be a Poisson point process of unit intensity on $\mathbb{R}^2$, and let $N(A) = \nu(A\times[0,1])$ so that $N$ is a Poisson point process of unit intensity on $\mathbb{R}$. Then the sum $S$ defined in $\text{(*)}$ satisfies
$$ S
\stackrel{d}= \int_{(-\infty, 0]} e^{x} \, N(\mathrm{d}x)
= \int_{(-\infty, 0]\times[0, 1]} e^{x} \, \nu(\mathrm{d}x\mathrm{d}y). $$
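This representation is also easy to simulate directly: restrict to $[-T, 0]$ (the neglected part has mean $e^{-T}$), draw a $\text{Poisson}(T)$ number of points uniformly on $[-T, 0]$, and sum $e^{x}$ over the points. A sketch (the cutoff $T$ and the helper name are mine):

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_poisson_integral(n_samples, T=40.0):
    """Sample int_{(-T, 0]} e^x N(dx) for a unit-intensity Poisson process N."""
    out = np.empty(n_samples)
    counts = rng.poisson(T, size=n_samples)      # number of points in (-T, 0]
    for i, k in enumerate(counts):
        x = -T * rng.random(k)                   # point locations, uniform on (-T, 0]
        out[i] = np.exp(x).sum()
    return out

X = sample_poisson_integral(10**5)
print("P(integral <= 1):", (X <= 1).mean())      # ~ exp(-gamma) ~ 0.5615
print("mean            :", X.mean())             # ~ E[S] = 1
```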
Now there are two natural ways to embed $S$ into a stochastic process.
First, we may let the vertical extent of the domain of integration vary: define
$$ L(t) = \int_{(-\infty, 0]\times[0, t]} e^{x} \, \nu(\mathrm{d}x\mathrm{d}y). $$
Then $L$ is a subordinator with zero drift and Lévy measure $x^{-1}\mathbf{1}_{(0,1)}(x) \, \mathrm{d}x$ such that $L(1)$ has the same distribution as $S$. In particular, the distribution of $S$ is infinitely divisible. However, I am not sure whether this helps with the original problem.
Alternatively, we may let the horizontal extent of the domain vary: let
$$ X(t) = \int_{(-\infty, t]} e^{x-t} \, N(\mathrm{d}x). $$
Then $X$ is a stationary process with $X(t) \stackrel{d}= S$ for every $t$, and it solves an SDE of the form
$$ \mathrm{d}X(t) = -X(t) \, \mathrm{d}t + \mathrm{d}N(t). $$
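For illustration, here is a path-simulation sketch of this dynamics (the burn-in, horizon, and sampling grid are my own choices): between the points of $N$ the process decays as $e^{-t}$, and at each point it jumps by one; sampling a long path on a coarse time grid should reproduce the law of $S$.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate one long path of dX = -X dt + dN(t): exact exponential decay between
# jumps, and a jump of size 1 at each point of a rate-1 Poisson process.
T_total, burn_in, grid_dt = 2.0e5, 50.0, 1.0
t, X = 0.0, 0.0
next_grid, samples = grid_dt, []
while t < T_total:
    gap = rng.exponential(1.0)                   # waiting time to the next jump
    while next_grid <= t + gap:                  # record X at grid times inside the gap
        if next_grid > burn_in:
            samples.append(X * np.exp(-(next_grid - t)))
        next_grid += grid_dt
    X = X * np.exp(-gap) + 1.0                   # decay over the gap, then jump by 1
    t += gap

samples = np.array(samples)
print("P(X <= 1) along the path:", (samples <= 1).mean())   # ~ exp(-gamma)
print("time average of X       :", samples.mean())          # ~ E[S] = 1
```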
Slightly abusing notation, the SDE tells us that the MGF $M(\cdot)$ of $S$ satisfies
\begin{align*}
M(\xi)
&= \mathbf{E}[\exp(\xi X(t+\mathrm{d}t))] \\
&= \mathbf{E}[\exp(\xi (1-\mathrm{d}t)X(t) + \xi \, \mathrm{d}N(t))] \\
&= M(\xi - \xi \, \mathrm{d}t) (1 + (e^{\xi} - 1) \, \mathrm{d}t) \\
&= M(\xi) + [ (e^{\xi} - 1) M(\xi) - \xi M'(\xi) ] \, \mathrm{d}t
\end{align*}
and hence reproduces the differential equation for $M(\cdot)$,
$$ (e^{\xi} - 1) M(\xi) - \xi M'(\xi) = 0. $$
This approach also gives a more transparent view of why the CDF $F(\cdot)$ of $S$ solves the delay-type integral equation $\text{(2)}$ and hence is given by a different formula on each interval of the form $[k-1, k]$, $k = 1, 2, 3, \ldots$; this seems related to the occasional unit-size jumps of $N$.
To me, this hints that the process $X$ could be used to study the distribution of $S$ further, but again I do not have a concrete idea in this direction at the moment.