Let $X$, $Y$ and $Z$ be random variables taking values in the unit interval $[0,1]$, with probability densities $f_X, f_Y, f_Z$. I'm interested in the distribution of $$X\cdot Y + (1-X)\cdot Z,$$ which again takes values in $[0,1]$.
Below is a histogram of sampled values of $X\cdot Y + (1-X)\cdot Z$ for $X,Y,Z$ independent and uniformly distributed (its shape reminds me of a Shannon entropy function).
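For reference, a minimal sketch (plain NumPy, with hypothetical variable names) of how such a histogram can be sampled:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X, Y, Z independent and uniform on [0, 1]
x, y, z = rng.uniform(size=(3, n))

# W = X*Y + (1-X)*Z is a convex combination of Y and Z,
# so every sample lands in [0, 1]
w = x * y + (1 - x) * z

# normalized histogram on 50 equal bins over [0, 1]
density, edges = np.histogram(w, bins=50, range=(0.0, 1.0), density=True)
print(w.mean())
```

Note that $1-W = X(1-Y) + (1-X)(1-Z)$ has the same form with uniform inputs, so the distribution is symmetric about $1/2$.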
It turns out that the distribution of the product $X\cdot Y$ of just two variables is already tricky to compute, see Wikipedia, and none of the literature I found treats variables restricted to the unit interval. Okay, so let's first focus on that two-variable question and moreover assume $X$ and $Y$ are uniformly distributed; then I expect I have to compute something along the lines of
$$\int^{1} \int^{z/x} 1 \,\mathrm{d}y \,\mathrm{d}x$$
I left out the lower bounds of $0$ because they cause problems with the log. I suppose my first, much reduced question would thus be this one:
Let $X$ and $Y$ be random variables taking values in the interval $[\epsilon, 1]$, with probability densities $f_X, f_Y$. I'm interested in the distribution of $$X\cdot Y,$$ which takes values in $[\epsilon^2, 1]$.
In this thread such a question is asked and Mathematica is used to give a few expressions involving logs of $z$. I can imagine the result being something along the lines of $a-b\,\log(z)$, but I'm not sure how $z$ would end up inside a logarithm given the above kind of integral.
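Actually, I think I see where the log comes from in the plain uniform case on $[0,1]$: if I split the outer integral at $x=z$ (so that $z/x \le 1$ on the second piece), the computation seems to go through without any problematic lower bound:
$$\mathbb{P}(XY\le z)=\int_0^z\!\int_0^1 1\,\mathrm{d}y\,\mathrm{d}x+\int_z^1\!\int_0^{z/x} 1\,\mathrm{d}y\,\mathrm{d}x = z+z\log\tfrac{1}{z} = z-z\log z,$$
and differentiating gives the density $f_{XY}(z)=-\log z$ on $(0,1]$. The $\log$ enters through $\int_z^1 \frac{\mathrm{d}x}{x}$, and a lower bound of $0$ never appears inside a logarithm.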
Assuming we find this, how do we deal with the limit $\epsilon \to 0$? And then, for the 3-variable case, we might have to go back to the definition for the multivariate distribution case (Wikipedia), but first I need to resolve the $\log$ issue above.
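A sketch for the $\epsilon$-version, assuming $X, Y$ independent and uniform on $[\epsilon,1]$ (density $\frac{1}{1-\epsilon}$ each): the product-density formula $f_{XY}(z)=\int f_X(x)\,f_Y(z/x)\,\frac{\mathrm{d}x}{x}$ requires $x\in[\epsilon,1]$ and $z/x\in[\epsilon,1]$, i.e. $z \le x \le z/\epsilon$. For $z\in[\epsilon,1]$ this means $x$ runs over $[z,1]$, so
$$f_{XY}(z)=\frac{1}{(1-\epsilon)^2}\int_z^1 \frac{\mathrm{d}x}{x}=\frac{-\log z}{(1-\epsilon)^2}$$
(for $z<\epsilon$ the bounds become $[\epsilon, z/\epsilon]$ instead, but that region vanishes in the limit). As $\epsilon \to 0$ this tends pointwise to $-\log z$, so the limit seems unproblematic despite the divergence of the density at $z=0$.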
This StackExchange question discusses the case of $n$ uniformly distributed variables with values in $[0,1]$, for large $n$, in which case we get a Gaussian.
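I'm not sure whether that question concerns the sum or the product of the $n$ variables, but for the product there is a handy reformulation: $-\log X_i \sim \mathrm{Exp}(1)$ for uniform $X_i$, so $-\log\prod_i X_i \sim \Gamma(n,1)$, and the CLT then gives the Gaussian. A quick numerical sanity check (illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 30, 200_000

# -log of a Uniform(0,1) sample is Exponential(1), so each
# row sum below is a Gamma(n, 1) sample
u = rng.uniform(size=(trials, n))
s = -np.log(u).sum(axis=1)

# Gamma(n, 1) has mean n and variance n; for large n it is
# approximately Normal(n, n) by the central limit theorem
print(s.mean(), s.var())
```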