
I am given two random variables $X,Y$ which are independent and uniformly distributed on $[0,1]$. I need to compute the density of $X+Y$.

My idea was to compute $\Bbb{E}(\phi(X+Y))$, where $\phi$ is a measurable function, and then compare the density functions. So $$\Bbb{E}(\phi(X+Y))=\int_{\Bbb{R}^2} \phi(x+y)\cdot 1_{[0,1]}(x)\cdot 1_{[0,1]}(y)~~~\Bbb{P}(X+Y\in dx+dy)$$ Is this correct so far, i.e. does this idea work?

Now I need to split $\Bbb{P}(X+Y\in dx+dy)$, but I don't know whether $\Bbb{P}(X+Y\in dx+dy)=\Bbb{P}(X\in dx)+\Bbb{P}(Y\in dy)$ holds.

Could maybe someone help me?

Thanks for your help

  • Did you try searching the site first? See https://math.stackexchange.com/questions/220201/sum-of-two-uniform-random-variables and https://math.stackexchange.com/questions/1121641/probability-of-xy-which-are-two-independent-random-variable-uniform-distribut – Joe Mar 26 '22 at 23:11
  • A more general answer: https://stats.stackexchange.com/a/228431/147896, with a nice graphical explanation: you just have to transform the rectangle into a square to find a triangular distribution. – Jean Marie Mar 27 '22 at 07:23

2 Answers


It is incorrect that $$\mathbb{P}(X+Y \in dx +dy) = \mathbb{P}(X \in dx) + \mathbb{P}(Y \in dy).$$ Take the following example:

$X,Y \sim \text{Uniform}([0,1])$ and $X \perp Y$; then $X+Y$ takes values in $[0,1]+[0,1]=[0,2]$, so $$\mathbb{P}(X+Y \in [0,2]) = 1 \neq 2 = \mathbb{P}(X \in [0,1]) + \mathbb{P}(Y \in [0,1]).$$

For random variables $X,Y$ we have the following:

$$\mathbb{E}[X+Y] = \mathbb{E}[X]+\mathbb{E}[Y]$$ and $$\text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y) + 2\,\text{Cov}(X,Y)$$ and, if they are independent, $$f_{X+Y}(z) = \int_{-\infty}^\infty f_X(x)\, f_Y(z-x)\,dx$$

This last operation is called a convolution.

Can you continue from here?
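As a quick numerical sanity check (a minimal numpy sketch added here for illustration, not part of the original answer), one can compare a Monte Carlo histogram of $X+Y$ and a numerical evaluation of the convolution integral against the triangular density mentioned in the comments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo: sample X, Y ~ Uniform(0, 1) independently and histogram X + Y.
n = 1_000_000
z = rng.uniform(0.0, 1.0, n) + rng.uniform(0.0, 1.0, n)
hist, edges = np.histogram(z, bins=40, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Convolution: f_{X+Y}(z) = integral of f_X(x) * f_Y(z - x) dx.  For
# Uniform(0, 1) densities the integrand is the indicator of 0 <= z - x <= 1.
xs = np.linspace(0.0, 1.0, 2001)
f_conv = np.array([np.trapz(((0.0 <= c - xs) & (c - xs <= 1.0)).astype(float), xs)
                   for c in centers])

# Both agree (up to sampling and quadrature error) with the triangular
# density f(z) = z on [0, 1] and f(z) = 2 - z on [1, 2].
f_exact = np.where(centers < 1.0, centers, 2.0 - centers)
print(np.abs(hist - f_exact).max(), np.abs(f_conv - f_exact).max())  # both small
```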

oliverjones
  • Sorry, I have never seen the convolution. Why do we need this? Because in the one-dimensional case we also always computed $\Bbb{E}(\phi(X))$ and then compared the densities. – user1294729 Mar 26 '22 at 23:06
  • This is the density of the sum that you seek. – oliverjones Mar 26 '22 at 23:07
  • But why can't I do it as in the one-dimensional case? – user1294729 Mar 26 '22 at 23:09
  • You could find the density using the Moment Generating Function if that is what you are asking. – oliverjones Mar 26 '22 at 23:12
  • So I mean in the one-dimensional case we computed $E(\phi(X))=\int \phi(x) f(x)\,dx$, but on the other hand $E(\phi(X))=\int \phi(x)\, P_X(dx)$, so we could conclude that $P_X(dx)=f(x)\,dx$ and thus that this is the density. Now I thought one could do it the same way. – user1294729 Mar 26 '22 at 23:17
  • Yes, certainly this is possible, but you can find the density directly using the convolution, or using the Moment Generating Function. – oliverjones Mar 26 '22 at 23:24
  • Sorry, but I have never heard of convolution or the Moment Generating Function. Could you maybe give me a hint on how to do it in the way I wrote in the last comment? Because I don't see how to do this my way. It would be very nice if you could explain this a bit. – user1294729 Mar 26 '22 at 23:27
  • @aprozz, have you thought about using geometry to determine the area of the unit square corresponding to $X+Y<c$, which would be $F(c)$, then taking the derivative to find the PDF? – Joe Mar 26 '22 at 23:46
  • Learn to make web queries! Taking the keywords "density of sum uniform variables" will give you many answers like this one: https://math.stackexchange.com/q/357672/305862, informing you in particular that this density is the triangular function on $[0,2]$. – Jean Marie Mar 27 '22 at 00:05
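Working out the geometric approach suggested in the comments above (a short worked sketch, added for completeness): for $0 \leq c \leq 1$, the region $\{(x,y) \in [0,1]^2 : x+y < c\}$ is a right triangle with legs of length $c$, so $F(c) = \mathbb{P}(X+Y < c) = c^2/2$; for $1 \leq c \leq 2$, the complementary region $\{x+y > c\}$ is a right triangle with legs of length $2-c$, so $F(c) = 1 - (2-c)^2/2$. Differentiating gives the triangular density $f(c) = c$ on $[0,1]$ and $f(c) = 2-c$ on $[1,2]$.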

$X$ and $Y$ are independent and uniformly distributed random variables over the interval $[0, 1]$.

To find the PDF of $X + Y$, we define two random variables

$Z = X+ Y $ and $W = Y$.

Then the inverse transformation is:

$X = Z - W$ and $Y = W$.

The Jacobian of the transformation is $J = 1$ (easy calculation).
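Spelling out that calculation: with $x = z - w$ and $y = w$,

$$J = \det \begin{pmatrix} \partial x/\partial z & \partial x/\partial w \\ \partial y/\partial z & \partial y/\partial w \end{pmatrix} = \det \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix} = 1.$$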

Since $0 < X < 1$ and $0 < Y < 1$, the pair $(Z, W)$ has range

$ A = \{ (z, w) : 0 < z - w < 1, 0 < w < 1 \}$.

The joint PDF of $(Z, W)$ is obtained as

$f_{Z, W}(z, w) = f_{X, Y}(z - w, w)\, |J|$ for $(z, w) \in A$.

Since $X$ and $Y$ are uniformly distributed over the interval $[0, 1]$ and $|J| = 1$, we see that

$f_{Z, W}(z, w) = 1$ for $(z, w) \in A$.

That is,

$f_{Z, W}(z, w) = 1$ for $0 < z - w < 1$, $0 < w < 1$.

From this, we obtain the marginal density of $Z = X + Y$ as

$f_Z(z) = \begin{cases} z & \text{for } 0 < z < 1 \\ 2 - z & \text{for } 1 \leq z < 2 \\ 0 & \text{elsewhere} \end{cases}$

(A picture of the region $A$ is useful for understanding the integration, as we need to break the integral in the calculation of the marginal density of $Z$ into two regions: (1) $0 < z < 1$ and (2) $1 \leq z < 2$.)
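Explicitly: for $0 < z < 1$, the constraints $0 < z - w < 1$ and $0 < w < 1$ reduce to $0 < w < z$, so $f_Z(z) = \int_0^z 1 \, dw = z$; for $1 \leq z < 2$, they reduce to $z - 1 < w < 1$, so $f_Z(z) = \int_{z-1}^1 1 \, dw = 2 - z$.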

Dr. Sundar