Suppose $X$ and $Y$ are iid random variables, each uniformly distributed on $[0,1]$, and let $Z = X + Y$.
Reading through other similar questions, such as density of sum of two uniform random variables $[0,1]$, I'm still struggling to understand how exactly the interval $0 < z < 2$ is reached.
It seems to me that just to start this problem it is necessary to determine the range of $z$. I assume that since $X$ and $Y$ can each only take values in $[0,1]$, the sum $z$ can be at most $2$ (and at least $0$). Is this the correct way to think about it? More specifically, I'm asking about the need for two cases, $0 < z < 1$ and $1 < z < 2$, and how they are obtained.
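For what it's worth, I tried a quick simulation to sanity-check my reasoning. This is just my own sketch (the piecewise `pdf` below is the triangular density I believe is the answer, namely $f(z) = z$ on $0 < z < 1$ and $f(z) = 2 - z$ on $1 < z < 2$, which I'm treating as an assumption to verify):

```python
import random

# Claimed (assumed) density of Z = X + Y for X, Y iid Uniform[0,1]:
#   f(z) = z      for 0 < z <= 1
#   f(z) = 2 - z  for 1 < z < 2
def pdf(z):
    return z if z <= 1 else 2 - z

random.seed(0)
n = 200_000
samples = [random.random() + random.random() for _ in range(n)]

# The two cases split the range [0, 2] at z = 1; by symmetry each
# half should carry probability 1/2 (the integral of z on (0,1)).
p_left = sum(s <= 1 for s in samples) / n
print(f"P(Z <= 1) ~ {p_left:.3f}")

# Spot-check one bin against the integral of f(z) = z on (0.4, 0.5),
# which is (0.5^2 - 0.4^2)/2 = 0.045.
p_bin = sum(0.4 < s < 0.5 for s in samples) / n
print(f"P(0.4 < Z < 0.5) ~ {p_bin:.3f}")
```

The empirical frequencies come out close to $1/2$ and $0.045$ respectively, which at least matches the triangular shape, but it doesn't tell me *why* the derivation splits into the two cases.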