(a). Recall that $U_1, U_2, U_3$ are i.i.d. uniform on $(0, 1)$, $X = \min(U_1, U_2)$, and $Y = \max(U_2, U_3)$. The joint distribution function $F(x, y) := P(X < x, Y < y)$ takes a different form depending on where $(x, y)$ lies:
- If $x > 1$ and $y > 1$, then clearly $F(x, y) = 1$, since $X, Y \in (0, 1)$ a.s.
- If $x \leq 0$ or $y \leq 0$, then $F(x, y) = 0$, for the same reason.
- If $0 < x \leq 1$ and $y > 1$, then
$$F(x, y) = P(X < x) = 1 - P(\min(U_1, U_2) \geq x) = 1 - (1 - x)^2 = 2x - x^2.$$
- If $x > 1$ and $0 < y \leq 1$, then
$$F(x, y) = P(Y < y) = P(U_2 < y)P(U_3 < y) = y^2. $$
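Both marginal cases can be sanity-checked by simulation. Below is a minimal Monte Carlo sketch assuming the setup of the problem ($U_1, U_2, U_3$ i.i.d. uniform on $(0,1)$, $X = \min(U_1, U_2)$, $Y = \max(U_2, U_3)$); the test points and sample size are arbitrary.

```python
# Monte Carlo check of the two marginal formulas above.
import numpy as np

rng = np.random.default_rng(0)
u1, u2, u3 = rng.random((3, 10**6))  # i.i.d. Uniform(0,1) samples
x_samp = np.minimum(u1, u2)          # X = min(U1, U2)
y_samp = np.maximum(u2, u3)          # Y = max(U2, U3)

x0, y0 = 0.3, 0.7                    # arbitrary test points
print((x_samp < x0).mean(), 2*x0 - x0**2)  # P(X < x0) vs. 2x - x^2
print((y_samp < y0).mean(), y0**2)         # P(Y < y0) vs. y^2
```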
The only non-trivial case is $0 < x \leq 1$ and $0 < y \leq 1$. Conditioning on $U_2$ and using the independence assumption:
\begin{align}
& P(X < x, Y < y) = P(\min(U_1, U_2) < x, \max(U_2, U_3) < y) \\
=& \int_0^1 P(\min(U_1, z) < x, \max(z, U_3) < y)dz \tag{1} \\
=& \int_0^1 [P(\max(z, U_3) < y) - P(\min(U_1, z) \geq x, \max(z, U_3) < y)]dz \\
=& \int_0^1 P(\max(z, U_3) < y)dz - \int_0^1P(\min(U_1, z) \geq x, \max(z, U_3) < y)dz \\
=& \int_0^y P(U_3 < y) dz - \int_0^1P(\min(U_1, z) \geq x, \max(z, U_3) < y)dz \\
=:& y^2 - I
\end{align}
For the integral $I$: the integrand is nonzero only when $x \leq z < y$, so if $x \geq y$ then $I = 0$ and $F(x, y) = y^2$. If $x < y$, then
\begin{align*}
I = \int_0^1P(\min(U_1, z) \geq x, \max(z, U_3) < y)dz
= \int_x^y P(U_1 \geq x, U_3 < y) dz = (1 - x)y(y - x).
\end{align*}
It then follows that $F(x, y) = y^2 - y(1 - x)(y - x) = xy + xy^2 - x^2y$. For the justification of the conditioning step in equation $(1)$, refer to Theorem 20.3 of Probability and Measure by P. Billingsley.
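The algebra in the last step can also be confirmed symbolically; here is a small sympy sketch (the symbol names are mine) that recomputes $I$ and the expansion of $y^2 - I$ on the region $0 < x < y$:

```python
# Symbolic check of I and of F(x, y) = y^2 - I for 0 < x < y <= 1.
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
I = sp.integrate((1 - x) * y, (z, x, y))   # integrand is constant in z
assert sp.simplify(I - (1 - x)*y*(y - x)) == 0
assert sp.expand(y**2 - I) == sp.expand(x*y + x*y**2 - x**2*y)
```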
The above calculations can then be summarized as
\begin{align*}
F(x, y) = \begin{cases}
0 & x \leq 0 \text{ or } y \leq 0, \\
xy + xy^2 - x^2y & 0 < x < y \leq 1, \\
y^2 & 0 < y \leq 1 \text{ and } x \geq y, \\
2x - x^2 & 0 < x \leq 1 \text{ and } y > 1, \\
1 & x > 1 \text{ and } y > 1.
\end{cases}
\end{align*}
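As a further sanity check, the piecewise formula can be compared against empirical joint frequencies; the sketch below (same assumed setup as before, with one arbitrary test point per region) should print matching pairs up to Monte Carlo error.

```python
# Compare the closed-form F against empirical frequencies of {X < x, Y < y}.
import numpy as np

def F(x, y):
    if x <= 0 or y <= 0:
        return 0.0
    if y <= 1:
        return x*y + x*y**2 - x**2*y if x < y else y**2
    return 2*x - x**2 if x <= 1 else 1.0

rng = np.random.default_rng(1)
u1, u2, u3 = rng.random((3, 10**6))
xs, ys = np.minimum(u1, u2), np.maximum(u2, u3)
for x0, y0 in [(0.2, 0.8), (0.8, 0.2), (0.5, 1.5), (1.5, 0.5), (1.5, 1.5)]:
    print(((xs < x0) & (ys < y0)).mean(), F(x0, y0))
```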
(c). Part (c) is different from (b): the quantity in (b) equals $E[I_{\{X = Y\}}]$, i.e., the same expectation without the factor $X$. To evaluate $E[XI_{\{X = Y\}}]$, note that $XI_{\{X = Y\}}$ is a non-negative random variable bounded by $1$, so
\begin{align}
E[XI_{\{X = Y\}}] = \int_0^\infty P[XI_{\{X = Y\}} > t] dt = \int_0^1 P[XI_{\{X = Y\}} > t] dt = \int_0^1 P[X > t, X = Y] dt. \tag{2}
\end{align}
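Identity $(2)$ itself can be checked numerically before evaluating the integrand: the sketch below compares the sample mean of $XI_{\{X = Y\}}$ with a Riemann approximation of $\int_0^1 P[XI_{\{X = Y\}} > t]\,dt$ (the grid size is an arbitrary choice; the floating-point equality test is exact here because on $\{X = Y\}$ both sides equal $U_2$).

```python
# Layer-cake check: E[W] vs. the integral of the empirical tail of W.
import numpy as np

rng = np.random.default_rng(2)
u1, u2, u3 = rng.random((3, 10**5))
x, y = np.minimum(u1, u2), np.maximum(u2, u3)
w = x * (x == y)                      # W = X * 1{X = Y}
ts = np.linspace(0, 1, 500, endpoint=False)
tail = np.array([(w > t).mean() for t in ts])
print(w.mean(), tail.mean())          # tail.mean() ~ left Riemann sum on [0, 1]
```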
The probability in the integrand can be evaluated in the same manner as $F(x, y)$, applying the theorem behind $(1)$ again to condition on $U_2$. Note that for fixed $s$, the event $\{\min(U_1, s) = \max(s, U_3)\}$ coincides a.s. with $\{U_1 \geq s, U_3 \leq s\}$ (on which both sides equal $s$), and that the integrand below vanishes for $s \leq t$ because then $\min(U_1, s) \leq t$:
\begin{align*}
& P[X > t, X = Y] = P[\min(U_1, U_2) > t, \min(U_1, U_2) = \max(U_2, U_3)] \\
=& \int_0^1 P[\min(U_1, s) > t, U_1 \geq s, U_3 \leq s] ds \\
=& \int_t^1 P[U_1 > t, U_1 \geq s, U_3 \leq s] ds \\
=& \int_t^1 P[U_1 \geq s, U_3 \leq s]ds \\
=& \int_t^1 s(1 - s)ds = \frac{1}{6} - \frac{1}{2}t^2 + \frac{1}{3}t^3.
\end{align*}
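This closed form can be spot-checked by simulation at an arbitrary $t$ (same assumed setup; the exact equality test is again safe since both sides equal $U_2$ on $\{X = Y\}$):

```python
# Monte Carlo check of P[X > t, X = Y] = 1/6 - t^2/2 + t^3/3 at t = 0.4.
import numpy as np

rng = np.random.default_rng(3)
u1, u2, u3 = rng.random((3, 10**6))
x, y = np.minimum(u1, u2), np.maximum(u2, u3)
t = 0.4
print(((x > t) & (x == y)).mean(), 1/6 - t**2/2 + t**3/3)
```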
Substituting this into $(2)$ gives
\begin{align*}
E[XI_{\{X = Y\}}] = \int_0^1 \left(\frac{1}{6} - \frac{1}{2}t^2 + \frac{1}{3}t^3
\right) dt = \frac{1}{12}.
\end{align*}
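The value $1/12 \approx 0.0833$ also agrees with a direct Monte Carlo estimate (same assumed setup; sample size arbitrary):

```python
# Direct simulation of E[X * 1{X = Y}]; both printed numbers should be close.
import numpy as np

rng = np.random.default_rng(4)
u1, u2, u3 = rng.random((3, 10**6))
x, y = np.minimum(u1, u2), np.maximum(u2, u3)
print((x * (x == y)).mean(), 1/12)
```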