I am trying to solve problems from my textbook, but I am stuck on this one.
We take two points $X$ and $Y$ on the interval $(0, 2)$. What is the probability of the event $A = \{X + Y^2 > 0.1\}$?
My approach is to rearrange the inequality to get $Y > \sqrt{0.1 - X}$. The domain of the function $Y = \sqrt{0.1 - X}$ is $(-\infty, 0.1]$, so I compute the area of the event $A$ as $0.1 \times 2 - \int_{0}^{0.1}\sqrt{0.1 - x}\,dx$. Then, to obtain the probability, I divide this by the area of $\Omega$, which is $2 \times 2$. But my answer is not correct.
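To sanity-check my answer numerically, I also ran a quick Monte Carlo simulation (a minimal sketch, assuming $X$ and $Y$ are independent and uniformly distributed on $(0, 2)$, which is how I read the problem):

```python
import random

# Monte Carlo estimate of P(X + Y^2 > 0.1),
# assuming X, Y independent and uniform on (0, 2).
N = 1_000_000
hits = sum(
    1 for _ in range(N)
    if random.uniform(0, 2) + random.uniform(0, 2) ** 2 > 0.1
)
print(hits / N)  # empirical probability of the event A
```

The simulated value does not match the probability I get from my area calculation above, so I suspect the mistake is in how I set up the area of $A$.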
Where am I making a mistake?
Many thanks.