
I am trying to solve problems from my textbook, but I am stuck on this one.

We take two points $X$ and $Y$ on the interval $(0, 2)$. What is the probability of the event $A = \{X + Y^2 \gt 0.1\}$?

My approach is to rearrange the inequality, which gives $Y \gt \sqrt{0.1 - X}$. The domain of the function $Y = \sqrt{0.1 - X}$ is $(-\infty, 0.1]$, so I would compute the area of the event $A$ as $0.1 \times 2 - \int_{0}^{0.1}\sqrt{0.1 - x}\,dx$. Then, to obtain the probability, I would divide by the area of $\Omega$, which is $2 \times 2 = 4$. But my answer is not correct.

[Picture of area $A$]
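
For a numerical check, a quick Monte Carlo sketch like the one below (numpy assumed; the seed and sample size are arbitrary choices) should estimate the probability of $A$ directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Draw X and Y independently and uniformly from (0, 2)
x = rng.uniform(0.0, 2.0, size=n)
y = rng.uniform(0.0, 2.0, size=n)

# Estimate P(X + Y^2 > 0.1) as the fraction of samples in the event A
p_hat = np.mean(x + y**2 > 0.1)
print(p_hat)
```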

Where did I make a mistake?

Many thanks,

Janey
  • To get braces in MathJax you have to escape them with a backslash, because they are used for grouping multicharacter things. To get your set you would write \{X+Y^2 \gt 0.1\} to get $\{X+Y^2 \gt 0.1\}$. It looks much nicer that way. – Ross Millikan Nov 05 '17 at 22:01

1 Answer


The integral you computed gives the area where $X+Y^2 \lt 0.1$. You should subtract that from the area of the whole $2 \times 2$ square, then divide by $4$ to get the probability. You have missed all of the area where $X \gt 0.1$, which lies entirely inside $A$.
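
Carrying out that computation explicitly, since $\int_0^{0.1}\sqrt{0.1-x}\,dx = \frac{2}{3}(0.1)^{3/2}$, this gives
$$P(A) = 1 - \frac{1}{4}\cdot\frac{2}{3}(0.1)^{3/2} = 1 - \frac{(0.1)^{3/2}}{6} \approx 0.9947,$$
which a quick simulation should reproduce.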

Ross Millikan