
Let $(\Omega,\Sigma,P)$ be a probability space and let $X\colon \Omega \to \mathbb R$ and $Y\colon \Omega \to \mathbb R$ be continuous random variables with density functions $f_X$ and $f_Y$, respectively. I would like to precisely define the conditional distribution of $X$ given that $Y = y$, where $y \in \mathbb R$ and $f_Y(y) > 0$. The difficulty in doing this is that the event $Y = y$ has probability $0$.

As a first step, we can attempt to define $P(A \mid Y = y)$, where $A \in \Sigma$ is an event. A key idea is to note that while the event $Y = y$ has measure $0$, the event $Y \in [y, y + \Delta y]$ has positive probability for any number $\Delta y > 0$ (at least when $f_Y$ is continuous and positive at $y$). This suggests that we can define $$ \tag{1} P(A \mid Y = y) = \lim_{\Delta y \to 0} P(A \mid Y \in [y,y + \Delta y]). $$ Question: Does the limit on the right exist? If so, is the function $P(\cdot \mid Y = y)$ defined in equation (1) a probability measure?

Additional question: Is this the standard approach to rigorously defining the conditional distribution of $X$ given that $Y = y$? If not, what is the standard approach? Please recommend a book that explains this clearly, as it is glossed over in most probability books I have looked at.

1 Answer


This is not the most standard approach, since it is limited to the case where the density is positive everywhere, and even then it can have issues; look up the Borel-Kolmogorov paradox (how you take the limit can change the object you end up with).
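That said, when $(X, Y)$ has a joint density $f_{X,Y}$ that is continuous at $y$ (an assumption beyond what the question states), the limit in $(1)$ does exist for events of the form $A = \{X \in B\}$ and recovers the familiar conditional density:

$$ P(X \in B \mid Y \in [y, y + \Delta y]) = \frac{\int_B \int_y^{y+\Delta y} f_{X,Y}(x,t)\,dt\,dx}{\int_y^{y+\Delta y} f_Y(t)\,dt} \;\longrightarrow\; \int_B \frac{f_{X,Y}(x,y)}{f_Y(y)}\,dx \quad (\Delta y \to 0), $$

so $f_{X \mid Y}(x \mid y) = f_{X,Y}(x,y)/f_Y(y)$, as in the elementary treatment. The Borel-Kolmogorov paradox arises precisely because a different thickening of the event $\{Y = y\}$ can produce a different limit.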

If you want a reference, you can read Section 5.1.3 of Durrett's Probability: Theory and Examples (version 4.1) on regular conditional probabilities, which are the standard way to make conditional probability distributions rigorous. There is also a nice answer on StackExchange: Formal definition of conditional probability
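As an informal sanity check of definition $(1)$ (not a substitute for the measure-theoretic construction), here is a small Monte Carlo sketch. The bivariate normal with correlation $\rho$, for which $E[X \mid Y = y] = \rho y$, is my own choice of example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bivariate normal with correlation rho: X = rho*Y + sqrt(1 - rho^2)*Z
rho, y, dy, n = 0.8, 1.0, 0.05, 2_000_000
Y = rng.standard_normal(n)
X = rho * Y + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Condition on the thickened event {Y in [y, y + dy]},
# which has positive probability, unlike {Y = y}.
mask = (Y >= y) & (Y <= y + dy)
cond_mean = X[mask].mean()

# Theory: E[X | Y = y] = rho * y = 0.8; the estimate over the
# thickened event should be close for small dy.
print(cond_mean)
```

Shrinking `dy` makes the thickened event a better stand-in for $\{Y = y\}$, at the cost of fewer samples falling in the conditioning window.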

E-A