
Suppose we have independent random variables $X_0,X_1,\ldots,X_{n+1}$ with distributions $P_0,\ldots,P_{n+1}$, and let $S_k$ denote the sum $X_0+\cdots+X_k$.

I would like to motivate the following calculation:

$$P\{X_0 \in B_0,\ldots, X_n \in B_n, S_{n+1}\in B \} =\int_{B_0\times\cdots\times B_n}P_{n+1}(\{y:y+(x_0+\cdots+x_n)\in B \})\,P_0(dx_0)\cdots P_n(dx_n)=\int_{\{X_0 \in B_0,\ldots,X_n \in B_n\}}P_{n+1}(\{y:y+S_n(\omega) \in B \})\,P(d\omega) $$

Thus the first thing would be to verify that the map $(x_0,\ldots,x_n) \mapsto P_{n+1}(\{y:y+(x_0+\cdots+x_n)\in B \})$ is $\mathcal B(\mathbb R^{n+1})$-$\mathcal B(\mathbb R)$-measurable. I'm not sure how to do this.

Then I believe the integral calculation is motivated by some identification that $\omega \in \{X_0 \in B_0,\ldots, X_n \in B_n, S_{n+1}\in B \}$ if and only if $\omega \in \{X_0 \in B_0,\ldots, X_n \in B_n \}\cap \{\omega:X_0(\omega)+\cdots+X_{n+1}(\omega) \in B \}$, but then I get stuck relating the probability $P$ of this set to the distribution $P_{n+1}$ on $\mathbb R$.

Many thanks in advance!

1 Answer


Let's start with the measurability of the integrand.

Lemma: Let $\mu$ be a probability measure on $\mathbb{R}^n$. Then $$x \mapsto \mu(\{y \in \mathbb{R}^n; y+x \in B\})$$ is Borel measurable for any Borel set $B$.

Proof: Let us first assume that $B$ is an open set. By Urysohn's lemma, there exists a sequence of continuous bounded functions $(f_k)_{k \in \mathbb{N}}$ such that $f_k \uparrow 1_B$, i.e. $f_1 \leq f_2 \leq \ldots$ and $1_B = \sup_{k \geq 1} f_k$. It follows from the dominated convergence theorem that $$x \mapsto \int f_k(x+y) \, \mu(dy)$$ is continuous, hence Borel measurable, for each $k \geq 1$. Consequently, $$\mu(\{y; x+y \in B\}) = \int 1_B(x+y) \, \mu(dy) = \sup_{k \in \mathbb{N}} \int f_k(x+y) \, \mu(dy)$$ is Borel measurable as a supremum of Borel measurable functions. This proves the assertion for open sets. To extend the assertion to general Borel sets $B$ we use a monotone class (Dynkin system) argument: check that $$\Sigma := \{B \in \mathcal{B}(\mathbb{R}^n); x \mapsto \mu(\{y; x+y \in B\}) \, \, \text{is Borel measurable}\}$$ is a Dynkin system, i.e. it contains $\mathbb{R}^n$, it is stable under complements (since $\mu(\{y; x+y \in B^c\}) = 1 - \mu(\{y; x+y \in B\})$), and it is stable under countable disjoint unions (by countable additivity of $\mu$). Since the open sets form a $\pi$-system which is contained in $\Sigma$ and generates $\mathcal{B}(\mathbb{R}^n)$, Dynkin's $\pi$-$\lambda$ theorem entails $\Sigma = \mathcal{B}(\mathbb{R}^n)$.
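
For concreteness, one possible choice of such an approximating sequence (it is not the only one), writing $d(x, B^c) := \inf_{z \in B^c} |x - z|$ for the Euclidean distance of $x$ to the complement of $B$, is $$f_k(x) := \min\big(1, \, k \, d(x, B^c)\big), \qquad k \in \mathbb{N}$$ (if $B = \mathbb{R}^n$, simply take $f_k \equiv 1$). Each $f_k$ is bounded by $1$ and Lipschitz continuous, the sequence is non-decreasing, $f_k \equiv 0$ on $B^c$, and for $x \in B$ we have $d(x, B^c) > 0$ because $B$ is open, so $f_k(x) \uparrow 1$; altogether $f_k \uparrow 1_B$.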


Choose $\mu = P_{n+1}$, i.e. $\mu$ is the distribution of $X_{n+1}$; note that $P_{n+1}$ is a probability measure on $\mathbb{R}$, so the lemma applies with dimension $1$. Since

$$(x_0,\ldots,x_n) \mapsto S(x_0,\ldots,x_n) := x_0+\ldots+x_n$$ is Borel measurable, it follows from the above lemma that $$(x_0,\ldots,x_n) \mapsto P_{n+1}(\{y; y+x_0+\ldots+x_n \in B\}) = P_{n+1}(\{y; y+S(x_0,\ldots,x_n) \in B\})$$ is Borel measurable as a composition of Borel measurable mappings.

To derive the integral equation we need the transformation formula which states that

$$\mathbb{E}f(Y) = \int f(y) \, dP_Y(y) \tag{1}$$

for any Borel measurable bounded function $f$, where $P_Y$ denotes the distribution of $Y$. Note that this identity holds not only for real-valued random variables $Y$ but also for random vectors, i.e. random variables taking values in $\mathbb{R}^k$ for some $k \geq 1$. Applying $(1)$ with $$f(y_0,\ldots,y_{n+1}) := 1_B(y_0+\ldots+y_{n+1}) \prod_{j=0}^n 1_{B_j}(y_j) $$ and $$Y := (X_0,\ldots,X_{n+1})$$ we get

\begin{align*} \mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B) &= \mathbb{E}f(Y) \\ &= \int_{\mathbb{R}^{n+2}} f(y) \, dP_Y(y) \tag{2} \end{align*} Since the random variables $X_0,\ldots,X_{n+1}$ are independent, the distribution of $Y=(X_0,\ldots,X_{n+1})$ equals

$$dP_Y(y_0,\ldots,y_{n+1}) = dP_{X_0}(y_0) \cdots dP_{X_{n+1}}(y_{n+1}). $$ Plugging this into $(2)$ and using the very definition of $f$ we get

\begin{align*}& \mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B) \\ = {} & \int_{\mathbb{R}} \ldots \int_{\mathbb{R}} f(y_0,\ldots,y_{n+1}) \, dP_{X_0}(y_0) \cdots dP_{X_{n+1}}(y_{n+1}) \\ = {} & \int_{\mathbb{R}} \ldots \int_{\mathbb{R}} 1_B(y_0+\ldots+y_{n+1}) \prod_{j=0}^n 1_{B_j}(y_j) \, dP_{X_0}(y_0) \cdots dP_{X_{n+1}}(y_{n+1}). \end{align*}
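
For orientation, consider the smallest case $n = 0$: there are two random variables, $Y = (X_0, X_1)$ and $f(y_0, y_1) = 1_B(y_0 + y_1) \, 1_{B_0}(y_0)$, and the last display reads $$\mathbb{P}(X_0 \in B_0, S_1 \in B) = \int_{\mathbb{R}} \int_{\mathbb{R}} 1_B(y_0 + y_1) \, 1_{B_0}(y_0) \, dP_{X_0}(y_0) \, dP_{X_1}(y_1).$$ The remaining steps are exactly this computation carried out with $n+2$ variables instead of two.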

Since the integrand is non-negative and measurable, Tonelli's theorem allows us to rearrange the order of integration. To obtain the identity we are looking for, we write

\begin{align*}& \mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B) \\ = {} & \int_{\mathbb{R}} \ldots \int_{\mathbb{R}} 1_B(y_0+\ldots+y_{n+1}) \prod_{j=0}^n 1_{B_j}(y_j) \, dP_{X_{n+1}}(y_{n+1}) \,dP_{X_0}(y_0) \cdots dP_{X_n}(y_n) \tag{3} \end{align*}

For the innermost integral we have

\begin{align*} &\int_{\mathbb{R}} 1_B(y_0+\ldots+y_{n+1}) \prod_{j=0}^n 1_{B_j}(y_j) \, dP_{X_{n+1}}(y_{n+1}) \\ = {} & \prod_{j=0}^n 1_{B_j}(y_j) \int_{\mathbb{R}} 1_B(y_0+\ldots+y_{n+1}) \, dP_{X_{n+1}}(y_{n+1}) \\ = {} & \prod_{j=0}^n 1_{B_j}(y_j) P_{X_{n+1}}(\{y; y_0+\ldots+y_n + y \in B\}). \end{align*}
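
Just to illustrate this factor in a concrete (purely hypothetical) special case: if $X_{n+1} \sim N(0,1)$ and $B = (a, b]$, then, writing $s := y_0 + \ldots + y_n$, $$P_{X_{n+1}}(\{y; s + y \in B\}) = P_{X_{n+1}}((a - s, b - s]) = \Phi(b - s) - \Phi(a - s),$$ where $\Phi$ denotes the standard normal distribution function. In particular, the factor is a deterministic Borel function of $(y_0,\ldots,y_n)$, which is what the lemma guarantees in general.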

Combining this with $(3)$ gives

\begin{align*}& \mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B) \\ = {} & \int_{\mathbb{R}} \ldots \int_{\mathbb{R}} \prod_{j=0}^n 1_{B_j}(y_j) P_{X_{n+1}}(\{y; y_0+\ldots+y_n + y \in B\}) \,dP_{X_0}(y_0) \cdots dP_{X_{n}}(y_{n}) \\ = {} & \int_{B_0 \times \ldots \times B_n} P_{X_{n+1}}(\{y; y_0+\ldots+y_n + y \in B\}) \,dP_{X_0}(y_0) \cdots dP_{X_n}(y_n) \end{align*}

which proves the first identity.

For the second identity we note that, by the tower property of conditional expectation,

\begin{align*} &\mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B) \\ = {} & \mathbb{E} \bigg[ \mathbb{E} \left( 1_B(S_n+ X_{n+1}) \prod_{j=0}^n 1_{B_j}(X_j) \mid \sigma(X_0,\ldots,X_n) \right) \bigg] \\ = {} & \mathbb{E} \left[ \prod_{j=0}^n 1_{B_j}(X_j) \mathbb{E} \left( 1_B(S_n + X_{n+1}) \mid \sigma(X_0,\ldots,X_n) \right) \right]. \tag{4} \end{align*} In the last step we pulled out the $\sigma(X_0,\ldots,X_n)$-measurable factor $\prod_{j=0}^n 1_{B_j}(X_j)$.

As $X_{n+1}$ is independent of $\sigma(X_0,\ldots,X_n)$ and $S_n$ is $\sigma(X_0,\ldots,X_n)$-measurable we have

\begin{align*} \mathbb{E} \left( 1_B(S_n + X_{n+1}) \mid \sigma(X_0,\ldots,X_n) \right) &= h(S_n) \end{align*} for $$h(x) := \mathbb{E}(1_B(x+X_{n+1}));$$ this is the standard factorization lemma for conditional expectations: if $X$ is independent of a $\sigma$-algebra $\mathcal{G}$, $Z$ is $\mathcal{G}$-measurable and $g$ is bounded and measurable, then $\mathbb{E}(g(Z,X) \mid \mathcal{G}) = h(Z)$ with $h(z) := \mathbb{E} g(z,X)$; here $g(z,x) := 1_B(z+x)$. Noting that $$h(x) = P_{X_{n+1}}(\{y; x+ y \in B\})$$ we obtain from $(4)$ that

\begin{align*} &\mathbb{P}(X_0 \in B_0,\ldots,X_n \in B_n, S_{n+1} \in B)\\ = {} & \int_{\{X_0 \in B_0\} \cap \ldots \cap \{X_n \in B_n\}} P_{X_{n+1}}(\{y; S_n(\omega)+y \in B\}) \, d\mathbb{P}(\omega). \end{align*}
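
Purely as a numerical sanity check (not part of the proof), here is a small Monte Carlo sketch, assuming $n = 1$ with $X_0, X_1, X_2$ i.i.d. standard normal, intervals for $B_0, B_1, B$, and NumPy/SciPy; it estimates the left-hand side of the first identity empirically and the right-hand side via the closed form $P_{X_2}(\{y; s + y \in B\}) = \Phi(b - s) - \Phi(a - s)$ used above.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(seed=0)
N = 10**6

# Hypothetical choices: X_0, X_1, X_2 i.i.d. standard normal; B_0, B_1, B intervals.
X0, X1, X2 = rng.standard_normal((3, N))
B0, B1, B = (-1.0, 2.0), (0.0, 1.5), (-0.5, 1.0)

def indicator(x, interval):
    """Indicator 1_{[a, b]}(x), evaluated elementwise."""
    a, b = interval
    return ((a <= x) & (x <= b)).astype(float)

# Left-hand side: P(X_0 in B_0, X_1 in B_1, S_2 in B), estimated empirically.
lhs = np.mean(indicator(X0, B0) * indicator(X1, B1) * indicator(X0 + X1 + X2, B))

# Right-hand side: integral over B_0 x B_1 of P_{X_2}({y : s + y in B}), s = x_0 + x_1,
# against P_{X_0} x P_{X_1}; for X_2 ~ N(0,1) the inner probability is Phi(b - s) - Phi(a - s).
s = X0 + X1
inner = norm.cdf(B[1] - s) - norm.cdf(B[0] - s)
rhs = np.mean(indicator(X0, B0) * indicator(X1, B1) * inner)

print(f"LHS (empirical joint probability): {lhs:.4f}")
print(f"RHS (integral representation):     {rhs:.4f}")
```

With $10^6$ samples the two estimates typically agree to about two or three decimal places, which is the expected Monte Carlo accuracy.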

saz
  • Incredibly good answer! I'm so grateful! Yes you are right about something off in the second equation: the domain of integration should indeed be $\{X_0 \in B_0 \times \ldots \times X_n \in B_n\}$. – MrFranzén May 13 '19 at 20:51
  • @MrFranzén I suppose you mean $\{X_0 \in B_0\} \cap \ldots \cap \{X_n \in B_n\}$. I have edited my answer, see the last part for the proof of the 2nd identity. – saz May 13 '19 at 21:47