
I have recently been introduced to a method for finding the characteristic function of a random variable that arises as a transformation of other random variables. Say, for example, $X, Y \sim \mathcal{N}(0,1)$ and I am asked to find the characteristic function of $Z := XY$.

$\hat{P}_{Z}(t)=\hat{P}_{XY}(t)=\mathbb E[e^{itXY}]$, and this is where the key point comes in: I have to take the dependence on one of the random variables out of the equation.

$\mathbb E[e^{itXY}]=\mathbb E[\int_{\mathbb R} e^{itXy}1_{\{y=Y\}}dy]=\int_{\mathbb R}\mathbb E[e^{itXy}1_{\{y=Y\}}]dy$ and given the independence of $1_{\{y=Y\}}$ and $e^{itXy}$:

$\int_{\mathbb R}\mathbb E[e^{itXy}1_{\{y=Y\}}]dy=\int_{\mathbb R}\mathbb E[e^{itXy}]\mathbb E[1_{\{y=Y\}}]dy=\int_{\mathbb R}\mathbb E[e^{itXy}]P(Y=y)dy$

and $\mathbb E[e^{itXy}]=\exp(-\frac{(ty)^{2}}{2})$ and therefore

$\int_{\mathbb R}\mathbb E[e^{itXy}]P(Y=y)dy=\int_{\mathbb R} \exp(-\frac{(ty)^{2}}{2})P(Y=y)dy=\mathbb E[\exp(-\frac{t^2Y^{2}}{2})]$

Am I at least on the right track? When finding the characteristic function of a random variable, do I always need to use the above method of partitioning via $1_{\{Y=y\}}$? Any ideas on how to approach this in general are greatly appreciated.

MinaThuma

1 Answer


Presumably, you also assume that $X$ and $Y$ are independent. Then there are a few methods to derive the characteristic function; Methods 1 and 2 ultimately lead to the same formula via different ideas.

If $X \sim \mathcal N(0,1)$, then $\mathbb P(X\in \mathrm dx) = \frac 1{\sqrt{2\pi}} e^{-x^2/2} \mathrm dx$. Set $\varphi_{XY}(t) := \mathbb E e^{it \ XY}$ etc.

Method 1: Fubini-Tonelli

If $X$ and $Y$ are independent, then $\mathbb P_{(X,Y)} = \mathbb P_X \otimes \mathbb P_Y$ and $$\varphi_{XY}(t) = \mathbb E e^{it \ XY} = \int_{\mathbb R^2} e^{it \ xy}\,\mathbb P_{(X,Y)}(\mathrm dx,\mathrm dy) = \int_{\mathbb R^2}e^{it \ xy} \mathbb P(X \in \mathrm dx) \otimes \mathbb P(Y \in \mathrm dy).$$

By Fubini-Tonelli, \begin{align*} \varphi_{XY}(t) &=\int_{\mathbb R} \Big(\int_{\mathbb R} e^{it \ xy} \, \mathbb P(X \in \mathrm dx)\Big) \mathbb P(Y \in \mathrm dy)\\ &= \int_{\mathbb R} \mathbb E e^{it\ yX} \,\mathbb P(Y \in \mathrm dy)\\ &= \int_{\mathbb R} \varphi_X(ty)\, \mathbb P(Y \in \mathrm dy) = \mathbb E\varphi_X(tY). \end{align*}
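The relation $\varphi_{XY}(t) = \mathbb E\varphi_X(tY)$ can be checked numerically. A minimal sketch using Gauss-Hermite quadrature for the expectation over $Y$ (the node count and the test point $t = 1.5$ are ad hoc choices, and the comparison value $1/\sqrt{1+t^2}$ is the closed form derived later in this answer):

```python
import numpy as np

def cf_product_normal(t, n_nodes=80):
    """Approximate E[phi_X(t*Y)] for independent X, Y ~ N(0,1),
    where phi_X(s) = exp(-s**2 / 2), via Gauss-Hermite quadrature:
    E[g(Y)] = (1/sqrt(pi)) * sum_i w_i * g(sqrt(2) * x_i)."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    y = np.sqrt(2.0) * nodes
    phi_x = np.exp(-(t * y) ** 2 / 2.0)  # phi_X(t*y)
    return weights @ phi_x / np.sqrt(np.pi)

t = 1.5
approx = cf_product_normal(t)
exact = 1.0 / np.sqrt(1.0 + t ** 2)
```

Gauss-Hermite quadrature is a natural fit here because the weight $e^{-x^2}$ absorbs the Gaussian density after the substitution $y = \sqrt{2}\,x$.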

Method 2: Tower property of the conditional expectation

By the tower property, $\mathbb EX = \mathbb E(\mathbb E(X \mid Y))$. Since $X$ and $Y$ are independent, $\mathbb E\left(e^{it \ XY} \mid Y\right) = \varphi_X(tY)$, and hence $$\varphi_{XY}(t) = \mathbb E e^{it \ XY} = \mathbb E\mathbb E\left(e^{it \ XY} \mid Y\right) = \mathbb E\varphi_X(tY).$$

Using the established relation and $\mathbb Ee^{it \ X} = e^{-\frac{t^2}2}$, we obtain

$$\varphi_{XY}(t) = \mathbb E\varphi_X(tY) = \mathbb E e^{-\frac{t^2}2 Y^2} = \frac 1{\sqrt{2\pi}} \int_{\mathbb R} \underbrace{e^{-\frac{t^2}2 y^2} e^{-\frac{y^2}2}}_{= e^{-(1+t^2)y^2/2}} \mathrm dy = \frac 1{\sqrt{1+t^2}}.$$
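The closed form $1/\sqrt{1+t^2}$ is easy to sanity-check by Monte Carlo; a rough sketch (the sample size, seed, and test point $t = 2$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10 ** 6
x = rng.standard_normal(n)
y = rng.standard_normal(n)

t = 2.0
# By symmetry of Z = X*Y, the imaginary part E[sin(t*X*Y)] vanishes,
# so the characteristic function reduces to E[cos(t*X*Y)].
estimate = np.mean(np.cos(t * x * y))
exact = 1.0 / np.sqrt(1.0 + t ** 2)  # = 1/sqrt(5) here
```

With $10^6$ samples the standard error is on the order of $10^{-3}$, so the estimate should match the closed form to about two decimal places.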

Method 3: Direct calculation

If $X$ and $Y$ are independent, the joint density is the product of the marginal densities, $f_{X,Y}(x,y) = f_X(x)f_Y(y)$. By a direct calculation,

$$\mathbb Ee^{it \ XY} = \int_{\mathbb R}\int_{\mathbb R} e^{it \ xy} f_X(x)f_Y(y) \mathrm dy \mathrm dx = \quad...\quad = \frac 1{\sqrt{1+t^2}}$$
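The elided double integral can also be evaluated numerically. A minimal sketch on a truncated grid (the truncation range $[-8, 8]$, the grid step, and the test point $t = 1.5$ are ad hoc choices; the Gaussian tails beyond the truncation are negligible):

```python
import numpy as np

t = 1.5
step = 0.01
grid = np.arange(-8.0, 8.0, step)
x, y = np.meshgrid(grid, grid)

# Real part of e^{itxy} f_X(x) f_Y(y); the imaginary part
# integrates to zero by symmetry.
f = np.cos(t * x * y) * np.exp(-(x ** 2 + y ** 2) / 2) / (2 * np.pi)

# Riemann sum over the truncated plane.
integral = f.sum() * step * step
exact = 1.0 / np.sqrt(1.0 + t ** 2)
```

Because the integrand and all its derivatives are essentially zero at the truncation boundary, this simple Riemann sum is already very accurate.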

wueb