1

Suppose we have a random variable (RV) $X$ defined on a measurable space $\mathcal{M} = (\Omega, \Sigma)$. Suppose we equip the measurable space with a probability measure $P$ and associated expectation operator $E$ such that for all $\theta \in \mathbb{R}$ we have $E[e^{\theta X}] < \infty$. Then, for all bounded, continuous $f : \mathbb{R} \rightarrow \mathbb{R}$ define $F_{\theta}$ as follows:

$$F_{\theta}(f) = \dfrac{E[f(X)e^{\theta X}]}{E[e^{\theta X}]}$$

Question: Show that there exists a probability measure $P_{\theta}$ defined on $\mathcal{M}$ such that for all bounded continuous functions $f$ the following identity holds:

$$F_{\theta}(f) = E_{\theta}[f(X)]$$

where $E_{\theta}[Y]$ is the expectation of any RV $Y$ w.r.t. $P_{\theta}$.
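
To make the operator $F_{\theta}$ concrete (purely as an illustration): if $X$ takes finitely many values $x_1, \dots, x_n$ with $P(X = x_i) = p_i$, then

$$F_{\theta}(f) = \sum_{i=1}^{n} f(x_i)\,\dfrac{p_i e^{\theta x_i}}{\sum_{j=1}^{n} p_j e^{\theta x_j}}$$

which has exactly the shape of an expectation of $f(X)$ under reweighted probabilities, and the question is whether such a reweighted measure $P_{\theta}$ exists on $\mathcal{M}$ in general.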


Background: Around 45 minutes into this video, the lecturer defines an operator $E_{\theta}$ w.r.t. a moment generating function and makes a brief argument as to why this operator is in fact the expectation of an implicitly defined probability measure $P_{\theta}$. I can't see how this conclusion can be made.

Note: If the above result is, or relies on, some known theorem, I'd be delighted if someone could point me in its direction.

Note: in expressions such as $f(X)$ and $e^{\theta X}$ we are composing the RV $X$ with continuous functions to build new RVs. Also, in the expression $f(X)e^{\theta X}$ we use both composition with $X$ and multiplication of two RVs to get a new RV.

Note: I'm not sure if maybe we need to restrict $\theta > 0$. I'm thinking about it...

Colm Bhandal
  • 4,649

2 Answers

1

Using the definition of expectation and the transfer theorem, we have that:

$$ F_{\theta}(f) := \frac{\mathbb{E}[f(X)e^{\theta X}]}{\mathbb{E}[e^{\theta X}]} = \frac{1}{\mathbb{E}[e^{\theta X}]} \int_{\omega \in \Omega} f(X(\omega))e^{\theta X(\omega)}d\mathbb{P}(\omega) \tag{1}$$

Similarly, we can compute $\mathbb{E}_{\theta}[f(X)]$:

$$ \mathbb{E}_{\theta}[f(X)] = \int_{\omega \in \Omega} f(X(\omega)) d\mathbb{P}_{\theta}(\omega) \tag{2} $$

If (1) and (2) are equal, we have the equality:

$$\int_{\omega \in \Omega} f(X(\omega)) d\mathbb{P}_{\theta}(\omega) = \int_{\omega \in \Omega} f(X(\omega))\frac{e^{\theta X(\omega)}d\mathbb{P}(\omega)}{\mathbb{E}[e^{\theta X}]} \tag{3} $$

A sufficient condition for equality (3) to hold is to have

$$ d\mathbb{P}_{\theta}(\omega) := \frac{e^{\theta X(\omega)}}{\mathbb{E}[e^{\theta X}]}d\mathbb{P}(\omega) \quad \forall \omega \in \Omega \tag{4} $$

Or, written differently:

$$ \mathbb{P}_{\theta}(A) := \int_{\omega \in A} \frac{e^{\theta X(\omega)}}{\mathbb{E}[e^{\theta X}]}d\mathbb{P}(\omega) \quad \forall A\in \Sigma \tag{4'}$$

Now if condition (4) is fulfilled, you can check that $\mathbb{P}_{\theta}$ defines a probability measure on $(\Omega,\Sigma) $ and that it is by construction such that $F_{\theta}(f) = \mathbb{E}_{\theta}[f(X)]$.
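
If it helps to see that check concretely, here is a minimal numerical sketch on a finite sample space (all numbers, and the choice of $f$, are made up purely for illustration):

```python
import numpy as np

# Hypothetical finite sample space Omega = {0, 1, 2, 3}; P and X are given by tables.
p = np.array([0.1, 0.4, 0.3, 0.2])      # P({omega}) for each omega
x = np.array([-1.0, 0.5, 2.0, 3.5])     # X(omega)
theta = 0.7
f = np.tanh                             # a bounded continuous f

mgf = np.sum(np.exp(theta * x) * p)     # E[e^{theta X}]
p_theta = np.exp(theta * x) * p / mgf   # P_theta({omega}), as in (4')

print(p_theta.sum())                               # 1.0: P_theta is a probability measure
lhs = np.sum(f(x) * p_theta)                       # E_theta[f(X)]
rhs = np.sum(f(x) * np.exp(theta * x) * p) / mgf   # F_theta(f)
print(np.isclose(lhs, rhs))                        # True
```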

Intuitively, what we want is to find a measure $\mathbb{P}_{\theta}$ under which the expectation satisfies $\mathbb{E}_{\theta}[f(X)] = \frac{\mathbb{E}[f(X)e^{\theta X}]}{\mathbb{E}[e^{\theta X}]} $. The naïve way one might try to find such a measure is to simply "pull out" the $e^{\theta X}$ from the expectation and set $``\mathbb{P}_{\theta} = \frac{e^{\theta X}}{\mathbb{E}[e^{\theta X}]} \times \mathbb{P}"$. The law of the unconscious statistician (which has quite a fitting name here) allows us to write that intuition more formally and shows that it is (somewhat) correct.
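
For a concrete instance of this reweighting (an illustration, not needed for the argument): if $X$ is standard normal under $\mathbb{P}$, then $\mathbb{E}[e^{\theta X}] = e^{\theta^2/2}$ and

$$F_{\theta}(f) = e^{-\theta^2/2}\int_{\mathbb{R}} f(x)\, e^{\theta x}\,\frac{e^{-x^2/2}}{\sqrt{2\pi}}\,dx = \int_{\mathbb{R}} f(x)\,\frac{e^{-(x-\theta)^2/2}}{\sqrt{2\pi}}\,dx$$

so under $\mathbb{P}_{\theta}$ the variable $X$ is $N(\theta, 1)$: reweighting by $e^{\theta X}$ simply shifts the mean by $\theta$.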

  • I sort of get the idea that you're putting across in this answer, and so far it seems to be similar to what the lecturer said. However, I'm a little perturbed by some of the details. You are integrating over $\Omega$ and then trying to define something in terms of $dP(\omega)$ where $\omega \in \Omega$. I think maybe in the application of LOTUS (https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician) we need to instead swap out for the distribution function over $\mathbb{R}$ rather than $\Omega$. Also terms like $dP(\omega)$ make me uneasy. I'll think about this... – Colm Bhandal Jan 27 '21 at 09:19
  • Yes sure, if the measurable space $\mathcal{M}$ is $(\mathbb{R},\mathcal{B}(\mathbb{R}))$, you can replace accordingly in the integral. If you don't like the $d\mathbb{P}_{\theta}$ notation, you can write that $$ \mathbb{P}_{\theta}(A) := \int_{\omega \in A} \frac{e^{\theta X(\omega)}}{\mathbb{E}[e^{\theta X}]}d\mathbb{P}(\omega) \quad \forall A\in \Sigma$$ Also note that the integral sign refers to a Lebesgue integral. – Stratos supports the strike Jan 27 '21 at 09:28
  • Yes, actually the reason I expressed concern over $dP$ is that we're assuming Lebesgue integration, which is always relative to a measure, so I don't like introducing an infinitesimal-looking term that suggests Riemann integration.

    I haven't fully digested the maths in your comment. It does look better to me at a first glance.

    However, note that the measurable space $\mathcal{M}$ itself cannot be assumed to be $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$. What I was getting at is that the pushforward measure of $X$ is defined on the Borel measurable space.

    – Colm Bhandal Jan 27 '21 at 13:38
  • Ah yes, it's coming back to me now from measure theory. The Radon-Nikodym theorem on Wikipedia (https://en.wikipedia.org/wiki/Radon%E2%80%93Nikodym_theorem) starts by asserting we can define a new measure given any nonnegative extended-real-valued measurable function. I'll dig into this, return to your answer, and then see if it all makes sense. – Colm Bhandal Jan 27 '21 at 14:35
  • OK I finally found the related MSE question that proves that the integral of a measurable function over a measurable set in fact induces a measure on the measurable space. I think this is the key piece of the puzzle: https://math.stackexchange.com/questions/3003719/measure-on-a-sigma-algebra-with-integral?rq=1. Thanks for pushing me in the right direction. – Colm Bhandal Jan 27 '21 at 18:19
  • No problem, I'm happy to help. Please ask if anything else is unclear – Stratos supports the strike Jan 27 '21 at 19:46
  • Thanks. I'm still digging right down to the foundations of measure theory and going down the rabbit hole. I'm not finished with this question or reviewing your answer yet, and probably won't be for a few more days. I might have to post my own answer referencing yours, as there are some aspects of rigour I would like to introduce. – Colm Bhandal Jan 28 '21 at 08:43
1

This answer and the subsequent determination of the answerer in the comments helped guide me towards the right solution. However, I still don't fully follow the use of LOTUS, which is unfamiliar to me, so I will rewrite the basic ideas of that answer in a way that I understand.

Given an arbitrary $\theta$ we are looking for an existential witness $P_{\theta}$ satisfying the condition in the question. With that motivation in mind, let's proceed. We start by defining $h_{\theta}$ as follows:

$$h_{\theta} = \dfrac{e^{\theta X}}{E[e^{\theta X}]}$$

Now, since $h_{\theta} \geq 0$ is a measurable function (it is a continuous function of the RV $X$ divided by the positive constant $E[e^{\theta X}]$), by this answer we have that $P_{\theta}$ as defined below is a measure:

$$P_{\theta}(A) = \int_A h_{\theta}\, dP \quad \forall A \in \Sigma$$

However, in order for $P_{\theta}$ to qualify as an existential witness to our problem, we must show that it is a probability measure, i.e. that $P_{\theta}(\Omega) = 1$. Well, using the definitions of $P_{\theta}$ and expectation we can reason as follows:

$$P_{\theta}(\Omega) = \int_{\Omega} h_{\theta} dP = \int \dfrac{e^{\theta X}}{E[e^{\theta X}]}dP = \dfrac{\int e^{\theta X}dP}{E[e^{\theta X}]} = \dfrac{E[ e^{\theta X}]}{E[e^{\theta X}]} = 1$$

OK, so now we have a probability measure $P_{\theta}$. Therefore it has an associated expectation $E_{\theta}$. Applying $E_{\theta}$ to the RV $f(X)$ and expanding out the definition of expectation we get:

$$E_{\theta}[f(X)] = \int f(X) dP_{\theta}$$

Now by the identity in this answer and expanding out the definition of $h_{\theta}$ we get:

$$\int f(X) dP_{\theta} = \int f(X) h_{\theta} dP = \int f(X) \dfrac{e^{\theta X}}{E[e^{\theta X}]} dP = \dfrac{\int f(X) e^{\theta X}dP}{E[e^{\theta X}]} = \dfrac{E[f(X) e^{\theta X}]}{E[e^{\theta X}]} = F_{\theta}(f)$$

This completes the proof.
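
As a final sanity check of the construction (an illustration only, not part of the proof): suppose $X \sim \mathrm{Poisson}(\lambda)$ under $P$, so that $E[e^{\theta X}] = e^{\lambda(e^{\theta} - 1)} < \infty$ for every $\theta$. Then for each $k \geq 0$,

$$P_{\theta}(X = k) = \dfrac{e^{\theta k}}{E[e^{\theta X}]}\, e^{-\lambda}\dfrac{\lambda^k}{k!} = e^{-\lambda e^{\theta}}\dfrac{(\lambda e^{\theta})^k}{k!}$$

so under $P_{\theta}$ the RV $X$ is again Poisson, with mean tilted from $\lambda$ to $\lambda e^{\theta}$.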

Colm Bhandal
  • 4,649