
I am working on this question:

Let $X$ and $Y$ be $[0,1]$-valued random variables such that $E[X^{n}]=E[Y^{n}]$ for every integer $n\geq 0$. Show that $E[f(X)]=E[f(Y)]$ for every continuous function $f:[0,1]\longrightarrow\mathbb{R}$ and conclude that $X=_{d}Y$. (Hint: use the Weierstrass approximation theorem.)

Similar questions have been posted here: Show two random variables have same distribution

and here: Proof of if two random variables have the same distribution then they have the same moment generating function.

However, it seems that this question is asking me first to prove $E[f(X)]=E[f(Y)]$ for every continuous function, and then to use this fact to deduce that they have the same distribution.

The Weierstrass approximation theorem is as follows:

If $f$ is a continuous real-valued function on $[a,b]$ and $\epsilon>0$ is given, then there exists a polynomial $P$ on $[a,b]$ such that $|f(x)-P(x)|<\epsilon$ for all $x\in [a,b]$.

So should I first show that the $n^{th}$ moment of a random variable is always a polynomial? Even if I showed this, the argument seems backwards.

An answer with some more detail would be really appreciated, since I am quite lost...

Thank you!

Edit 1:

As the comment suggested, I should show $E[p(X)]=E[p(Y)]$ for every polynomial $p$. However, I am reading Durrett, and I cannot find anything relating the $n^{th}$ moments of a random variable to the expectation of a polynomial of it. I don't really know what to do here.

Also, as I pointed out in a comment on the first answer, if I know they have the same $n^{th}$ moment for each $n$, then expanding the moment generating function as a series immediately gives that they have the same moment generating function, and thus the same distribution. So why would I need to deduce this from $E[f(X)]=E[f(Y)]$?

  • First show that if $p$ is a polynomial then $E[p(X)] = E[p(Y)]$. Then think about why this helps you. – PhoemueX Oct 10 '19 at 17:22

1 Answer


Take an arbitrary polynomial $p(x) = c_n x^n + c_{n-1}x^{n-1}+ \dots + c_0$. Then, by linearity of expectation and the assumption that the moments agree,

$$ \mathbb{E}[p(X)] = c_n\mathbb{E}[X^n] + c_{n-1} \mathbb{E}[X^{n-1}] + \dots + c_0 = c_n\mathbb{E}[Y^n] + c_{n-1} \mathbb{E}[Y^{n-1}] + \dots + c_0 = \mathbb{E}[p(Y)]. $$
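
In more detail (this is the change-of-variables point raised in the comments below), one can write the expectation as an integral against the law $\mu_X$ of $X$:

$$ \mathbb{E}[p(X)] = \int_{[0,1]} p(x)\,\mu_X(dx) = \sum_{k=0}^{n} c_k \int_{[0,1]} x^k\,\mu_X(dx) = \sum_{k=0}^{n} c_k\,\mathbb{E}[X^k], $$

where the integral can be split term by term since each $x \mapsto x^k$ is bounded on $[0,1]$. The same computation for $Y$ gives $\mathbb{E}[p(Y)] = \sum_{k=0}^{n} c_k\,\mathbb{E}[Y^k]$, and the two sums coincide because $\mathbb{E}[X^k] = \mathbb{E}[Y^k]$ for every $k$.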

Now take any continuous function $f : [0,1] \to \mathbb{R}$ and, given $\epsilon > 0$, a corresponding polynomial $p$ from Weierstrass with $|f(x) - p(x)| < \epsilon$ for all $x \in [0,1]$. Then

$$ \left|\mathbb{E}[f(X)] - \mathbb{E}[f(Y)]\right| \leq \mathbb{E}\left[\left|f(X) - p(X)\right|\right] + \underbrace{\left|\mathbb{E}[p(X)] - \mathbb{E}[p(Y)]\right|}_{=\,0} + \mathbb{E}\left[\left|f(Y) - p(Y)\right|\right] \leq 2 \epsilon. $$

Since $\epsilon > 0$ was arbitrary, it follows that $\mathbb{E}[f(X)] = \mathbb{E}[f(Y)]$.

Now, since $X$ and $Y$ are bounded, their moment generating functions exist, and since $\mathbb{E}[f(X)] = \mathbb{E}[f(Y)]$ holds for every continuous $f$ (in particular for $f(x) = e^{tx}$), the moment generating functions are equal, and hence $X =^{d} Y$.
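
For the convergence point raised in the comments below, here is a sketch of why boundedness justifies the series expansion: since $0 \leq X \leq 1$, the partial sums $\sum_{n=0}^{N} \frac{t^n X^n}{n!}$ are dominated by the constant $e^{|t|}$, so expectation and summation can be interchanged, and

$$ M_X(t) = \mathbb{E}\left[\sum_{n=0}^{\infty} \frac{t^n X^n}{n!}\right] = \sum_{n=0}^{\infty} \frac{t^n\,\mathbb{E}[X^n]}{n!} = \sum_{n=0}^{\infty} \frac{t^n\,\mathbb{E}[Y^n]}{n!} = M_Y(t) \quad \text{for all } t \in \mathbb{R}. $$

Since the moment generating functions exist and agree on a neighbourhood of $0$, they determine the distribution, and $X =^{d} Y$ follows.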

Olba12
  • Well, if they have the same $n^{th}$ moment for each $n$, then by series expansion they have the same moment generating function, so they must have the same distribution; why do I need to take a detour to show these things? Also, I don't really know how to show $E[p(X)]=E[p(Y)]$, would you like to provide some details? – JacobsonRadical Oct 10 '19 at 18:17
  • @JacobsonRadical Yes the finite series are equal. But how can you be sure that they are equal for an infinite series? – Olba12 Oct 10 '19 at 18:27
  • Yes.. you are right... could you please provide more details of the proof? – JacobsonRadical Oct 10 '19 at 18:29
  • @JacobsonRadical Take an arbitrary polynomial. Then use linearity of the expected value and the fact that constants can be moved outside the expectation. – Olba12 Oct 10 '19 at 18:31
  • I don’t think this is correct, as per this answer: https://math.stackexchange.com/questions/1166637/do-moments-define-distributions – Ant Oct 10 '19 at 18:43
  • @Ant $X$ and $Y$ are bounded by 0 and 1. Thus $0 \leq E[e^{\rho |X|}] \leq e^{\rho}$ for any $\rho>0$, so (as per the comment in your link) we can do this. – Olba12 Oct 10 '19 at 18:52
  • Sure, I agree. But your answer doesn’t use that property, so the proof can’t be right – Ant Oct 10 '19 at 19:21
  • @Ant Not explicitly. But I use that $X$ and $Y$ are bounded in order to guarantee that the moment generating function exists. https://math.stackexchange.com/q/467453/229160 – Olba12 Oct 10 '19 at 19:42
  • I don't think we can directly use linearity, if we have $E[p(X)]$ then by the change of variable formula, we have $E[p(X)]=\int p(x)\mu(dx)$, in which $p(x)$ cannot be directly pulled out of the integral. – JacobsonRadical Oct 10 '19 at 20:19
  • oh! That's what you meant..Thank you! Just one question, if I don't want to use the moment generating function, could I set $f(x):=e^{itx}$ which is continuous on $x\in[0,1]$, so that $E[f(X)]=\phi_{X}(t)$ where $\phi_{X}(t)$ is the characteristic function, and then conclude they have the same distribution? – JacobsonRadical Oct 10 '19 at 20:45
  • @JacobsonRadical The characteristic function is a complex-valued function, and then you are not following the "description" in the problem. – Olba12 Oct 10 '19 at 20:49
  • So is using the moment generating function the only way? I am just not sure whether I need to prove that the moment generating function determines the distribution uniquely... which is a long proof. – JacobsonRadical Oct 10 '19 at 20:51
  • I will just use the moment generating function. Thank you so much :) – JacobsonRadical Oct 10 '19 at 20:56