
Let $X, Y$ be two non-negative random variables satisfying the condition $\mathbb{E}[X^\alpha] = \mathbb{E}[Y^\alpha]$ for all $\alpha \in (0, 1/2)$.

How can one show that $X$ and $Y$ are equal in distribution?

Edit: (only if you find this helpful) $\mathbb{E}[X], \mathbb{E}[Y]$ also exist, but a priori one does not know whether they are equal or not.

If you believe that the claim is wrong, I would also be happy to see counterexamples, or at least some intuitive explanations.

  • $X$ and $Y$ are identically distributed if they have the same mean and variance. This may help. – Frey Apr 17 '15 at 03:29
  • I do not assume the existence of the first two moments. And in fact I don't think your claim is right. – random_person Apr 17 '15 at 03:33
  • @Frey: X ~ EXP(mean = 1) has E(X) = V(X) = 1 and Y ~ NORM(1, 1) has E(Y) = V(Y) = 1. They are hardly the same distribution. Maybe you're thinking this is true within the same parametric family. – BruceET Apr 17 '15 at 04:10
  • @random_person, out of curiosity, why do you believe this might be true? Seen it somewhere? – ki3i Apr 17 '15 at 16:40
  • @random_person Have you tried looking at the lognormal distribution and the perturbed version:

    http://mathoverflow.net/questions/3525/when-are-probability-distributions-completely-determined-by-their-moments

    http://math.stackexchange.com/questions/628681/how-to-compute-moments-of-log-normal-distribution

    – Ilham Apr 17 '15 at 22:27
  • @Ilham I know that moments do not necessarily characterise a distribution uniquely, and therefore there are conditions by Riesz, Carleman, etc. But here the number of moment conditions I have is uncountable (alpha lies in an interval), unlike classical problems where you are always given the n-th moment where n is a natural number. (Do you think in the lognormal counterexample, the moments still match when alpha is, say, 0.2? See the quick check after these comments.) – random_person Apr 18 '15 at 01:38
  • @ki3i (I mistakenly deleted my comment, so let me put it here again) 30% intuition (Dirac deltas are dense and I have uncountably many 'linearly independent' moment conditions) + 70% necessity (this claim comes from another claim about finite-dimensional distributions of two random measures and I don't want to put it here.) – random_person Apr 18 '15 at 01:41
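To address the question in the comment two above, here is a quick check of the lognormal example from the linked posts (my own illustration, not part of the original thread). If $f$ is the standard lognormal density and $g(x)=f(x)\bigl(1+\sin(2\pi\ln x)\bigr)$ is the classical perturbation with the same integer moments, then substituting $x=e^u$ gives $$\int_0^\infty x^\alpha\,(g-f)(x)\,dx=\operatorname{Im}\,\mathbb{E}\bigl[e^{(\alpha+2\pi i)Z}\bigr]=e^{(\alpha^2-4\pi^2)/2}\sin(2\pi\alpha),\qquad Z\sim N(0,1),$$ which vanishes at every integer $\alpha$ but not at fractional $\alpha$ (where it is merely astronomically small):

    import numpy as np

    # Gap between fractional moments of the standard lognormal density f and
    # the perturbation g(x) = f(x) * (1 + sin(2*pi*log(x))):
    #   E_g[X^a] - E_f[X^a] = exp((a**2 - 4*pi**2) / 2) * sin(2*pi*a)
    def moment_gap(a):
        return np.exp((a**2 - 4 * np.pi**2) / 2) * np.sin(2 * np.pi * a)

    for a in [1.0, 2.0, 3.0, 0.2, 0.4]:
        print(f"alpha = {a}: gap = {moment_gap(a):.3e}")

    # Integer alpha: gap is 0 (up to floating-point error in sin(2*pi*a)),
    # so all integer moments agree.  alpha = 0.2: gap is about 2.6e-09, so
    # the fractional moments do differ -- consistent with the claim below that
    # matching E[X^alpha] on an interval pins down the distribution.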

1 Answer


It can be shown that nonnegative random variables $X$ and $Y$ have the same distribution so long as $\mathbb{E}[X^\alpha]=\mathbb{E}[Y^\alpha]$ is finite for all $\alpha\in(a,b]$, any $0\le a <b$.

Setting $U=\log X$ and $V=\log Y$, define the functions $$ f(\alpha)=\mathbb{E}[1_{\{X > 0\}}e^{\alpha U}],\qquad g(\alpha)=\mathbb{E}[1_{\{Y > 0\}}e^{\alpha V}]. $$

These are defined for complex $\alpha$ with $0 < \Re[\alpha]< b$, as the terms inside the expectations are bounded in absolute value by $\max(1,X^b)$ and $\max(1,Y^b)$, which are integrable by assumption. Furthermore, they are complex differentiable on this strip (a sketch is given below). For real $\alpha>0$ we have $1_{\{X>0\}}X^\alpha=X^\alpha$, so the hypothesis gives $f(\alpha)=g(\alpha)$ for $\alpha\in(a,b)$. Hence, by analytic continuation, they are equal on the whole domain $0 < \Re[\alpha] < b$.
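(One way to see the complex differentiability, along the Fubini–Morera lines mentioned in the comments below: for any closed triangle $\gamma$ contained in the strip $0<\Re[\alpha]<b$, $$\oint_\gamma f(\alpha)\,d\alpha=\mathbb{E}\Bigl[1_{\{X>0\}}\oint_\gamma e^{\alpha U}\,d\alpha\Bigr]=0,$$ where the interchange of integrals is justified by the bound $|e^{\alpha U}|\le\max(1,X^b)$ and the inner integral vanishes by Cauchy's theorem; Morera's theorem then gives holomorphy of $f$, and similarly for $g$.)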

Then, for any real $\omega$, dominated convergence gives, \begin{align} \mathbb{E}[1_{\{X > 0\}}e^{i\omega U}]&=\lim_{t\downarrow 0}f(t+i\omega)=\lim_{t\downarrow 0}g(t+i\omega)\\ &=\mathbb{E}[1_{\{Y > 0\}}e^{i\omega V}]. \end{align} Taking $\omega=0$ shows that $\mathbb{P}(X>0)=\mathbb{P}(Y>0)$, i.e., $X$ and $Y$ have the same probability of being zero. Then, conditioning on $X$ and $Y$, respectively, being strictly positive, we see that $U$ and $V$ have the same characteristic functions. Hence, they have the same distribution and, therefore, so do $X$ and $Y$.
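As a purely numerical illustration of the limiting step (my own sketch, with a standard lognormal chosen only for convenience, since then $U\sim N(0,1)$ and everything has a closed form):

    import numpy as np

    rng = np.random.default_rng(0)
    u = rng.standard_normal(10**6)      # U = log X ~ N(0,1), i.e. X lognormal
    x = np.exp(u)

    omega = 1.3
    for t in [0.5, 0.1, 0.01]:
        # f(t + i*omega) = E[X^(t + i*omega)]; exact value exp((t + 1j*omega)**2 / 2)
        print(t, np.mean(x ** (t + 1j * omega)))

    # Characteristic function of U at omega; exact value exp(-omega**2 / 2):
    print(np.mean(np.exp(1j * omega * u)), np.exp(-omega**2 / 2))
    # As t -> 0 the complex fractional moments approach E[exp(i*omega*U)],
    # which is the dominated-convergence step used above.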

  • Thank you very much. In fact I was unaware that the moment generating functions were holomorphic until someone reminded me that I could use Fubini and Morera. – random_person Apr 28 '15 at 00:24
  • Nice proof. +1. 1) You said "Taking ω=0 shows that X and Y have the same probability of being zero." Should it not be "... of being positive"? More importantly, why do we need to check this condition? 2) It seems it is not necessary to take $a=0$ and the limit $t\downarrow 0$. We can treat $e^{tU}dP(U)$ and $e^{tV}dQ(V)$, where $P$ and $Q$ are the probability measures, as the functions to be Fourier inverted and conclude $P=Q$ (a sketch of this is spelled out below). – Hans Nov 21 '18 at 03:52
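Expanding slightly on the alternative route in the last comment (a sketch, using only Fourier uniqueness for finite measures): fix any $t\in(a,b)$ and set $\mu(du)=e^{tu}\,\mathbb{P}(U\in du,\,X>0)$ and $\nu(dv)=e^{tv}\,\mathbb{P}(V\in dv,\,Y>0)$, which are finite measures with total mass $\mathbb{E}[X^t]=\mathbb{E}[Y^t]$. Their Fourier transforms are $\hat\mu(\omega)=f(t+i\omega)$ and $\hat\nu(\omega)=g(t+i\omega)$, which agree for all real $\omega$ by the analytic continuation above, so $\mu=\nu$. Since $e^{tu}>0$, this gives $\mathbb{P}(U\in du,\,X>0)=\mathbb{P}(V\in du,\,Y>0)$, and integrating $e^{-tu}$ against $\mu=\nu$ recovers $\mathbb{P}(X>0)=\mathbb{P}(Y>0)$ as well, so indeed no limit $t\downarrow 0$ is needed.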