It can be shown that nonnegative random variables $X$ and $Y$ have the same distribution provided that $\mathbb{E}[X^\alpha]=\mathbb{E}[Y^\alpha]$ is finite for all $\alpha\in(a,b]$, for some fixed $0\le a<b$.
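For a concrete illustration of the hypothesis, take $X$ lognormal, $X=e^{\mu+\sigma Z}$ with $Z$ standard normal (this is the standard computation in the second link below). Then every fractional moment is finite:
$$
\mathbb{E}[X^\alpha]=\mathbb{E}\big[e^{\alpha(\mu+\sigma Z)}\big]=e^{\alpha\mu+\alpha^2\sigma^2/2}\qquad\text{for all real }\alpha,
$$
so any nonnegative $Y$ whose moments match these values on some interval $(a,b]$ must itself be lognormal. By contrast, matching moments only at the integers $\alpha=1,2,\dots$ is famously not enough to pin down the lognormal (the subject of the first link below).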
Setting $U=\log X$ and $V=\log Y$ (defined on the events $\{X>0\}$ and $\{Y>0\}$, respectively), define the functions
$$
f(\alpha)=\mathbb{E}[1_{\{X > 0\}}e^{\alpha U}],\qquad g(\alpha)=\mathbb{E}[1_{\{Y > 0\}}e^{\alpha V}].
$$
These are defined for complex $\alpha$ with $0 < \Re[\alpha]< b$, as the integrands are bounded in absolute value by $\max(1,X^b)$ and $\max(1,Y^b)$, which are integrable since $\mathbb{E}[X^b]$ and $\mathbb{E}[Y^b]$ are finite by assumption. Furthermore, $f$ and $g$ are complex differentiable on this strip; a Morera-type argument is sketched below. For real $\alpha>0$ we have $\mathbb{E}[1_{\{X>0\}}X^\alpha]=\mathbb{E}[X^\alpha]$ (as $0^\alpha=0$), so by assumption $f$ and $g$ are equal for real $\alpha$ in $(a,b]$. Hence, by analytic continuation (the identity theorem), they are equal on the whole strip $0 < \Re[\alpha] < b$.
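One way to see the complex differentiability (sketched here as a standard Morera–Fubini argument): for any closed triangle $\gamma$ contained in the strip,
$$
\oint_\gamma f(\alpha)\,d\alpha=\mathbb{E}\Big[1_{\{X > 0\}}\oint_\gamma e^{\alpha U}\,d\alpha\Big]=0,
$$
where the interchange of integral and expectation is justified by Fubini's theorem together with the bound $\max(1,X^b)$ above, and the inner integral vanishes because $\alpha\mapsto e^{\alpha U}$ is entire. Morera's theorem then gives holomorphy of $f$, and likewise of $g$.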
Then, for any real $\omega$, dominated convergence (with the same dominating functions $\max(1,X^b)$ and $\max(1,Y^b)$) gives
\begin{align}
\mathbb{E}[1_{\{X > 0\}}e^{i\omega U}]&=\lim_{t\downarrow 0}f(t+i\omega)=\lim_{t\downarrow 0}g(t+i\omega)\\
&=\mathbb{E}[1_{\{Y > 0\}}e^{i\omega V}].
\end{align}
Taking $\omega=0$ shows that $\mathbb{P}(X>0)=\mathbb{P}(Y>0)$, so $X$ and $Y$ have the same probability of being zero. If this probability is one, then $X=Y=0$ almost surely and we are done. Otherwise, writing $\mathbb{E}[1_{\{X>0\}}e^{i\omega U}]=\mathbb{P}(X>0)\,\mathbb{E}[e^{i\omega U}\mid X>0]$ and likewise for $Y$, conditioning on $X$ and $Y$, respectively, being strictly positive shows that $U$ and $V$ have the same characteristic functions. Hence, they have the same distribution and, therefore, so do $X$ and $Y$.
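For a quick numerical sanity check of the boundary limit above, here is a minimal Monte Carlo sketch (assuming numpy; the parameters $\mu=0.3$, $\sigma=0.8$, $\omega=1.5$ are illustrative, not from the argument). It estimates $f(t+i\omega)=\mathbb{E}[X^{t+i\omega}]$ for a lognormal $X$ as $t\downarrow 0$ and compares with the exact characteristic function of $U=\log X\sim N(\mu,\sigma^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, omega = 0.3, 0.8, 1.5   # illustrative parameters

# X = e^U is lognormal, so X > 0 a.s. and the indicator is identically 1
U = rng.normal(mu, sigma, size=10**6)
X = np.exp(U)

# Monte Carlo estimates of f(t + i*omega) = E[X^(t + i*omega)] as t -> 0
for t in [0.5, 0.1, 0.01, 0.001]:
    print(f"t = {t:5}: f(t + i*omega) ~ {np.mean(X ** (t + 1j * omega)):.4f}")

# Exact limit: the characteristic function of U ~ N(mu, sigma^2) at omega
print("E[e^{i*omega*U}] =", np.exp(1j * omega * mu - 0.5 * (sigma * omega) ** 2))
```

As $t$ decreases, the estimates should settle near the exact value, up to Monte Carlo error.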
http://mathoverflow.net/questions/3525/when-are-probability-distributions-completely-determined-by-their-moments
http://math.stackexchange.com/questions/628681/how-to-compute-moments-of-log-normal-distribution