Let $X$ be a random variable with $\mathbb{E} X^2 = 1$. Let $X_1, \dots, X_N$ be i.i.d. copies of $X$ such that $$ \frac{1}{\sqrt{\sum_{i=1}^N X_i^2}} \left(X_1, \dots, X_N\right) $$ is uniformly distributed on $\mathbb{S}^{N-1}$. Prove that $X \sim \mathcal{N}(0,1)$.
-
Does your question mean that $X \sim N(0,1)$ for fixed $N$, or that $X$ converges in distribution to an $N(0,1)$ random variable as $N$ tends to infinity? – Nubres Feb 13 '17 at 17:08
-
It means $X \sim N(0,1)$ for fixed $N$. – Krishnan Mody Feb 13 '17 at 17:09
-
A similar question. http://math.stackexchange.com/questions/397116/uniform-distribution-on-the-n-sphere – Nubres Feb 13 '17 at 23:30
-
That's the converse of what I'm asking. – Krishnan Mody Feb 14 '17 at 00:55
2 Answers
(Six years late, but maybe someone will be interested still!)
First, we must assume that $\mathbb{P}(X_1 = 0) = 0$, so that the normalized vector is almost surely well-defined.
Bottom line: it is true for $N \ge 3$ but not for $N=1$ or $N=2$.
The answer in the case $N=1$ is that $X_1$ does not have to be normally distributed. Any distribution such that $\mathbb{P}(X_1 > 0) = \mathbb{P}(X_1 < 0) = 1/2$ and $\mathbb{E}(X_1^2) = 1$ will work.
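A quick numerical illustration of the $N=1$ case (a sketch; the Rademacher variable, sample size, and seed are my choices, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Rademacher counterexample: P(X = 1) = P(X = -1) = 1/2, so E(X^2) = 1,
# yet X is certainly not Gaussian.
x = rng.choice([-1.0, 1.0], size=100_000)

# For N = 1 the sphere S^0 is just {-1, +1}, and the normalized value
# X / sqrt(X^2) = sign(X) is uniform on it.
normalized = x / np.sqrt(x**2)
frac_positive = (normalized == 1.0).mean()
second_moment = (x**2).mean()
print(frac_positive, second_moment)
```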
The answer in the case $N=2$ is also that $X_1$ does not have to be normally distributed. All we need is that the distribution of the ratio $X_1/X_2$ is the same as the distribution of $Z_1/Z_2$, where $Z_1$ and $Z_2$ are independent $N(0,1)$ random variables. A simple attempt at constructing a different example is to take $X_1 = 1/Z_2$ and $X_2 = 1/Z_1$. Then $X_1$ and $X_2$ are independent and identically distributed, but unfortunately $E(X_1^2) = \infty$.
Lemma: Let $Y$ be a $\chi(3)$ random variable, with density $\sqrt{\frac{2}{\pi}}\, x^2 e^{-x^2/2}$ on $(0,\infty)$. (The $\chi(3)$ distribution is also known as the Maxwell-Boltzmann distribution.) Let $U$ be a $U([-1,1])$ random variable (uniform on the interval $[-1,1]$), where $U$ and $Y$ are independent. Then the product $UY$ has the $N(0,1)$ distribution.
Calculation-free proof: Let $Z_1$, $Z_2$, and $Z_3$ be independent $N(0,1)$ random variables. Define $Y := \sqrt{Z_1^2 + Z_2^2 + Z_3^2}$. Then $Y$ has the $\chi(3)$ distribution and $Y$ is independent of the normalized vector $(U_1, U_2, U_3) := \frac{1}{Y}(Z_1, Z_2, Z_3)$, which is uniformly distributed on the sphere $\mathbb{S}^2$. Moreover, each co-ordinate $U_i$ ($i = 1,2,3$) has the $U([-1,1])$ distribution. (This is a mathematical expression of the well-known fact that the earth has the same amount of surface area between any two parallel planes that are one meter apart and both intersect the earth.) So we have $Z_1 \sim N(0,1)$ and $Z_1 = U_1 Y$, where $U_1 \sim U([-1,1])$ and $Y \sim \chi(3)$ and $U_1$ and $Y$ are independent, so the lemma is proven.
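The lemma is also easy to check by simulation (a rough Monte Carlo sketch; the sample size, seed, and moment tolerances are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Y = ||(Z1, Z2, Z3)|| has the chi(3) (Maxwell-Boltzmann) distribution.
y = np.linalg.norm(rng.standard_normal((n, 3)), axis=1)
# Independent U([-1,1]).
u = rng.uniform(-1.0, 1.0, size=n)

w = u * y  # the lemma claims W ~ N(0,1)

# Compare the first few moments with those of N(0,1): 0, 1, and 3.
moments = (w.mean(), w.var(), (w**4).mean())
print(moments)
```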
Now let $U_1, U_2, Y_1, Y_2$ be independent random variables where $U_1, U_2 \sim U([-1,1])$ and $Y_1, Y_2 \sim \chi(3)$. Let $Z_1 = U_1 Y_1$ and $Z_2 = U_2 Y_2$. Then $Z_1$ and $Z_2$ are independent $N(0,1)$ random variables. Also, let $X_1 = \sqrt{3}U_1/Y_2$ and $X_2 = \sqrt{3}U_2/Y_1$, so that $X_1$ and $X_2$ are independent and identically distributed. We have $Z_1/Z_2 = (U_1 Y_1)/(U_2 Y_2) = X_1/X_2$. Since $E(U_1^2) = 1/3$, we have $$E(X_1^2) = 3E(U_1^2)E(1/Y_2^2) = E(1/Y_2^2) = \int_0^\infty \frac{1}{x^2} \sqrt{\frac{2}{\pi}}\, x^2 e^{-x^2/2} \,dx = 1.$$
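A Monte Carlo sanity check of this counterexample (a sketch; sample size and seed are mine, and since $X_1^2$ has infinite variance the second-moment estimate converges slowly, so the agreement is only rough):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# chi(3) samples via the norm of 3 standard normals; independent U([-1,1]).
y1 = np.linalg.norm(rng.standard_normal((n, 3)), axis=1)
y2 = np.linalg.norm(rng.standard_normal((n, 3)), axis=1)
u1 = rng.uniform(-1.0, 1.0, size=n)
u2 = rng.uniform(-1.0, 1.0, size=n)

x1 = np.sqrt(3.0) * u1 / y2
x2 = np.sqrt(3.0) * u2 / y1

# E(X_1^2) = 1, per the integral computed above.
second_moment = (x1**2).mean()

# X_1/X_2 = (U_1 Y_1)/(U_2 Y_2) = Z_1/Z_2 is standard Cauchy,
# whose CDF at 1 equals 3/4.
cdf_at_one = (x1 / x2 <= 1).mean()
print(second_moment, cdf_at_one)
```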
Remark: in this example the joint distribution of $(X_1, X_2)$ is not spherically symmetric (invariant under rotations), even though the vector $\frac{1}{\sqrt{X_1^2 + X_2^2}}(X_1,X_2)$ is uniform on the circle. In other words, the radius and the angle are not independent in this case (unlike the case where the coordinates have the $N(0,1)$ distribution).
This resolves the original question negatively in the case $N=2$. There are many other counterexamples. (For example, you could use the fact that the exp-normal distribution is infinitely divisible to produce an infinite family of counterexamples. This infinite divisibility is shown by Iosif Pinelis in a short note here: arXiv:1803.09838.)
Now suppose $N \ge 3$. We will prove that $X_1 \sim N(0,1)$, as proposed in the original question. The hypothesis that $\frac{1}{\sqrt{X_1^2 + \dots + X_N^2}}(X_1, \dots, X_N)$ is uniform on $\mathbb{S}^{N-1}$ implies that the distribution of $\frac{1}{\sqrt{X_1^2 + X_2^2 + X_3^2}}(X_1, X_2, X_3)$ is uniform on $\mathbb{S}^2$. (This is because almost surely $X_i \neq 0$ for all $i=1, \dots, N$, so normalizing, projecting to $\mathbb{R}^3$ and normalizing again has the same effect as projecting to $\mathbb{R}^3$ and normalizing.) So it suffices to solve the case $N=3$. By applying orthogonal projection onto the plane spanned by $(0,0,1)$ and $(\cos \theta, \sin \theta, 0)$, followed by another normalization, we see that the distribution of $\frac{(\cos \theta) X_1 + (\sin \theta) X_2}{X_3}$ does not depend on $\theta$. By considering the characteristic functions of the independent random variables $\log|(\cos \theta) X_1 + (\sin \theta) X_2|$ and $\log |X_3|$ (whose difference has a distribution that does not depend on $\theta$), we discover that the distribution of $(\cos \theta) X_1 + (\sin \theta) X_2$ does not depend on $\theta$. This implies that the characteristic function of $(X_1, X_2)$ is rotationally invariant, and Lévy's inversion formula for characteristic functions now implies that the distribution of $(X_1,X_2)$ is rotationally invariant.
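The projection-and-renormalize step can be illustrated numerically (a sketch; I take $N = 6$ and generate the uniform point on $\mathbb{S}^{N-1}$ from Gaussians, which is just a convenient way to sample the sphere, and check the projected point against Archimedes' hat-box characterization of uniformity on $\mathbb{S}^2$):

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 1_000_000, 6

# Uniform points on S^{N-1} via normalized Gaussian vectors.
v = rng.standard_normal((n, N))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Project onto the first three coordinates and renormalize.
w = v[:, :3]
w /= np.linalg.norm(w, axis=1, keepdims=True)

# If w is uniform on S^2, each coordinate is U([-1,1]) (Archimedes),
# so E[w_1] = 0, E[w_1^2] = 1/3, and P(w_1 <= 1/2) = 3/4.
coord = w[:, 0]
stats = (coord.mean(), (coord**2).mean(), (coord <= 0.5).mean())
print(stats)
```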
At this point we can follow the previous answer for the case $N=2$, which assumed rotational invariance. I will just explain how to make it rigorous. We know that the distribution of $(X_1, X_2)$ is a mixture of uniform distributions on the circles in $\mathbb{R}^2$ centered on $(0,0)$. Therefore $X_1$ is a mixture of continuous symmetric random variables whose density is continuous at $0$ and positive at $0$. Thus $X_1$ is a continuous symmetric random variable whose density is continuous at $0$ and positive at $0$. That is, $X_1$ has a density $h(x^2)$ where $\tilde{h}(y) = h(y)/h(0)$ is a function that is continuous at $0$ and positive at $0$. It follows that $\sqrt{X_1^2 + X_2^2}$ also has a density. Now the argument in the previous answer can be used to deduce that $X_1 \sim N(0,1)$. Along the way one can use the equation $$\tilde{h}(x_1^2 + x_2^2) = \tilde{h}(x_1^2) \tilde{h}(x_2^2)$$ to show that $\tilde{h}$ is strictly positive on all of $[0,\infty)$, and hence $x \mapsto \log \tilde{h}(x^2)$ is a well-defined function which satisfies the Cauchy functional equation and is continuous at $0$, so it must be linear.

This isn't super rigorous, but it's an old argument that is historically important (for instance, it was used in deriving the Maxwell-Boltzmann distribution, I think).
We know that the distribution of the vector $\vec X$ must be spherically symmetric, so $$ f_{\vec X}(x_1,\ldots,x_N) = g(x_1^2+\ldots +x_N^2).$$
We also know that the components of the vector are independent and that their distributions are symmetric, so that $$f_{\vec X}(x_1,\ldots,x_N) = h(x_1^2)h(x_2^2)\ldots h(x_N^2)$$ so we have a functional equation $$ g(x_1^2+\ldots +x_N^2) = h(x_1^2)\ldots h(x_N^2).$$
Letting $x_2=0,x_3=0\ldots x_N=0,$ we get $g(x_1^2) = h(x_1^2)h(0)^{N-1}$ so we can write $$ h(0)^{N-1}h(x_1^2+\ldots +x_N^2) = h(x_1^2)\ldots h(x_N^2)$$ or, letting $\tilde h(x) = h(x)/h(0),$ $$ \tilde h(x_1^2+\ldots+x_N^2) = \tilde h(x_1^2)\ldots \tilde h(x_N^2).$$
It is known that the only nice solution to the functional equation $f(x_1+\ldots +x_N) = f(x_1)\ldots f(x_N)$ is $f(x) = e^{ax}$, so we have $$ h(x^2) = h(0)e^{-bx^2}$$ and clearly $b$ must be positive for the distribution to be valid.
So finally, we have $$ f_{\vec X}(x_1,\ldots, x_N) = C e^{-b(x_1^2+\ldots+x_N^2)}.$$
So each $X_i$ is $N(0,\sigma^2)$ for some variance $\sigma^2.$ Since you have $E(X^2) = 1,$ it must be $N(0,1).$
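Spelling out that last normalization step: writing the marginal density of $X_1$ as $Ce^{-bx^2}$, the normalization and second-moment conditions give
$$ \int_{-\infty}^\infty C e^{-bx^2}\,dx = C\sqrt{\frac{\pi}{b}} = 1 \quad\Longrightarrow\quad C = \sqrt{\frac{b}{\pi}}, $$
$$ E(X_1^2) = \int_{-\infty}^\infty x^2 \sqrt{\frac{b}{\pi}}\, e^{-bx^2}\,dx = \frac{1}{2b} = 1 \quad\Longrightarrow\quad b = \frac{1}{2}, $$
so the marginal density is $\frac{1}{\sqrt{2\pi}} e^{-x^2/2}$, i.e. $X_i \sim N(0,1)$.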
