It is obvious that if $\vec{X} = (X_1, X_2, \ldots, X_n)^T$ is a vector of independent random variables, and their marginal characteristic functions and joint characteristic function exist, then these are related by $$ E e^{i\sum_{j=1}^n t_j X_j} = \prod_{j=1}^n E e^{i t_j X_j}, $$ but is the converse true? That is, if such a factorization exists, are the variables independent? My professor says so, but I cannot find a proof of it, neither in my textbooks nor online. Is it true? Is it true for normal random variables? If it is true, can you provide a reference or a proof?

Mikkel Rev
- See https://en.wikipedia.org/wiki/Subindependence – kjetil b halvorsen May 21 '18 at 20:43
- @kjetilbhalvorsen Marius is not looking at the sum, but rather the joint distribution... – Elle Najt May 21 '18 at 20:55
- @kjetilbhalvorsen Oh that's interesting. Apparently this property of MGFs doesn't apply to CFs, I guess. – BCLC May 26 '18 at 17:02
- @kjetilbhalvorsen, I think there's a difference between subindependence and the formula in the question: it's not one variable $t$ for all $X_i$, but separate $t_i$. According to this question the professor was right: https://math.stackexchange.com/questions/287138/moment-generating-functions-characteristic-functions-of-x-y-factor-implies-x?rq=1 – dasWesen Feb 28 '21 at 15:45
1 Answer
Try computing what the characteristic function of the product of the marginal distributions would be (that is, treat the $x_i$ in your formula as independent random variables with those marginals). You will find that it factors as a product in exactly the same way. Then apply the theorem that a characteristic function uniquely determines the distribution.
(The characteristic function is essentially the Fourier transform, so you may find it easier to locate a statement of the form: if two probability distributions have the same Fourier transform, they are equal. This is proven in Durrett.)
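Spelled out for two variables, the uniqueness argument sketched above reads (here $\mu_X$, $\mu_Y$ denote the marginal laws and $\mu_{(X,Y)}$ the joint law):

$$
\phi_X(a)\,\phi_Y(b)
= \int e^{iax}\,d\mu_X(x) \int e^{iby}\,d\mu_Y(y)
= \int e^{i(ax+by)}\,d(\mu_X \otimes \mu_Y)(x,y),
$$

which is exactly the characteristic function of the product measure $\mu_X \otimes \mu_Y$. So if the joint characteristic function factors, it coincides with the characteristic function of $\mu_X \otimes \mu_Y$, and the uniqueness theorem gives $\mu_{(X,Y)} = \mu_X \otimes \mu_Y$, i.e. independence.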

Elle Najt
- Thanks for your interest in this question. I do not understand what you mean. Is it possible to write up a couple of formulas to go with your answer? – Mikkel Rev May 21 '18 at 20:08
- @MariusJonsson As a start, can you show that if $X$ and $Y$ are independent then $\phi_{(X,Y)}(a,b) = \phi_X(a)\phi_Y(b)$? Here $\phi$ denotes the characteristic function. – Elle Najt May 21 '18 at 21:36
- Yes, I call this obvious in my question: since $X, Y$ are independent and $\exp$ is continuous, $\phi_{(X,Y)}(a,b) = E[e^{i(Xa + Yb)}] = E[e^{iXa}e^{iYb}] = E[e^{iXa}]E[e^{iYb}] = \phi_X(a)\phi_Y(b)$. – Mikkel Rev May 22 '18 at 15:38
- @MariusJonsson Okay, that's really the only computation/formula here. The rest follows from the theorem that says there is a bijection between probability distributions and their characteristic functions. – Elle Najt May 22 '18 at 16:08
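For the normal case raised in the question, the factorization can also be checked numerically. A minimal Monte Carlo sketch (the sample size, seed, and test points $a, b$ are arbitrary choices of mine): for independent standard normals, the empirical joint characteristic function should agree with the product of the empirical marginal ones up to sampling error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent standard normal samples
X = rng.normal(size=n)
Y = rng.normal(size=n)

# Evaluate the characteristic functions at one (arbitrary) point (a, b)
a, b = 0.7, -1.3
joint = np.mean(np.exp(1j * (a * X + b * Y)))          # estimate of E e^{i(aX + bY)}
product = np.mean(np.exp(1j * a * X)) * np.mean(np.exp(1j * b * Y))

# For independent X, Y the joint CF equals the product of the marginal CFs,
# so the two estimates should agree up to Monte Carlo error (~ 1/sqrt(n)).
print(abs(joint - product))
```

For standard normals the exact value is $e^{-(a^2+b^2)/2}$, so both estimates can also be compared against that closed form.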