Suppose we have a time series $X_t$ s.t. $X_t \sim^{iid} (0,1)$.
How do you prove that if $ X_t \sim^{iid} (0,1) $, then $ E(X_t^{2}X_{t-j}^{2}) = E(X_t^{2})E(X_{t-j}^{2})$?
Or, I guess, if $X,Y\sim^{iid} (0,1)$ (which implies $E(XY)=E(X)E(Y)$), why then is it that $E(X^2Y^2)=E(X^2)E(Y^2)$?
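(Before trying a proof, a quick numerical sanity check of the claim — a minimal sketch, assuming $X,Y$ iid standard normal as one concrete $(0,1)$ case; sample size and seed are arbitrary choices:)

```python
import random

random.seed(0)
n = 200_000

# Draw two iid N(0,1) samples (any iid pair with finite fourth moment would do).
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [random.gauss(0.0, 1.0) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

# Sample analogues of E(X^2 Y^2) and E(X^2) E(Y^2).
lhs = mean([a * a * b * b for a, b in zip(x, y)])
rhs = mean([a * a for a in x]) * mean([b * b for b in y])

print(lhs, rhs)  # both should be close to E(X^2)E(Y^2) = 1
```

The two estimates agree up to Monte Carlo error, consistent with $E(X^2Y^2)=E(X^2)E(Y^2)$.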
This spins off from another question, where it was claimed that "If the squares were dependent, there's a form of dependence among the unsquared values."
This makes sense, but how does one prove this exactly? My attempt:
Instead of proving dependence $\implies$ dependence (which I think would involve probability distributions), I try to prove the contrapositive for uncorrelatedness — that correlation in the squares implies correlation in the originals — as follows:
$E(X^2Y^2) \neq E(X^2)E(Y^2)$
$\implies E(X^2Y^2) \neq (Var(X)+E(X)^2)(Var(Y)+E(Y)^2)$
$\implies Var(XY)+E(XY)^2 \neq (Var(X)+E(X)^2)(Var(Y)+E(Y)^2)$
$\implies Var(XY)+(E(X)E(Y))^2 \neq (Var(X)+E(X)^2)(Var(Y)+E(Y)^2)$
$\implies Var(XY)+(E(X)E(Y))^2 \neq Var(X)Var(Y)+Var(X)E(Y)^2+Var(Y)E(X)^2+(E(X)E(Y))^2$
$\implies Var(XY) \neq Var(X)Var(Y)+Var(X)E(Y)^2+Var(Y)E(X)^2$
$\implies ...$
$\implies E(XY) \neq E(X)E(Y) \quad \text{QED}$
Ugh...
Additional question: The suggestion in the other question was to make use of the fact that dependence in the squares implies dependence in the originals, and then deduce, by contraposition, that independence of the originals implies independence of the squares. How does one prove that dependence in the squares implies dependence in the originals?
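(For what it's worth, here is a sketch of the standard fact this would contrapose — that independence is preserved under measurable transformations; this is a sketch of the usual measure-theoretic argument, not something from the linked question:)

$$
X \perp Y \;\Longrightarrow\; g(X) \perp h(Y) \quad \text{for any measurable } g, h,
$$
since for Borel sets $A, B$,
$$
P\big(g(X)\in A,\ h(Y)\in B\big)
= P\big(X\in g^{-1}(A),\ Y\in h^{-1}(B)\big)
= P\big(X\in g^{-1}(A)\big)\,P\big(Y\in h^{-1}(B)\big)
= P\big(g(X)\in A\big)\,P\big(h(Y)\in B\big).
$$
Taking $g(t)=h(t)=t^2$ gives $X^2 \perp Y^2$, hence $E(X^2Y^2)=E(X^2)E(Y^2)$; the contrapositive is exactly "dependence of $X^2, Y^2$ implies dependence of $X, Y$."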
– BCLC Jun 20 '14 at 12:00