Let $X$, $Y$ and $Z$ be random variables. Does $Cov(X,Y) = 1$ and $Cov(Y,Z) = 1$ imply $Cov(X,Z) = 1$?
I'm trying to determine whether this is true, but I can't see how to attack the problem. Can someone give me a clue on how to tackle it?
In general it is not true. Consider the following example: let $Y=X+Z$. Then $Cov(X,Y)=Cov(X,X+Z)=Var(X)+Cov(X,Z)$, and likewise $Cov(Y,Z)=Cov(X+Z,Z)=Cov(X,Z)+Var(Z)$. Imposing the condition $Cov(X,Y)=Cov(Y,Z)=1$ gives $$Var(X)+Cov(X,Z)=Cov(X,Z)+Var(Z)\implies Var(X)=Var(Z).$$ To illustrate, suppose both variances equal $1$. Then from $Cov(Y,Z)=Cov(X,Z)+Var(Z)=1$ it follows that $Cov(X,Z)=1-Var(Z)=1-1=0$. So, even without any assumptions on the distributions, this construction satisfies $Cov(X,Y)=Cov(Y,Z)=1$ yet has $Cov(X,Z)=0$. This is one possible counterexample.
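A quick sanity check of this counterexample, computing the covariances exactly over a small discrete distribution (a hypothetical concrete choice: $X$ and $Z$ independent, each $\pm 1$ with probability $1/2$, so $Var(X)=Var(Z)=1$ and $Cov(X,Z)=0$):

```python
from itertools import product

# X and Z take values ±1 with probability 1/4 per joint outcome
# (i.e. independently, each ±1 with probability 1/2), and Y = X + Z.
outcomes = list(product([-1, 1], [-1, 1]))  # equally likely (x, z) pairs

def E(f):
    """Exact expectation of f(x, z) over the uniform joint distribution."""
    return sum(f(x, z) for x, z in outcomes) / len(outcomes)

def cov(f, g):
    """Cov(f, g) = E[fg] - E[f]E[g], computed exactly."""
    return E(lambda x, z: f(x, z) * g(x, z)) - E(f) * E(g)

X = lambda x, z: x
Z = lambda x, z: z
Y = lambda x, z: x + z

print(cov(X, Y), cov(Y, Z), cov(X, Z))  # → 1.0 1.0 0.0
```

As claimed, $Cov(X,Y)=Cov(Y,Z)=1$ while $Cov(X,Z)=0$.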
It does not. Let $Y=X+Z$, where $X$ and $Z$ are independent and identically distributed with unit variance. Then $Cov(X,Y)=Var(X)=1$ and $Cov(Y,Z)=Var(Z)=1$, but $Cov(X,Z)=0$ by construction.
If you replace covariances with correlations, then $$ \operatorname{Corr}(X,Y)=\operatorname{Corr}(Y,Z)=1 \Rightarrow\operatorname{Corr}(X,Z)=1. $$ Indeed, we can write $X=a+bY$ and $Y=c+dZ$ a.s., where $b=\sigma_X/\sigma_Y>0$ and $d=\sigma_Y/\sigma_Z>0$. Thus, $$ X=(a+bc)+(bd)Z \quad\text{a.s.}, $$ and since $bd>0$, $\operatorname{Corr}(X,Z)=1$. In particular, your assertion is true if $\sigma_X=\sigma_Y=\sigma_Z=1$, since then each covariance coincides with the corresponding correlation.
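A numerical illustration of the correlation statement, using arbitrarily chosen coefficients (here $c=3$, $d=2$, $a=-1$, $b=0.5$; any positive slopes $b,d$ work):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=1000)   # any nondegenerate Z works
Y = 3.0 + 2.0 * Z           # Y = c + dZ with d > 0, so Corr(Y, Z) = 1
X = -1.0 + 0.5 * Y          # X = a + bY with b > 0, so Corr(X, Y) = 1

def corr(u, v):
    return np.corrcoef(u, v)[0, 1]

# All three correlations are 1 (up to floating-point error):
print(corr(X, Y), corr(Y, Z), corr(X, Z))
```

Perfect positive correlation is transitive because composing two increasing affine maps gives another increasing affine map, exactly as in the algebra above.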