Suppose X is a continuous random variable taking values on the whole real line. Furthermore, let A be the random variable obtained by restricting X to the event X < 0 (i.e., X conditional on being negative), and let B be X restricted to the event X > 0.
Is there a general relationship between variance(X), variance(A), and variance(B)?
UPDATE: Siong and Canardini give the same answer. Unfortunately, my simulation does not agree with it, and I cannot see where my mistake is:
In the following I draw 10 numbers (MATLAB, randn) and try to implement the given answers. What am I doing wrong?
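As a sanity check, the experiment can be sketched in Python/NumPy (an equivalent of the MATLAB randn draw; I am assuming the relationship the answers propose is the law of total variance conditioned on the sign of X). Note that with only 10 draws the conditional variances are extremely noisy, so a much larger sample is used here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Large sample instead of 10 draws: with only 10 numbers the
# conditional pieces A and B contain only a handful of points.
x = rng.standard_normal(1_000_000)   # analogue of MATLAB's randn

a = x[x < 0]   # realizations of A (X below 0)
b = x[x > 0]   # realizations of B (X above 0)

p = a.size / x.size   # estimate of P(X < 0)
q = b.size / x.size   # estimate of P(X > 0)

# Law of total variance, conditioning on the sign of X
# (assumption: this is the decomposition the answers refer to):
#   Var(X) = p*Var(A) + q*Var(B) + p*q*(E[A] - E[B])^2
lhs = x.var()   # ddof=0 on both sides, so the identity is exact in-sample
rhs = p * a.var() + q * b.var() + p * q * (a.mean() - b.mean()) ** 2
print(lhs, rhs)
```

One common pitfall this sketch avoids: Var(X) is not simply Var(A) + Var(B); the between-group term p*q*(E[A] - E[B])^2 is needed, and mixing population (1/n) and sample (1/(n-1)) variance normalizations will also make the two sides disagree.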