8

While reading about the average-case analysis of randomized quicksort, I came across the linearity of expectation of indicator random variables. I know what an indicator random variable is and what expectation is. What does linearity of expectation mean?

Geek
  • 2,349
  • 2
    I do not think it is a good idea to update your question adding new meaning to it, rather than clarifying the first version. Especially, after you got answers to the first version already. Especially, since you are not new to MSE. – SBF Aug 21 '12 at 10:06
  • @Ilya These two properties are used together to analyze the running time of algorithms. So I put them both in one place, and I think it makes sense like this. – Geek Aug 21 '12 at 10:07
  • 3
    @Geek: Please return to the original version of your question (the one to which you received two answers). If you have another question, ask it on a different post. – Did Aug 21 '12 at 10:08
  • @Geek your question turned into a new question. I have never heard about independence of expectation. Ilya is right in his comment. – Seyhmus Güngören Aug 21 '12 at 10:08
  • @did By popular demand, I reverted back to the original question. – Geek Aug 21 '12 at 10:10
  • @Geek: well, my comment was not about the content of your question - it was about the (preferred) way one asks questions on MSE: the question has to be well thought out, so that people who answer your question can rely upon this fact. If you get answers to the original question, it is not expected that you will update your question asking for more (though of course you can update it if asked to clarify the meaning of the original question). Please take care of this in the future. In the meantime, I updated my answer, where I mentioned how independence plays a role in products. – SBF Aug 21 '12 at 10:12

2 Answers

7

Let $\xi_1,\xi_2:\Omega\to\mathbb R$ be two random variables on the same probability space $(\Omega,\mathscr F,\mathsf P)$. The expectation of either is defined by $$ \mathsf E\xi_i:= \int_\Omega \xi_i(\omega)\mathsf P(\mathrm d\omega). $$ The linearity of the expectation means that for any constants $\alpha_1,\alpha_2\in\Bbb R$ it holds that $$ \mathsf E[\alpha_1\xi_1+\alpha_2\xi_2] = \alpha_1\mathsf E\xi_1+\alpha_2 \mathsf E\xi_2, $$ which follows directly from the linearity of the Lebesgue integral in the definition of the expectation. Hence, the functional $\mathsf E$ defined over the space of random variables on the probability space $(\Omega,\mathscr F,\mathsf P)$ is linear.
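As a quick numerical sanity check (not part of the original answer), linearity can be verified exactly on a small finite probability space; the sample points, masses, and values of $X$ and $Y$ below are made up for illustration:

```python
# Verify E[a*X + b*Y] = a*E[X] + b*E[Y] on a small finite probability space.
omega = [0, 1, 2]                 # sample points of Omega
probs = [0.5, 0.3, 0.2]          # P assigns these masses
X = {0: 1.0, 1: 4.0, 2: 2.0}     # X(omega)
Y = {0: 3.0, 1: 0.0, 2: 5.0}     # Y(omega)

def E(f):
    """Expectation of a random variable f: Omega -> R."""
    return sum(f[w] * p for w, p in zip(omega, probs))

a, b = 2.0, -3.0
Z = {w: a * X[w] + b * Y[w] for w in omega}   # the variable aX + bY
lhs = E(Z)                 # E[aX + bY]
rhs = a * E(X) + b * E(Y)  # aE[X] + bE[Y]
```

Here `lhs` and `rhs` agree (both equal $-3.3$), as linearity promises; note that no independence of $X$ and $Y$ is needed.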

As for products and independence: again, if $\xi_1,\xi_2,\dots,\xi_n$ are random variables on the same probability space as above, and they are mutually independent, then $$ \mathsf E\left[ \prod_{i=1}^n\xi_i\right] = \prod_{i=1}^n\mathsf E\xi_i. $$
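The product rule can also be checked numerically. In this sketch (the distributions are illustrative choices), $X$ and $Y$ are made independent by construction, since their joint law is the product of the marginals:

```python
from itertools import product

# Two marginal distributions; under independence the joint probability
# of (x, y) is P(X = x) * P(Y = y).
x_vals, x_probs = [1.0, 3.0], [0.4, 0.6]
y_vals, y_probs = [2.0, 5.0], [0.7, 0.3]

E_X = sum(v * p for v, p in zip(x_vals, x_probs))
E_Y = sum(v * p for v, p in zip(y_vals, y_probs))

# E[X*Y] computed from the product (joint) distribution
E_XY = sum(x * y * px * py
           for (x, px), (y, py) in product(zip(x_vals, x_probs),
                                           zip(y_vals, y_probs)))
```

`E_XY` equals `E_X * E_Y` here; for dependent variables this identity generally fails, while linearity of the sum always holds.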

SBF
  • 36,041
  • What does same probability space mean ? – Geek Aug 21 '12 at 10:18
  • Well, you may consider it as a technical condition saying that the notion of the expectation (of a sum/product of several random variables) and independence of random variables are well-defined. – SBF Aug 21 '12 at 10:21
4

The expectation is a linear operator, which means it satisfies the linearity property of a function/operator. Linearity is defined as

$$af_1(x_1)+bf_1(x_2)=f_1(ax_1+bx_2)$$

As an example with the expectation,

$$E\left[\frac{1}{N}\sum_iX_i\right]=\frac{1}{N}\sum_iE[X_i],$$ and if we assume that $E[X_i]=\mu$ for all $i$, then you get

$$\frac{1}{N}\sum_iE[X_i]=\mu$$
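A seeded Monte Carlo sketch of the sample-mean identity above (the Gaussian distribution and the parameters are illustrative choices, not from the answer):

```python
import random

# The sample mean (1/N) * sum_i X_i of i.i.d. draws with E[X_i] = mu has
# expectation mu, by linearity. Averaging the sample mean over many
# independent realizations should therefore land close to mu.
random.seed(0)
mu, N, trials = 2.0, 10, 20000

est = sum(
    sum(random.gauss(mu, 1.0) for _ in range(N)) / N   # one sample mean
    for _ in range(trials)
) / trials                                             # average over runs
```

With these settings `est` comes out within a few hundredths of `mu`; note that linearity gives this without any appeal to independence of the $X_i$.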

SBF
  • 36,041