I am trying to internalise the measure-theoretic definition of conditional expectation.
Consider a fair six-sided die. Formally, the probability space is $(\{1, 2, 3, 4, 5, 6\}, \mathcal{P}(\{1, 2, 3, 4, 5, 6\}), U)$ where $U$ is the discrete uniform distribution. Let the real-valued random variable be the identity map on the sample space, so that $X(\omega) = \omega$.
Byron Schmuland answered this question in a way that gives a lot of intuition. Suppose that after the die is rolled you will be told whether the value is odd or even. Then you should use a rule for the expectation that depends on the parity. However, I still don't see how to formalise his point.
Let the conditioning $\sigma$-field be $\mathcal{G} = \{\emptyset, \Omega, \{1, 3, 5\}, \{2, 4, 6\}\}$, as this includes the events that the value is odd or even. My question is: what is a full and formal description of $E(X \mid \mathcal{G})$?
Is it the following set function on $\mathcal{G}$? \begin{equation} E(X \mid \mathcal{G})(A) = \begin{cases} 0 & \text{if } A = \emptyset \\ 3.5 & \text{if } A = \Omega \\ 3 & \text{if } A = \{1, 3, 5\} \\ 4 & \text{if } A = \{2, 4, 6\} \end{cases} \end{equation}
In particular, I feel unsure about the cases $A = \emptyset$ and $A = \Omega$.
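To check my arithmetic for the conditional averages (the 3 and 4 above), here is a small sketch that treats $E(X \mid \mathcal{G})$ as a random variable, constant on each block of the partition $\{\{1,3,5\}, \{2,4,6\}\}$ generating $\mathcal{G}$, equal there to the probability-weighted average of $X$ over the block. This is just one way to organise the computation, not a claim about the right formalisation:

```python
from fractions import Fraction

# Sample space of a fair six-sided die, uniform probability on each outcome.
omega = [1, 2, 3, 4, 5, 6]
p = {w: Fraction(1, 6) for w in omega}

# X is the identity map on the sample space.
def X(w):
    return w

# G is generated by the partition into odd and even outcomes.
partition = [{1, 3, 5}, {2, 4, 6}]

def cond_exp(w):
    """E(X | G) evaluated at the outcome w: average X over the block
    of the partition containing w, weighted by P restricted to that block."""
    B = next(b for b in partition if w in b)
    pB = sum(p[v] for v in B)
    return sum(X(v) * p[v] for v in B) / pB

print({w: cond_exp(w) for w in omega})
# odd outcomes map to 3, even outcomes map to 4
```

Averaging this random variable over the whole space recovers $E(X) = 3.5$, consistent with the value I wrote for $A = \Omega$.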