I am learning about the Bayes optimal classifier, and there is a step in a proof that I struggle with. The proof can also be found on the Wikipedia page: https://en.wikipedia.org/wiki/Bayes_classifier#Proof_of_Optimality
My question already arises in part (a) of that proof. Let me give some definitions first:
Let $(X,Y)$ be a pair of random variables with values in $\mathbb{R}^d \times \{0,1\}$.
For $x$ in the support of $X$, let $\eta(x) = \mathbb{P}(Y=1\mid X=x)$, and let $h$ be a classifier, i.e. a function $h:\mathbb{R}^d\to\{0,1\}$. We further define the risk of a classifier as $R(h) := \mathbb{P}(h(X) \neq Y)$. Let me now state the proof (the same as on Wikipedia) and point out the part where I am struggling. For any classifier $h$ we have:
$$R(h) = \mathbb{P}(h(X) \neq Y) = \mathbb{E}_{X,Y}\left[\mathbb{1}_{h(X)\neq Y}\right] = \mathbb{E}_X\,\mathbb{E}_{Y|X}\left[\mathbb{1}_{h(X)\neq Y}\mid X=x\right],$$
where the last equality is the law of iterated expectations. Continuing, we get
$$\begin{aligned}
\mathbb{E}_X\,\mathbb{E}_{Y|X}\left[\mathbb{1}_{h(X)\neq Y}\mid X=x\right]
&= \mathbb{E}_X\left[\mathbb{P}(Y\neq h(X)\mid X=x)\right] \\
&= \mathbb{E}_X\left[\mathbb{1}_{h(X) = 0}\cdot\mathbb{P}(Y=1\mid X=x) + \mathbb{1}_{h(X) = 1}\cdot\mathbb{P}(Y=0\mid X=x)\right] \\
&= \mathbb{E}_X\left[\mathbb{1}_{h(X) = 0}\cdot \eta(x) + \mathbb{1}_{h(X) = 1}\cdot(1-\eta(x))\right].
\end{aligned}$$
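To convince myself that this final expression really equals the risk, I tried a small toy example of my own (it is not part of the Wikipedia proof): take $d=1$, let $X$ be uniform on $\{0,1\}\subset\mathbb{R}$, set $\eta(0)=0.2$ and $\eta(1)=0.7$, and take the classifier $h(x)=x$. Computing the risk directly gives
$$R(h) = \mathbb{P}(X=0)\,\mathbb{P}(Y=1\mid X=0) + \mathbb{P}(X=1)\,\mathbb{P}(Y=0\mid X=1) = \tfrac12\cdot 0.2 + \tfrac12\cdot 0.3 = 0.25,$$
and expanding the outer expectation of the final expression over the two values of $X$ (reading $\eta$ at the realized point) gives
$$\mathbb{P}(X=0)\left[\mathbb{1}_{h(0)=0}\,\eta(0) + \mathbb{1}_{h(0)=1}\,(1-\eta(0))\right] + \mathbb{P}(X=1)\left[\mathbb{1}_{h(1)=0}\,\eta(1) + \mathbb{1}_{h(1)=1}\,(1-\eta(1))\right] = \tfrac12\cdot 0.2 + \tfrac12\cdot 0.3 = 0.25,$$
so at least in this discrete case the two quantities agree.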
Now, the final expression $\mathbb{E}_X\left[\mathbb{1}_{h(X) = 0}\cdot \eta(x) + \mathbb{1}_{h(X) = 1}\cdot(1-\eta(x))\right]$ is pretty much what is written in the proof on Wikipedia (and also in the proof I have seen in class), except that the argument of the function $\eta$ is not $x$ but $X$; in words, it is not a point $x$ in the support of $X$ but the random variable itself. I wonder how this substitution is justified. Since I have not yet taken a class that deals with conditional expectations, there might be a fairly straightforward justification that I am simply not aware of. It is also possible that I have made a mistake in the computations above.
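Because of that, I also ran a quick numerical sanity check (entirely my own toy setup in NumPy, not taken from the course or from Wikipedia): I compare a Monte Carlo estimate of $R(h) = \mathbb{P}(h(X)\neq Y)$ with a Monte Carlo estimate of $\mathbb{E}_X\left[\mathbb{1}_{h(X)=0}\,\eta(X) + \mathbb{1}_{h(X)=1}\,(1-\eta(X))\right]$, i.e. the Wikipedia version with the random variable $X$ plugged into $\eta$. The functions `eta` and `h` below are arbitrary choices of mine, just to have something to simulate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

def eta(x):
    # my arbitrary choice of eta(x) = P(Y=1 | X=x)
    return 1.0 / (1.0 + np.exp(-2.0 * x))

def h(x):
    # an arbitrary (not necessarily optimal) classifier
    return (x > 0).astype(int)

x = rng.standard_normal(n)                      # sample X ~ N(0, 1), here d = 1
y = (rng.uniform(size=n) < eta(x)).astype(int)  # sample Y | X = x ~ Bernoulli(eta(x))

# direct Monte Carlo estimate of the risk R(h) = P(h(X) != Y)
risk_direct = np.mean(h(x) != y)

# Monte Carlo estimate of E_X[ 1{h(X)=0} * eta(X) + 1{h(X)=1} * (1 - eta(X)) ],
# i.e. the Wikipedia expression with the random variable X plugged into eta
risk_formula = np.mean(np.where(h(x) == 0, eta(x), 1.0 - eta(x)))

print(risk_direct, risk_formula)
```

The two printed values agree up to Monte Carlo error, so the formula with $\eta(X)$ does seem to give the same number as $R(h)$; my question is really only about how the passage from $x$ to $X$ is justified.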
I found a very thorough explanation of things that look similar in this thread (provided by @Stefan Hansen): https://math.stackexchange.com/a/498338/874549, but to me it is quite advanced, so it is hard to say whether it is actually what I am looking for.
If anyone sees a mistake, or has a somewhat elementary explanation, it would be very much appreciated!