Let $\epsilon\equiv(\epsilon_1,\dots, \epsilon_J)$ be a random vector. Let $F$ be the probability distribution of $\epsilon$.
Assumption 1: $F$ has full support on $\mathbb{R}^J$.
Assumption 2: $F$ has marginals that are identical and symmetric around zero.
Fix a vector of reals $U\equiv (U_1,\dots, U_J)\in [a,b]^J$ with $-\infty<a\le b<\infty$.
Consider the quantity $$ \mathbb{E}(\epsilon_{X^*}) $$ where $$ X^*\equiv \operatorname{argmax}_{j\in \{1,\dots,J\}}\big(U_j+ \epsilon_{j}\big). $$
Note that under Assumption 1, $X^*$ is unique with probability 1. Assumption 1 will be maintained throughout.
Claim 0: Under Assumptions 1 and 2 (and without further assumptions on $F$), $\mathbb{E}(\epsilon_{X^*})$ need not be finite; that is, there exist distributions $F$ satisfying both assumptions for which $|\mathbb{E}(\epsilon_{X^*})|=\infty$.
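Here is a quick Monte Carlo sketch of the kind of counterexample I have in mind (the standard Cauchy marginal and the particular utilities `U` are illustrative choices of mine): i.i.d. standard Cauchy errors are symmetric around zero with full support on $\mathbb{R}^J$, so Assumptions 1 and 2 hold, yet the simulated $\epsilon_{X^*}$ displays the wild sample means typical of an infinite expectation.

```python
# Monte Carlo sketch of Claim 0 (utilities and sample sizes are illustrative).
# Standard Cauchy errors are i.i.d., symmetric around zero, and give F full
# support on R^J, so Assumptions 1 and 2 hold -- yet the selected error
# inherits the Cauchy's heavy right tail.
import numpy as np

rng = np.random.default_rng(0)
U = np.array([0.0, 0.5, 1.0])                 # any bounded utilities
J, n_sims = U.size, 100_000

eps = rng.standard_cauchy(size=(n_sims, J))   # symmetric, full support, no mean
x_star = np.argmax(U + eps, axis=1)           # argmax index, draw by draw
eps_sel = eps[np.arange(n_sims), x_star]      # realizations of eps_{X*}

# The sample mean of eps_sel does not settle down: it is dominated by a few
# enormous selected errors, the signature of a non-finite expectation.
print("sample mean of eps_{X*}:", eps_sel.mean())
print("largest selected error :", eps_sel.max())
```

Re-running with larger `n_sims` makes the instability of the sample mean more apparent rather than less, as one would expect when the expectation is infinite.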
QUESTION: I want to show Claim 0. Could you help?
SUB-QUESTION: Suppose we replace Assumption 2 with Assumption 3.
Assumption 3: $\epsilon_1,\dots, \epsilon_J$ are mutually independent and identically distributed; specifically, for $j=1,\dots,J$, $\epsilon_j$ has a Gumbel distribution with location 0 and scale 1.
Under Assumption 3 (which also implies Assumption 1), we know that $ |\mathbb{E}(\epsilon_{X^*})|<\infty$.
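In the Gumbel case the expectation can even be computed in closed form from the standard logit results $\mathbb{E}\big[\max_j (U_j+\epsilon_j)\big]=\log\sum_k e^{U_k}+\gamma$ (with $\gamma$ the Euler–Mascheroni constant) and $\mathbb{E}(U_{X^*})=\sum_j P_j U_j$ for the logit probabilities $P_j$, which give $\mathbb{E}(\epsilon_{X^*})=\log\sum_k e^{U_k}+\gamma-\sum_j P_j U_j$. A simulation check (again with illustrative utilities `U`):

```python
# Simulation check of the Gumbel case (the utilities U are illustrative).
# Known closed forms under i.i.d. Gumbel(location 0, scale 1) errors:
#   E[max_j (U_j + eps_j)] = log(sum_k exp(U_k)) + gamma,
#   E[eps_{X*}]            = log(sum_k exp(U_k)) + gamma - sum_j P_j U_j,
# where P_j = exp(U_j) / sum_k exp(U_k) and gamma is Euler-Mascheroni.
import numpy as np

GAMMA = 0.5772156649015329            # Euler-Mascheroni constant
rng = np.random.default_rng(1)
U = np.array([0.0, 0.5, 1.0])
n_sims = 200_000

eps = rng.gumbel(loc=0.0, scale=1.0, size=(n_sims, U.size))
x_star = np.argmax(U + eps, axis=1)
eps_sel = eps[np.arange(n_sims), x_star]

logsumexp = np.log(np.exp(U).sum())
P = np.exp(U) / np.exp(U).sum()       # logit choice probabilities
print("simulated E[eps_{X*}]:", eps_sel.mean())
print("closed form          :", logsumexp + GAMMA - P @ U)
```

The two numbers agree up to Monte Carlo error, confirming finiteness under Assumption 3.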
Therefore, I wonder which assumption is actually needed to ensure that $|\mathbb{E}(\epsilon_{X^*})|<\infty$ (while maintaining the full-support condition). My prior is that we need some restriction on the tails of $F$. Is this correct? Could you elaborate?
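A crude bound that I sketched myself (not taken from a reference) suggests a first-moment condition on the marginals is what matters. Since $|\epsilon_{X^*}|\le \max_j |\epsilon_j|\le \sum_{j=1}^J |\epsilon_j|$ pointwise,
$$ \big|\mathbb{E}(\epsilon_{X^*})\big| \;\le\; \mathbb{E}\Big(\max_{j}|\epsilon_j|\Big) \;\le\; \sum_{j=1}^{J}\mathbb{E}|\epsilon_j|, $$
so $\mathbb{E}|\epsilon_1|<\infty$ (the marginals being identical) would suffice, without any independence or symmetry; conversely, a symmetric marginal with $\mathbb{E}|\epsilon_1|=\infty$ (e.g. standard Cauchy) looks like the natural candidate for proving Claim 0.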