
Let $\epsilon\equiv(\epsilon_1,\dots, \epsilon_J)$ be a random vector. Let $F$ be the probability distribution of $\epsilon$.

Assumption 1: $F$ has full support on $\mathbb{R}^J$.

Assumption 2: $F$ has marginals that are identical and symmetric around zero.

Take a vector of reals $U\equiv (U_1,\dots, U_J)\in [a,b]^J$ with $|a|<\infty, |b|< \infty$.

Consider the quantity $$ \mathbb{E}(\epsilon_{X^*}) $$ where $$ X^*\equiv \operatorname*{argmax}_{j\in \{1,\dots,J\}}\Big(U_j+ \epsilon_j\Big) $$

Note that under Assumption 1, $X^*$ is unique with probability 1. Assumption 1 will be maintained throughout.

Claim 0: Under Assumptions 1 and 2 (and without further assumptions on $F$), $|\mathbb{E}(\epsilon_{X^*})|$ need not be finite (i.e., it can equal $\infty$).


QUESTION: I want to show Claim 0. Could you help?


SUB-QUESTION: Suppose we replace Assumption 2 with Assumption 3.

Assumption 3: $\epsilon_1,\dots, \epsilon_J$ are mutually independent and identically distributed. In particular, for $j=1,\dots,J$, $\epsilon_j$ has a Gumbel distribution with location 0 and scale 1.

Under Assumption 3 (which also implies Assumption 1), we know that $ |\mathbb{E}(\epsilon_{X^*})|<\infty$.
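For concreteness, here is a minimal Monte Carlo sketch of the Gumbel case (Python/NumPy; the particular $U$ and the sample size are arbitrary choices of this sketch): the sample mean of $\epsilon_{X^*}$ stabilizes, as one expects when the expectation is finite.

```python
# Monte Carlo sketch of the Gumbel case (Assumption 3). U is an arbitrary
# fixed vector in [a, b]^J; the sample mean of eps[X*] settles, consistent
# with |E(eps_{X*})| < infinity.
import numpy as np

rng = np.random.default_rng(0)
J, n_sim = 3, 200_000
U = np.array([0.5, -0.2, 1.0])               # arbitrary fixed U in [a, b]^J

eps = rng.gumbel(loc=0.0, scale=1.0, size=(n_sim, J))
x_star = np.argmax(U + eps, axis=1)          # X* = argmax_j (U_j + eps_j)
print(eps[np.arange(n_sim), x_star].mean())  # stable across seeds and n_sim
```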

Therefore, I wonder which assumption is actually needed to ensure that $|\mathbb{E}(\epsilon_{X^*})|<\infty$ (while maintaining the full support condition). My prior is that we need some restrictions on the shape of $F$ in the tails. Is this correct? Could you elaborate?

Star

3 Answers


Suppose (for simplicity) that $\mathsf{P}(U_i+\epsilon_i=U_j+\epsilon_j)=0$ for any $i\ne j$. Then \begin{align} \mathsf{E}|\epsilon_{X^*}|&=\sum_{j=1}^J \mathsf{E}[|\epsilon_j|1\{X^*=j\}] \\ &\le J\max_{1\le j\le J}\mathsf{E}|\epsilon_j|. \end{align} Therefore, $\mathsf{E}|\epsilon_{X^*}|<\infty$ as long as each $\epsilon_j$ is integrable. On the other hand, if $\mathsf{E}\min_{1\le j\le J}|\epsilon_j|=\infty$, then $$ \mathsf{E}|\epsilon_{X^*}|\ge \mathsf{E}\min_{1\le j\le J}|\epsilon_j|=\infty. $$
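A sketch illustrating the Cauchy example raised in the comments below (i.i.d. standard Cauchy marginals and $U=0$ are both assumptions of this sketch, not part of the answer): each $\epsilon_j^+$ has infinite mean, and accordingly the running mean of $\epsilon_{X^*}$ drifts instead of settling.

```python
# Sketch of the Cauchy example from the comments: i.i.d. standard Cauchy
# marginals with U = 0 (both assumed here). Each eps_j^+ has infinite mean,
# so the running mean of eps[X*] never converges.
import numpy as np

rng = np.random.default_rng(0)
J, n_sim = 3, 1_000_000
eps = rng.standard_cauchy(size=(n_sim, J))
x_star = np.argmax(eps, axis=1)                # U = 0, so argmax of eps alone
eps_xstar = eps[np.arange(n_sim), x_star]

running_mean = np.cumsum(eps_xstar) / np.arange(1, n_sim + 1)
print(running_mean[[999, 9_999, 99_999, -1]])  # keeps drifting, no limit
```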

  • @TEX Probably the easiest example is the Cauchy distribution. As for the second question, I gave a trivial sufficient condition. I think this can be further strengthened, e.g., $\mathsf{E}\varepsilon_{X^*}^+=\infty$ if $\mathsf{E}\varepsilon_j^{+}=\infty$ for some $j$. –  Sep 30 '21 at 10:51
  • @TEX It is the positive part of $\epsilon_j$, i.e., $\max\{\epsilon_j,0\}$. You're right, it is not an "iff". –  Sep 30 '21 at 12:11
  • @TEX Sure. https://math.stackexchange.com/questions/63756/tail-sum-for-expectation –  Sep 30 '21 at 13:36
  • I think I'm confused. You showed (1): $\mathbb{E}(|\epsilon_{X^*}|)\leq J\max_j \mathbb{E}|\epsilon_j|$; hence, if each $\epsilon_j$ is integrable, then $|\mathbb{E}(\epsilon_{X^*})| \leq \mathbb{E}(|\epsilon_{X^*}|)\leq J\max_j \mathbb{E}|\epsilon_j|<\infty$, where the first inequality follows from Jensen's inequality (recall that I'm ultimately interested in $|\mathbb{E}(\epsilon_{X^*})|$). – Star Sep 30 '21 at 18:21
  • You also showed (2): $\mathbb{E}(|\epsilon_{X^*}|)\geq \mathbb{E}\min_j |\epsilon_j|$; hence, if $\mathbb{E}\min_j |\epsilon_j|=\infty$, then $\mathbb{E}(|\epsilon_{X^*}|)=\infty$; I'm not sure how to use this to say that $|\mathbb{E}(\epsilon_{X^*})|$ can be unbounded. – Star Sep 30 '21 at 18:21
  • Finally, in the comments you mentioned (3): if $\mathbb{E}(\epsilon_j^+)=\infty$ for some $j$, then $\mathbb{E}(\epsilon_{X^*}^+)=\infty$; hence, since $\epsilon_{X^*}^+\geq 0$, $|\mathbb{E}(\epsilon_{X^*}^+)|=\infty$; again, I'm not sure how to use this to say something about $|\mathbb{E}(\epsilon_{X^*})|$. – Star Sep 30 '21 at 18:22
  • @TEX Note that $\mathsf{E}\epsilon_{X^*}$ may not be defined! That's why I used $\mathsf{E}|\epsilon_{X^*}|$ instead. –  Sep 30 '21 at 18:37
  • OK thanks, but I don't know how to use the relations you show to answer my original question – Star Sep 30 '21 at 18:41

Claim 0 very much depends on the distribution $F$. For example, if $F$ were multivariate normal with mean vector $0$ and covariance matrix $\Sigma$, then \begin{align*} |\mathbb{E} \epsilon_{X^*}| \le \sum_{j=1}^{J}\mathbb{E}|\epsilon_{j}| = \sqrt{\frac{2}{\pi}}\sum_{j=1}^{J}\sqrt{\Sigma_{jj}} < \infty \end{align*}
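A quick numerical sanity check of this bound (a sketch only; the particular $\Sigma$, $U$, and sample size below are arbitrary choices):

```python
# Monte Carlo check that |E(eps_{X*})| <= sqrt(2/pi) * sum_j sqrt(Sigma_jj)
# for multivariate normal eps. Sigma and U are arbitrary example values.
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.3, 0.0],
                  [0.3, 2.0, 0.5],
                  [0.0, 0.5, 0.5]])
J, n_sim = 3, 200_000
U = np.array([0.5, -0.2, 1.0])

eps = rng.multivariate_normal(mean=np.zeros(J), cov=Sigma, size=n_sim)
x_star = np.argmax(U + eps, axis=1)
estimate = np.abs(eps[np.arange(n_sim), x_star].mean())
bound = np.sqrt(2 / np.pi) * np.sqrt(np.diag(Sigma)).sum()
print(estimate, bound)   # the estimate sits well below the bound
```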

Tom Chen
  • Thanks, but I'm looking for more generic conditions than referring to a specific parametric family. – Star Sep 24 '21 at 18:32

The simplest example that shows this: let $X \sim N(0, 1)$ (or any distribution with support $\mathbb{R}$ and finite mean), and for $k \in \mathbb{R}$ let $Y$ have the mixture distribution $\tfrac{1}{2}F_{N(k,\,0.01)} + \tfrac{1}{2}F_{N(-k,\,0.01)}$, so that $Y$ is continuous but behaves approximately like $P(Y=k)=P(Y=-k)=0.5$. Now take $\epsilon=(X,Y)$. It is easy to see that $\mathbb{E}[\epsilon_{X^*}] \to \infty$ as $k \to \infty$; in general this value can be made arbitrarily large, and if $k=\infty$ were allowed, the expected value could be $\infty$.
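A sketch of this construction (setting $U=0$, an assumption of the sketch): the Monte Carlo mean of $\epsilon_{X^*}$ grows roughly like $k/2$, so it can indeed be made arbitrarily large.

```python
# Sketch of the two-spike mixture construction with U = (0, 0) assumed.
# Y is a 50/50 mixture of N(k, 0.1^2) and N(-k, 0.1^2) (variance 0.01);
# the Monte Carlo mean of eps[X*] grows roughly like k/2.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 200_000
for k in [1.0, 10.0, 100.0]:
    x = rng.normal(0.0, 1.0, n_sim)
    signs = rng.choice([-1.0, 1.0], n_sim)
    y = rng.normal(signs * k, 0.1)                  # the mixture component
    eps = np.column_stack([x, y])
    x_star = np.argmax(eps, axis=1)                 # U = 0, argmax of eps
    print(k, eps[np.arange(n_sim), x_star].mean())  # approx k/2 for large k
```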

quester