
Suppose we have a list $(X_1, \dots, X_n)$ of continuous random variables, each with support $[0,1]$ (not necessarily independent).

I am wondering whether, as $t \rightarrow 0$, the probability of

$$\left[\prod_{i=1}^n X_i \leq t\right] \qquad (a)$$

tends towards the probability of

$$\left[X_i \leq t^{1/n} \text{ for all } i \in \{1,\dots, n\}\right]. \qquad (b)$$

So just to be clear, my question is whether

$$\lim_{t\rightarrow 0} \frac{P\left(\prod\limits_{i=1}^n X_i \leq t\right)}{P\left(X_i \leq t^{1/n} \text{ for all } i \in \{1,\dots, n\}\right)} = 1. \qquad (q)$$

What makes me think the probabilities of (a) and (b) may coincide in the limit (in the sense of (q)) is that if (a) holds but (b) does not, then we must have

$$[X_i \leq t^{1/(n-1)} \text{ for at least } (n-1) \text{ of the } i \in \{1,\dots, n\}]. \qquad (c)$$

Intuitively, it seems that, as $t$ approaches $0$, the probability of (c) goes to $0$ much faster than the probability of (b), which would give (b) the dominating first-order effect in the limit. But maybe my intuition is misguided, and even if it isn't, I am struggling to spell this out and prove it formally.

(If (q) only held under "minor" regularity conditions on the joint distribution of the $X_i$'s, I wouldn't mind learning about that either :))
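Before hunting for a proof, one can probe (q) numerically. Here is a minimal Monte Carlo sketch (my own, with arbitrary trial counts) for the simplest case, $n=2$ with $X_1, X_2$ i.i.d. $U[0,1]$:

```python
import random

def estimate_ratio(t, trials=1_000_000, seed=0):
    """Monte Carlo estimate of P(X1*X2 <= t) / P(X1 <= sqrt(t), X2 <= sqrt(t))
    for X1, X2 i.i.d. Uniform[0,1]."""
    rng = random.Random(seed)
    thresh = t ** 0.5
    num = den = 0
    for _ in range(trials):
        x1, x2 = rng.random(), rng.random()
        if x1 * x2 <= t:       # event (a)
            num += 1
        if x1 <= thresh and x2 <= thresh:  # event (b)
            den += 1
    return num / den if den else float("inf")

for t in (0.1, 0.01, 0.001):
    print(t, estimate_ratio(t))
```

If (q) held, the printed ratios should approach $1$ as $t$ shrinks; instead they keep growing, which is consistent with the exact computation in the answers below.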

FZS
    Both probabilities go to $0$, and not necessarily at the same rate; the probability of event (b) might be exactly $0$ for $0<t<1/2$. – Michael Jul 14 '20 at 01:53
  • @Michael: Thanks for your comment. I have tried to clarify my question a little bit (equation (q)). I was aware that the two probabilities ((a) and (b)) would tend to $0$ but I did think they would do so at the same rate. If you have time, I think it would help my understanding (and answer the question) to see an example where (q) is false. – FZS Jul 14 '20 at 12:36
  • 1
    Take $n=2$, $X_1 \sim Unif([0,1])$, $X_2 \sim Unif([1/2,1])$, $X_1, X_2$ independent. The denominator in q is zero for small $t$. Or take $X_1 \sim Unif([0,1])$ and $X_2=1-X_1$. Then again the denominator in q is zero for small $t$. – Michael Jul 14 '20 at 14:36
  • (By the way, do you have any examples where q is true?) – Michael Jul 14 '20 at 14:42
  • Thanks for following up on this. I understand your initial comment better now. Of course, it's easy to find counter-examples to (q) if the support of one of the RVs excludes some interval $[0,\epsilon]$. This is the reason I indicated that the RVs in the list must have support $[0,1]$ (although I admittedly should have made this more visible and stressed more explicitly how important that restriction is). – FZS Jul 14 '20 at 14:48
  • Fair point, before making wild conjectures based on some vague intuition, I should make sure I have at least one example to back up the conjecture. Let me think about it and I'll get back to you. Maybe I'll find the desired counter-example myself that way. – FZS Jul 14 '20 at 14:49
  • 1
    My second example had $X_1$ and $X_2$ both uniform over $[0,1]$, but they were not independent ($X_2=1-X_1$) so they cannot both be small. – Michael Jul 14 '20 at 14:53
  • Sorry, was too focused on the first example and did not consider the second. I have nothing to say about that one. It's very helpful. Helps me realize I need to impose (many) more conditions on the joint if I hope to make (q) true. If you want to turn your comment into an answer, I'd be happy to accept it. – FZS Jul 14 '20 at 15:05
  • 1
    I'll turn my comment into an answer if you compute your value of q for the case $X_1, X_2$ iid uniform over $[0,1]$. =) In general I think your conjecture has to do with "large deviation theory" but that is not quite posed in the way you have done. – Michael Jul 14 '20 at 15:22
  • Fair deal ;) I actually started working on this following your suggestion that I find a positive example. If I can do that in a reasonable amount of time, I'll add it to the question. Thanks again for your help with all this. – FZS Jul 14 '20 at 15:38
  • @Michael: Should be done with the case of two independent uniforms. As you can see below, this also yields a counter-example (something I did not anticipate). Thanks again for your help and encouragement. – FZS Jul 14 '20 at 16:20

2 Answers


The answer below, treating the case where $X_1, X_2$ are i.i.d. uniform over $[0,1]$, should be given best answer. This post just summarizes my earlier comments, with a minor note on large deviation theory:

Some counterexamples for $n=2$

  1. Let $X_1 \sim Unif([0,1])$ and $X_2 = 1-X_1$ (dependent). Then both $X_1$ and $X_2$ are uniform but $X_1$ and $X_2$ cannot both be small and so $P[X_1\leq t^{1/2}, X_2\leq t^{1/2}]=0$ for $0<t<1/4$.

  2. Let $X_1 \sim Unif([0,1])$ and $X_2 \sim Unif([1/2,1])$. Then $P[X_1\leq t^{1/2}, X_2 \leq t^{1/2}]=0$ if $0<t< 1/4$.
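Both counterexamples are easy to check numerically; in each, the event in the denominator of (q) is literally impossible for small $t$. A small sketch (sampler names are mine):

```python
import random

def count_both_small(t, sample_pair, trials=100_000, seed=1):
    """Count samples with X1 <= sqrt(t) and X2 <= sqrt(t)."""
    rng = random.Random(seed)
    thresh = t ** 0.5
    hits = 0
    for _ in range(trials):
        x1, x2 = sample_pair(rng)
        if x1 <= thresh and x2 <= thresh:
            hits += 1
    return hits

def dependent_pair(rng):
    # Counterexample 1: X1 ~ Unif[0,1], X2 = 1 - X1 (both marginally uniform).
    u = rng.random()
    return u, 1 - u

def shifted_pair(rng):
    # Counterexample 2: X1 ~ Unif[0,1], X2 ~ Unif[1/2,1], independent.
    return rng.random(), 0.5 + 0.5 * rng.random()

# For t < 1/4, sqrt(t) < 1/2, so {X1 <= sqrt(t), X2 <= sqrt(t)} never occurs.
print(count_both_small(0.2, dependent_pair), count_both_small(0.2, shifted_pair))
```

Both counts come out to $0$ for $t = 0.2 < 1/4$, as the closed-form argument predicts.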

Some possible relation to large deviation theory

A minor comment: let $X, Y$ be independent and fix $t \in \mathbb{R}$. For every $x \in \mathbb{R}$ we have
$$\{X > x, \; Y > t-x\} \subseteq \{X+Y > t\},$$
so
$$P[X>x]\,P[Y>t-x] \leq P[X+Y>t].$$
Since this holds for all $x \in \mathbb{R}$,
$$\sup_{x \in \mathbb{R}} P[X>x]\,P[Y>t-x] \leq P[X+Y>t].$$
On the other hand, assuming the moment generating functions exist, the Chernoff bound gives, for all $r>0$,
$$P[X+Y>t] \leq E[e^{rX}]\,E[e^{rY}]\,e^{-rt}.$$
Combining the two,
$$\sup_{x \in \mathbb{R}} P[X>x]\,P[Y>t-x] \leq P[X+Y>t] \leq \inf_{r>0} E[e^{rX}]\,E[e^{rY}]\,e^{-rt},$$
and I think the upper and lower bounds can be close for large $t$.
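To illustrate how close the sandwich can get, here is a worked instance of my own choosing: $X, Y$ i.i.d. Exponential(1), where everything is available in closed form. The exact tail is $P[X+Y>t]=e^{-t}(1+t)$ (Erlang(2) tail), the lower bound works out to $e^{-t}$ (attained for any $x \in [0,t]$), and optimizing the Chernoff bound $e^{-rt}/(1-r)^2$ at $r = 1-2/t$ (valid for $t>2$) gives $e^{2-t}t^2/4$:

```python
import math

def exact_tail(t):
    # P[X+Y > t] for X, Y i.i.d. Exponential(1): Erlang(2,1) tail.
    return math.exp(-t) * (1 + t)

def lower_bound(t):
    # sup_x P[X>x] P[Y>t-x]: the product equals e^{-t} for any x in [0, t].
    return math.exp(-t)

def chernoff_upper(t):
    # inf over 0 < r < 1 of E[e^{rX}] E[e^{rY}] e^{-rt} = e^{-rt}/(1-r)^2,
    # attained at r = 1 - 2/t (requires t > 2).
    r = 1 - 2 / t
    return math.exp(-r * t) / (1 - r) ** 2

for t in (5, 10, 20):
    print(t, lower_bound(t), exact_tail(t), chernoff_upper(t))
```

All three quantities decay like $e^{-t}$ up to polynomial factors, so the bounds agree on the exponential scale, which is the sense in which large deviation bounds are "close" for large $t$.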

Michael

In the comments, @Michael gives a very simple counter-example to (q), which he has since turned into an answer. That example relies on dependent RVs. As it turns out, the conjecture fails for independent RVs too. In fact, it is not even true in the simple case where $X_1$ and $X_2$ are independent $U[0,1]$, something I really should have checked before posting my question.

Suppose $X_1$ and $X_2$ are independent $U[0,1]$. Then the denominator of (q) is simply

$$P(X_1 \leq t^{1/2} \text{ and } X_2 \leq t^{1/2}) = t^{1/2} \cdot t^{1/2} = t,$$

while the numerator is

$$P(X_1 X_2 \leq t) = \int_0^1 P\left(X_2 \leq \tfrac{t}{x}\right) dx = \int_0^t 1 \, dx + \int_t^1 \frac{t}{x} \, dx = t - t\log(t).$$

Overall, we have

$$ \frac{P(X_1 X_2 \leq t)}{P(X_1 \text{ and } X_2 \leq t^{1/2})} = \frac{t - t\log(t)}{t} = 1-\log(t),$$

which diverges to $\infty$ (rather than converging to $1$) as $t \rightarrow 0$ (i.e., in this simple case, $P(X_1 X_2 \leq t)$ converges to zero "significantly" more slowly than $P(X_1, X_2 \leq t^{1/2})$).
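As a sanity check on the closed form $P(X_1 X_2 \leq t) = t - t\log(t)$, here is a quick Monte Carlo comparison (a sketch, with trial counts chosen arbitrarily):

```python
import math
import random

def mc_product_prob(t, trials=1_000_000, seed=2):
    """Monte Carlo estimate of P(X1*X2 <= t) for X1, X2 i.i.d. Uniform[0,1]."""
    rng = random.Random(seed)
    hits = sum(rng.random() * rng.random() <= t for _ in range(trials))
    return hits / trials

for t in (0.1, 0.01):
    exact = t - t * math.log(t)
    print(t, exact, mc_product_prob(t))
```

The simulated frequencies land on top of the closed-form values, as expected.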

FZS