
On a "bottom" disk of area $A$, we place "top" disks of areas $1,\frac12,\frac13,\cdots$ such that the centre of each top disk is an independent uniformly random point on the bottom disk.

Find the maximum value of $A$ such that the bottom disk will be completely covered by the top disks with probability $1$, or show that there is no maximum.

The harmonic series diverges, but the problem here is that the top disks overlap, so it is not clear to me whether a bottom disk of a given area will be completely covered by the top disks with probability $1$.

I made a desmos graph to help visualise the disks.
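The process is also easy to experiment with numerically. Here is a minimal Python sketch (the disk count $N$, the grid resolution, and the seed are arbitrary choices for illustration, not part of the question) that places the first $N$ top disks at independent uniform centres and reports the fraction of grid test points left uncovered:

```python
import math
import random

def simulate(A, N=2000, grid=60, seed=0):
    """Place top disks of areas 1, 1/2, ..., 1/N with independent
    uniform random centres on a bottom disk of area A; return the
    fraction of grid test points left uncovered."""
    rng = random.Random(seed)
    R = math.sqrt(A / math.pi)  # radius of the bottom disk
    disks = []
    for i in range(1, N + 1):
        while True:  # uniform point in the bottom disk, by rejection
            x, y = rng.uniform(-R, R), rng.uniform(-R, R)
            if x * x + y * y <= R * R:
                break
        disks.append((x, y, 1 / math.sqrt(math.pi * i)))  # centre, radius

    uncovered = total = 0
    for ix in range(grid):
        for iy in range(grid):
            px = -R + 2 * R * (ix + 0.5) / grid
            py = -R + 2 * R * (iy + 0.5) / grid
            if px * px + py * py > R * R:
                continue  # test point outside the bottom disk
            total += 1
            if not any((px - cx) ** 2 + (py - cy) ** 2 <= r * r
                       for cx, cy, r in disks):
                uncovered += 1
    return uncovered / total

print(simulate(0.5))  # fraction of uncovered grid points, A < 1
print(simulate(4.0))  # fraction of uncovered grid points, A > 1
```

For small $A$ the reported fraction quickly drops to $0$, while larger $A$ tends to leave gaps at moderate $N$; of course, no finite simulation can settle the probability-$1$ question.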

(This question was inspired by a question about rain droplets falling on a table.)

Dan

2 Answers


$\newcommand{\bB}{\mathbf{B}}$ $\newcommand{\PP}{\mathbb{P}}$ $\newcommand{\EE}{\mathbb{E}}$ $\newcommand{\Var}{\text{Var}}$ $\newcommand{\Cov}{\text{Cov}}$

Update: It looks like the case $A > 1$ is essentially contained in Proposition 11.5 of Kahane's book Some Random Series of Functions. The proof given there is much more beautiful, utilizing the second moment method in a simple yet ingenious way.

More specifically, let $S_N$ be the area remaining after placing the first $N$ balls. Then Kahane used the inequality $$\PP(S_N \neq 0) \geq \frac{\EE[S_N]^2}{\EE[S_N^2]}.$$ One can compute that $\EE[S_N]$ grows like $N^{-1/A}$, while $\EE[S_N^2]$ grows like $N^{-2/A}$, so $\PP(S_N \neq 0)$ is bounded below by a positive constant.
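For completeness, this inequality is just the second moment (Paley–Zygmund type) bound: since $S_N \geq 0$, Cauchy–Schwarz gives $$\EE[S_N] = \EE[S_N \mathbf{1}_{S_N \neq 0}] \leq \sqrt{\EE[S_N^2]\,\PP(S_N \neq 0)},$$ and squaring and rearranging yields the stated inequality.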


By $\odot(p, r)$, I mean the closed disk centered at $p$ of radius $r$.

I believe we can argue that when $A < 1$ such covering happens with probability $1$, and when $A > 1$ such covering happens with probability less than $1$. I don't have an answer when $A = 1$.

Throughout, let $T_1, T_2, \cdots$ denote the "top disks", with $T_i = \odot(t_i, \frac{1}{\sqrt{\pi i}})$. Let $B$ denote the "bottom disk". Assume $B$ is centered at the origin $0$.


The part when $A < 1$ has a relatively straightforward proof. The idea is to discretize Dan's original answer.

Let $E_N$ denote the event that the first $N$ top disks cover the bottom disk. It suffices to show that $$\lim_{N \to \infty} \PP(E_N) = 1.$$ To prove this, we take an optimal $(\sqrt{4\pi N})^{-1}$-net inside the bottom disk, defined as a largest set of points $\bB = \{b_1, \cdots, b_k\}$ inside the bottom disk that are pairwise at distance at least $\frac{1}{\sqrt{4\pi N}}$ from each other. Here are two facts about such nets:

  1. The disks $B_i = \odot(b_i, \frac{1}{\sqrt{4\pi N}})$ cover the entire bottom disk.

  2. The disks $B_i' = \odot(b_i, \frac{1}{\sqrt{16\pi N}})$ are disjoint.
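Both facts follow from the standard greedy construction: a maximal separated set is automatically a cover by disks of the separation radius (otherwise an uncovered point could be added to the set), and disks of half that radius around its points are disjoint. Here is an illustrative Python sketch of the construction (the radius and separation values are arbitrary, not the ones from the proof):

```python
import random

def greedy_net(R, eps, candidates=20000, seed=1):
    """Greedily build an eps-separated set in the disk of radius R:
    keep a random candidate point iff it is at distance >= eps from
    every point already kept.  A maximal such set is a net in the
    sense above."""
    rng = random.Random(seed)
    net = []
    for _ in range(candidates):
        while True:  # uniform point in the disk, by rejection
            x, y = rng.uniform(-R, R), rng.uniform(-R, R)
            if x * x + y * y <= R * R:
                break
        if all((x - a) ** 2 + (y - b) ** 2 >= eps * eps for a, b in net):
            net.append((x, y))
    return net

net = greedy_net(R=1.0, eps=0.1)
# Fact 2 is exact: the points are pairwise >= eps apart, so disks of
# radius eps/2 around them are disjoint.  Fact 1 holds for a truly
# maximal set; this finite-sample greedy version covers up to tiny gaps.
```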

Let $E_{iN}$ denote the event that the circle $B_i$ is completely covered. To carry out the analysis below, we use a trick known as dyadic partitioning: for each $0 \leq k \leq \log_2 \sqrt{N}$, let $\bB_k$ denote the points in $\bB$ at distance in $[2^{-k-1}, 2^{-k}]$ from the boundary of the base circle, with $\bB_{\log_2 \sqrt{N}}$ also including all the points in $\bB$ at distance less than $\sqrt{N}^{-1}$ from the boundary of the base circle.

Suppose $b_i$ lies in $\bB_k$. Note that for each $1 \leq j \leq N$, the top disk $T_j$ covers $B_i$ completely iff $t_j$ lies in $C_{ij} = \odot(b_i, \frac{1}{\sqrt{\pi j}} - \frac{1}{\sqrt{4\pi N}})$.

We now need to understand the intersection between $C_{ij}$ and the bottom disk $B$. Note that $C_{ij}$ has radius at most $\frac{1}{\sqrt{\pi j}}$, so when $j$ is larger than some constant, the intersection has area at least a $1/2 - O(j^{-1})$ fraction of the area of $C_{ij}$. Furthermore, when $j \geq 2^{2k}$, the entire $C_{ij}$ is contained in the base circle. We can write this as $$\text{Area}(C_{ij} \cap B) \geq \begin{cases} (1/2 - O(j^{-1})) \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2, & j < 2^{2k}, \\ \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2, & j \geq 2^{2k}. \end{cases}$$ So we conclude that $$\mathbb{P}(\overline{E_{iN}}) \leq \prod_{j = 1}^N (1 - \PP(B_i \subset T_j)) = \prod_{j = 1}^N \frac{A - \text{Area}(C_{ij} \cap B)}{A} \leq \exp\left(- A^{-1}\sum_{j = 1}^N \text{Area}(C_{ij} \cap B)\right).$$ We analyze the sum as follows: $$\sum_{j = 1}^N \text{Area}(C_{ij} \cap B) \geq \sum_{j = 1}^{2^{2k}} (1/2 - O(j^{-1})) \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2 + \sum_{j = 2^{2k} + 1}^N \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2.$$ We need to understand this asymptotically. Fortunately, it is not too hard to check that $$\sum_{j = 1}^{2^{2k}} (1/2 - O(j^{-1})) \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2 \geq \frac{1}{2} \log(2^{2k}) - O(1),$$ $$\sum_{j = 2^{2k} + 1}^N \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2 \geq \log(N / 2^{2k}) - O(1).$$ Substituting this back in, we conclude that $$\mathbb{P}(\overline{E_{iN}}) = O\left(\frac{2^{k / A}}{N^{1/A}}\right).$$ We now use the union bound on these events. Using dyadic summation, we need to estimate $$\sum_{b_i \in \bB_k} \mathbb{P}(\overline{E_{iN}}) = O\left(|\bB_k| \frac{2^{k / A}}{N^{1/A}} \right).$$ We can estimate $|\bB_k|$ using the second property of nets above. The disks $B_i' = \odot(b_i, \frac{1}{\sqrt{16\pi N}})$ are disjoint, and they must be contained in a ring of width $O(2^{-k})$ around the boundary of $B$.
So we conclude that $$|\bB_k| = O(N 2^{-k}).$$ Thus, we conclude that $$\sum_{b_i \in \bB_k} \mathbb{P}(\overline{E_{iN}}) = O\left(\frac{2^{k(1/A - 1)}}{N^{(1/A - 1)}} \right).$$ Finally, we have $$\mathbb{P}(\overline{E_{N}}) \leq \sum_{k = 0}^{\log_2 \sqrt{N}} \sum_{b_i \in \bB_k} \mathbb{P}(\overline{E_{iN}}) \leq O\left(\sum_{k = 0}^{\log_2 \sqrt{N}} \frac{2^{k(1/A - 1)}}{N^{(1/A - 1)}}\right).$$ Since $1/A - 1 > 0$, each term in the inner sum is $O(N^{-(1/A - 1)/2})$, and there are $O(\log N)$ terms. So we conclude that $$\mathbb{P}(\overline{E_{N}}) = O(\log N \cdot N^{-(1/A - 1) / 2}).$$ Thus $$\lim_{N \to \infty} \mathbb{P}(\overline{E_{N}}) = 0,$$ as desired.
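As a sanity check on the two asymptotic estimates (not part of the proof; the values of $N$ are arbitrary), the following Python sketch evaluates $\sum_{j = 1}^N \left(\frac{1}{\sqrt{j}} - \frac{1}{\sqrt{4N}}\right)^2$ and compares it with $\log N$:

```python
import math

def covering_sum(N):
    """sum_{j=1}^N (1/sqrt(j) - 1/sqrt(4N))^2, the quantity bounded
    below by log N - O(1) in the argument above."""
    return sum((1 / math.sqrt(j) - 1 / math.sqrt(4 * N)) ** 2
               for j in range(1, N + 1))

for N in (10**3, 10**4, 10**5):
    # the difference from log N stabilises near a constant,
    # consistent with the claimed log N - O(1) behaviour
    print(N, round(covering_sum(N) - math.log(N), 4))
```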


The part when $A > 1$ is more difficult. My idea is to show that with nonzero probability, after we have placed disks $T_1,T_2,\cdots,T_N$, the uncovered region contains many disjoint, microscopic disks.

To make this rigorous, let $K = 10^{10}$ and $Q = K^{A / (A - 1)}$. We consider the following event:

$E_t$: For each $Q \leq s \leq t$ the following holds. After we have placed $T_1, \cdots, T_{Q^s}$, we can find $2^s$ closed disks of area $Q^{-s}$ inside $\odot(0, 0.1)$, all of them completely uncovered, with every two of them at distance at least $2\pi^{-1/2}Q^{-s/2}$ from each other. Furthermore, if $U_s$ denotes the union of these disks, then $U_s \subset U_{s - 1}$.

Then my main observation is

Lemma: Assume $t \geq Q$. Condition on the placement of the disks $T_1, \cdots, T_{Q^t}$, and suppose $E_t$ happens. Then the probability that $E_{t + 1}$ happens is at least $1 - \frac{Q}{2^t}$.

Proof: The main method we use to prove this is called the second moment method.

Let $B_1, \cdots, B_{2^t}$ be the $2^t$ disks of area $Q^{-t}$ inside $\odot(0, 0.1)$ witnessing $E_t$: all of them are uncovered, and every two of them are at distance at least $2\pi^{-1/2}Q^{-t/2}$ from each other.

It is not hard to show that we can fit at least $R = \lceil Q / 100 \rceil$ disks $B_{i1}, \cdots, B_{iR}$ inside each $B_i$, such that they have area $Q^{-t-1}$ each and are pairwise at distance at least $2\pi^{-1/2}Q^{-(t + 1)/2}$ from each other. Let $b_{ij}$ be the center of $B_{ij}$. Let $I_{ij}$ be $1$ if none of the disks $T_{Q^t + 1}, \cdots, T_{Q^{t + 1}}$ touch $B_{ij}$, and $0$ otherwise.

We first compute the expectation of $I_{ij}$. Note that for each $k \in [Q^t + 1,Q^{t + 1}]$, $T_{k}$ touches $B_{ij}$ if and only if $t_k$ lies in the disk $C_{ijk}$ of radius $\pi^{-1/2}(k^{-1/2} + Q^{-(t + 1) / 2})$ centered at $b_{ij}$. Note that we assumed that the $B_{ij}$ are all far away from the boundary of $B$. So we have $$\PP(T_k \text{ touches }B_{ij}) = \frac{1}{A} \left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2.$$ Thus, we have $$\EE[I_{ij}] = \prod_{k = Q^t + 1}^{Q^{t + 1}}\left(1 - \frac{1}{A} \left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2\right).$$ Before we simplify this, we compute the covariance of $I_{ij}$ and $I_{i'j'}$ when $i \neq i'$. By the separation condition on $B_i$ and $B_{i'}$, the disks $C_{ijk}$ and $C_{i'j'k}$ are disjoint. So we have $$\PP(T_k \text{ touches }B_{ij}\text{ or }B_{i'j'}) = \frac{2}{A} \left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2.$$ Thus we have $$\EE[I_{ij} I_{i'j'}] = \prod_{k = Q^t + 1}^{Q^{t + 1}}\left(1 - \frac{2}{A} \left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2\right).$$ The crucial observation is that, using the relation $1 - 2x \leq (1 - x)^2$, we can compute that $$\EE[I_{ij} I_{i'j'}] \leq \EE[I_{ij}] \EE[I_{i'j'}].$$ So we conclude that $$\text{Cov}(I_{ij}, I_{i'j'}) \leq 0.$$ In other words, if a $T_k$ does not touch $B_{ij}$, it is more likely to touch $B_{i'j'}$. This is crucially what makes the second moment argument work! We can now estimate the expectation $\EE[I_{ij}]$.
Recall $$\EE[I_{ij}] = \prod_{k = Q^t + 1}^{Q^{t + 1}}\left(1 - \frac{1}{A} \left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2\right).$$ Note $\left(k^{-1/2} + Q^{-(t + 1) / 2}\right)^2 = k^{-1} + 2 k^{-1/2}Q^{-(t + 1) / 2} + Q^{-(t + 1)}$, so $$\EE[I_{ij}] \geq \prod_{k = Q^t + 1}^{Q^{t + 1}}\left(1 - \frac{1}{A}(k^{-1} + 2 k^{-1/2}Q^{-(t + 1) / 2} + Q^{-(t + 1)})\right).$$ Using the estimate $1 - x \geq e^{-x-x^2}$ for $x \leq 1/2$, we get $$\EE[I_{ij}] \geq \exp\left(-\sum_{k = Q^t + 1}^{Q^{t + 1}}\frac{1}{A}(k^{-1} + 2 k^{-1/2}Q^{-(t + 1) / 2} + Q^{-(t + 1)}) + 4k^{-2}\right).$$ Now, using familiar facts about the harmonic series, we conclude that $$\EE[I_{ij}] \geq \exp\left(-\frac{1}{A} \log Q - 10\right) = e^{-10} Q^{-1/A}.$$ Now, let $$X = \sum_{i, j} I_{ij}.$$ There are $R 2^t$ terms in this sum. By linearity of expectation, we have $$\EE[X] \geq e^{-10} Q^{-1/A} R 2^t \geq 100^{-1} e^{-10} Q^{1-1/A} 2^t \geq 2^{t + 2}.$$ We note that $$\Var[X] = \sum_{i,j,i',j'} \Cov[I_{ij}, I_{i'j'}].$$ The vast majority of terms in this sum have $i \neq i'$! We have $$\sum_{i,j,j'} \Cov[I_{ij}, I_{ij'}] \leq \sum_{i,j,j'} \EE[I_{ij}] \leq R \EE[X],$$ and $$\sum_{i,j,i',j': i\neq i'} \Cov[I_{ij}, I_{i'j'}] \leq 0.$$ So we have $$\Var[X] \leq R \EE[X].$$ Our hard work has finally paid off! By Chebyshev's inequality, recalling that $\EE[X] \geq 2^{t + 2}$, we have $$\PP[X < 2^{t + 1}] \leq \frac{\Var[X]}{(\EE[X] - 2^{t + 1})^2} \leq \frac{4R\EE[X]}{\EE[X]^2} \leq \frac{4R}{\EE[X]} \leq \frac{4R}{2^t}.$$ If $X \geq 2^{t + 1}$, then $E_{t + 1}$ happens, as desired. This completes the proof of the lemma.
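The negative-covariance step is easy to verify numerically. The following sketch (illustrative only, using small values $Q = 10$, $t = 2$, $A = 4$ rather than the huge constants above) compares $\EE[I_{ij} I_{i'j'}]$ with $\EE[I_{ij}]\EE[I_{i'j'}]$:

```python
def moments(Q, t, A):
    """Return (E[I I'], E[I]^2) with touching probabilities
    p_k = (1/A) * (k**-0.5 + Q**(-(t+1)/2))**2 as in the proof:
    E[I I'] = prod(1 - 2 p_k)   (the two target disks are disjoint),
    E[I]    = prod(1 - p_k)."""
    joint = marginal = 1.0
    for k in range(Q**t + 1, Q**(t + 1) + 1):
        p = (k ** -0.5 + Q ** (-(t + 1) / 2)) ** 2 / A
        joint *= 1 - 2 * p
        marginal *= 1 - p
    return joint, marginal ** 2

joint, sq = moments(Q=10, t=2, A=4.0)
assert joint <= sq  # termwise 1 - 2x <= (1 - x)^2, hence Cov <= 0
```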

Note that $E_Q$ happens with non-zero probability, since it happens whenever the first $Q^Q$ top disks all lie in one half of $B$. The lemma tells us that $$\PP(E_{i + 1}) \geq \PP(E_i) \cdot \left(1 - \frac{Q}{2^i}\right).$$ So telescoping gives $$\PP(E_i) \geq \PP(E_Q) \cdot \prod_{Q \leq j \leq i} \left(1 - \frac{Q}{2^j}\right).$$ Thus we have $$\PP(\cap_{i = Q}^\infty E_i) \geq \PP(E_Q) \cdot \prod_{j = Q}^\infty \left(1 - \frac{Q}{2^j}\right) > 0.$$ Finally, if $\cap_{i = Q}^\infty E_i$ happens, then the bottom disk is not covered (thanks to Cantor's intersection theorem). So the bottom disk is not covered with non-zero probability, as desired.

abacaba
  • What is the probability of covering when $A=1$? – Dan Nov 20 '23 at 03:50
  • I actually don't know... This kind of boundary question is often hard in probability. – abacaba Nov 20 '23 at 03:51
  • I just read the answer of the raindrop problem linked in the question description. I believe the method in that answer can be used for the case $A < 1$, but I don't know how to do $A \geq 1$ with only the method there. – abacaba Nov 20 '23 at 05:39
  • When you say "and when A>1 such covering happens with probability less than 1" are you including A = $\infty$? – KDP Nov 20 '23 at 10:33
  • When $A = \infty$ the probability model is not well-defined... – abacaba Nov 20 '23 at 16:48
  • @Dan do you want to ask $A = 1$ as a separate question? If you don't, I might ask it myself. – abacaba Nov 20 '23 at 18:13
  • @abacaba Yes, I will ask about $A=1$ as a separate question, probably on Mathoverflow (seeing your answer here, I better understand how difficult this question is). Would it be more convenient if $A=\pi$ (and the top disks $\pi,\frac{\pi}{2},\frac{\pi}{3},\dots$)? Or should I first ask about the one dimensional case (line segment of length $1$ under randomly placed line segments of length $1,\frac12,\frac13,\dots$)? – Dan Nov 20 '23 at 21:19
  • I think $A = \pi$ would indeed simplify some things slightly. The one-dimensional case should have a solution that is analogous to what I wrote here, so it might be less interesting. – abacaba Nov 20 '23 at 21:48
  • I posted a question about the boundary case on MO. – Dan Nov 20 '23 at 22:11
  • @abacaba If the probability of covering at $A=1$ is not $1$, then the probability is not a continuous function of $A$, right? I find that hard to grasp. Are there any simpler examples of a non-continuous probability (but based on a continuous variable such as area)? – Dan Nov 21 '23 at 03:07
  • @abacaba I am currently unable to access the chat room (some sites are blocked here in China). If you're up for a quick reply to my previous comment, fine, otherwise no worries :) – Dan Nov 21 '23 at 03:52
  • Here is one that immediately comes to my mind. Consider the probability that a Brownian motion $B_t$ eventually hits the function $f(t) = t^A \log^2 (t + 1) + 100$. By the law of the iterated logarithm, when $A < 0.5$ this probability is $1$, while when $A = 0.5$ this probability is significantly less than $1$. – abacaba Nov 21 '23 at 04:03

EDIT: The second-to-last line in my answer is flawed, as pointed out by @Dominik Kutek.


Consider a fixed point on the bottom disk. The probability that it is covered by the top disk of area $\frac{1}{k}$ is at least $\frac{1}{2kA}$. (The $2$ is there because the fixed point might be near the edge of the bottom disk.)

So the probability that the fixed point is not covered by the top disk of area $\frac{1}{k}$ is at most $1-\frac{1}{2kA}$.

So the probability that the fixed point is not covered by any of the top disks is at most

$$\prod\limits_{k=1}^\infty\left(1-\frac{1}{2kA}\right)=\exp \sum\limits_{k=1}^\infty \ln \left(1-\frac{1}{2kA}\right)\le \exp \sum\limits_{k=1}^\infty\left(-\frac{1}{2kA}\right)=0$$
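As a numerical illustration (not needed for the argument): the partial products collapse at a polynomial rate. Standard asymptotics for products of the form $\prod_k \left(1 - \frac{a}{k}\right)$ suggest decay like $N^{-1/(2A)}$ here, which requires $A > \frac12$ for every factor to be positive; the sketch below uses $A = 1$.

```python
def partial_product(A, N):
    # prod_{k=1}^N (1 - 1/(2kA)); the infinite product is 0 because
    # sum_k 1/(2kA) diverges (harmonic series)
    p = 1.0
    for k in range(1, N + 1):
        p *= 1 - 1 / (2 * k * A)
    return p

# For A = 1 the rescaled values below stabilise as N grows,
# showing decay on the order of N^(-1/2)
for N in (10**2, 10**3, 10**4):
    print(N, partial_product(1.0, N), partial_product(1.0, N) * N ** 0.5)
```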

So for any value of $A$, every fixed point on the bottom disk will be covered with probability $1$.

So for any value of $A$, the probability that there will be an uncovered region of positive area is $0$.

So for any value of $A$, the probability that the bottom disk will be completely covered is $1$.

Dan
  • This is the correct solution – Severus' Constant Nov 19 '23 at 08:49
  • Can you mark yourself as accepted answer? :) – Severus' Constant Nov 19 '23 at 08:54
  • I think I have found a counter example. Stand by – KDP Nov 19 '23 at 11:02
  • For clarity, you can show that $\ln(1-x) = -x - \frac{x^2}{2} - \frac{x^3}{3} - \cdots \le -x$ – Severus' Constant Nov 19 '23 at 11:55
  • @Severus'Constant Yes, or consider the graphs of $y=\ln (1-x)$ and $y=-x$. – Dan Nov 19 '23 at 11:57
  • "So for any value of A, every fixed point on the bottom disk will be covered with probability 1" does not imply "So for any value of $A$, the probability that there will be an uncovered region of positive area, is 0". You can only take at most countable intersections to preserve probability 1, whereas here you took an uncountable one. – Presage Nov 19 '23 at 13:37
  • @DominikKutek My reasoning is as follows: If there is an uncovered region of positive area, this is incompatible with the assertion that every fixed point on the bottom disk will be covered with probability $1$. (Is there a simple counter-example that would show that this way of thinking is flawed?) – Dan Nov 19 '23 at 13:47
  • You are saying that the set $A_x$ – "point $x$ will be covered" – has probability $1$ for any fixed point $x$. Now, you want to say that some region (so some set $C$ inside your disk) has probability $0$ of being uncovered. The last event is the set $\bigcup_{x \in C} A_x^c$. This is an uncountable union of events of measure $0$, which need not still have measure $0$ (only countable unions of measure-$0$ events are guaranteed to have measure $0$). For a trivial example, note that on $[0,1]$ with Lebesgue measure, every singleton $\{x\}$ is of measure $0$, but $\bigcup_{x \in (0,\frac{1}{2})}\{x\}$ is not. – Presage Nov 19 '23 at 14:02
  • @DominikKutek Thanks. I will edit to make a note of this flaw in my answer. – Dan Nov 19 '23 at 14:11