4

It is well known that, for example, in a pure-birth process explosion occurs when $$ \sum_{n=1}^\infty \lambda_n^{-1} < \infty, $$ where $\lambda_n$ is the birth rate when the population size is $n$. For instance, explosion occurs if one takes $\lambda_n = c n^2$. While I understand how to prove this mathematically, I'm having trouble understanding it intuitively: no matter how fast the population is growing or how much time passes, in practice the population size should be a finite (albeit huge) number that tends to $\infty$, but how could it ever be equal to $\infty$?
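(For concreteness, the computation I have in mind is just the standard one: the holding time at population size $n$ is exponential with rate $\lambda_n$, so with $\lambda_n = c n^2$ the expected total time to pass through every population size is $$\mathbb{E}[T] = \sum_{n=1}^\infty \frac{1}{\lambda_n} = \frac{1}{c}\sum_{n=1}^\infty \frac{1}{n^2} = \frac{\pi^2}{6c} < \infty,$$ so the explosion time $T$ is finite almost surely.)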

p-value
  • 474
  • 3
    Infinity is not a quantity, merely a convenient expression for something which is unbounded. – jameselmore Dec 31 '18 at 19:29
  • This question may lend some intuition: https://math.stackexchange.com/questions/2136463/when-does-a-pure-birth-process-explode – Math1000 Dec 31 '18 at 20:18
  • 2
    Explosion of the chain $(X_t)$ means that there exists some random time $T$ such that $P(T<\infty)=1$ and $$\lim_{t\to T-}X_t=\infty$$ Thus, $X_t(\omega)$ simply does not exist for $t\geqslant T(\omega)$. – Did Dec 31 '18 at 20:18
  • 1
    @jameselmore Very poetic but is this relevant? My feeling is that it might even reveal a wrong conception of the situation... – Did Dec 31 '18 at 20:20

3 Answers

3

There is no problem conceptually with the infinity here, if we just think of the result as some mathematical statement about limits. The problem appears when we really want to think of the state of the process as a "population"; that is, when we're modeling a discrete and necessarily finite value. Then infinity has no meaning, so why would we introduce it?

Why use infinity to model finite, discrete processes

We do this all the time. For example, we use real numbers. Real numbers have infinite precision, and all real-world calculations have limited precision (though often, we're not sure exactly what it is). More tellingly, computer calculations have a very well-defined limited precision. Yet we still use real numbers in the error analysis of computer algorithms.

How does this work? Real numbers are a simplified model that avoids ugly details. Maybe we can prove that, with the assumption that our implementation behaves like the real numbers, we can get arbitrarily small error. If this cannot actually happen, what we learn is that our algorithm can produce smaller and smaller error up until we reach the point where real numbers stop being a good model. The goal of further error analysis is to determine when this happens.
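As a throwaway illustration of where the real-number model visibly stops matching a concrete implementation, here is a tiny sketch using Python's standard double-precision floats; nothing about it is specific to any particular algorithm.

```python
import sys

# Real numbers have infinite precision; IEEE-754 doubles do not.
eps = sys.float_info.epsilon      # ~2.22e-16 for double precision
print(1.0 + eps == 1.0)           # False: this increment is still visible
print(1.0 + eps / 2 == 1.0)       # True: below this scale, the real-number model breaks down
```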

The same can happen with a pure-birth process like the one you're considering. Maybe in fact the birth rate $\lambda_n$ is defined by $\lambda_n = n^2$ when $n < N$ (for some large parameter $N$) and $\lambda_n = 0$ otherwise. We can approximate this by the pure-birth process where $\lambda_n = n^2$ for all $n$, and conclude that explosion happens.

What do we learn from this? Well, in our original model explosion was impossible; in the simplified model, explosion happens with probability $1$. The conclusion is that with probability $1$, we eventually reach a state where the simplified model breaks down; in other words, we reach a population of $N$.

So the simplified process that permits infinity lets us derive finite statements of this form for any "ecological limit" $N$. It is easier to work with and more general, because the conclusion is independent of $N$; that is the advantage of introducing infinity.
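Here is a minimal simulation sketch of that, using the rates $\lambda_n = n^2$ from above and starting from a single individual (the helper `time_to_reach` is just for this sketch): it estimates the time for the uncapped process to reach the "ecological limit" $N$, which barely grows with $N$ because the full explosion time has mean $\sum_n n^{-2} = \pi^2/6$.

```python
import random

def time_to_reach(N, start=1):
    """Time for a pure-birth process with rates lambda_n = n^2 to grow
    from population `start` to population N (a sum of Exp(n^2) holding times)."""
    t = 0.0
    for n in range(start, N):
        t += random.expovariate(n * n)   # holding time while the population is n
    return t

random.seed(0)
trials = 200
for N in (10, 100, 1000, 10000):
    avg = sum(time_to_reach(N) for _ in range(trials)) / trials
    print(f"N = {N:>5}: average time to reach N ~ {avg:.3f}")
# The averages stabilize near pi^2/6 ~ 1.645: however large the cap N is,
# it is reached in roughly the same finite time.
```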

Real-life examples and mathematical ones

You can imagine physical instances of a pure-birth process where we learn something from an "infinite approximation" such as this one. Maybe we really are looking at the population of some lifeform, whose growth follows such a process until ecological limitations start to become a factor. Knowing that the process is explosive guarantees that this will eventually happen.

This can also happen in a more mathematical application. Here is one example. It is not an explosive CTMC, because it is a discrete-time branching process, but the idea is the same.

Suppose we create a random graph on $n$ vertices by choosing $n$ of the $\binom n2$ possible edges uniformly at random. Then the breadth-first search process from a given vertex initially resembles a Galton-Watson branching process where each individual has $\text{Poisson}(2)$ children.

This is a good approximation when the population of the branching process is small. In this case, there are $\approx n$ potential neighbors for each vertex we look at, and each one is actually a neighbor with probability $\approx \frac2n$. But if the breadth-first search reaches a constant fraction of the vertices in the graph, then the approximation breaks down: there are now significantly fewer than $n$ potential new neighbors, and so the average number of children in the branching process becomes less than $2$.

In this case, the branching process is infinite with some positive probability. This does not directly translate to the vertex we started at being in a component of size $\infty$; that's nonsense, since the graph has only $n$ vertices. It does translate to the vertex being in a component with enough vertices that the approximation breaks down.

Some further analysis tells us that there cannot be two large components in this random graph. Therefore the probability that the Galton-Watson branching process is infinite (a seemingly nonsense probability in our actual setting) tells us a useful quantity: the fraction of vertices in the so-called "giant component" of this random graph.
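Here is a small sketch of that comparison (assumptions: a uniform random graph with $n$ vertices and exactly $n$ edges, the survival probability taken from the fixed-point equation $s = 1 - e^{-2s}$ for the $\text{Poisson}(2)$ branching process, and the helper `largest_component_fraction` written just for this sketch).

```python
import math
import random
from collections import deque

def largest_component_fraction(n, seed=0):
    """Uniform random graph with n vertices and n edges:
    return (size of largest component) / n."""
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    edges = set()
    while len(edges) < n:                        # sample n distinct edges uniformly
        u, v = rng.randrange(n), rng.randrange(n)
        e = (min(u, v), max(u, v))
        if u != v and e not in edges:
            edges.add(e)
            adj[u].append(v)
            adj[v].append(u)
    seen, best = [False] * n, 0
    for s in range(n):                           # BFS from every unseen vertex
        if seen[s]:
            continue
        seen[s] = True
        queue, size = deque([s]), 0
        while queue:
            x = queue.popleft()
            size += 1
            for y in adj[x]:
                if not seen[y]:
                    seen[y] = True
                    queue.append(y)
        best = max(best, size)
    return best / n

# Survival probability of the Poisson(2) branching process:
# fixed point of s = 1 - exp(-2s).
s = 0.5
for _ in range(100):
    s = 1 - math.exp(-2 * s)

print("Poisson(2) Galton-Watson survival probability ~", round(s, 4))
print("largest component fraction (n = 20000)       ~",
      round(largest_component_fraction(20000), 4))
```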

Misha Lavrov
  • 142,276
2

Intuitively, what's happening is something like the following.

  • A birth occurs at time $t = 1/2$.

  • The next birth occurs at time $t = 3/4$.

  • The next birth occurs at time $t = 7/8$.

  • The next birth occurs at time $t=15/16$.

And so on.

By time 1, how many births have occurred? So what must the total population be?

In the "explosive" case, this sort of thing happens with probability 1. The actual birth times will vary randomly, but there will always be infinitely many births in finite time.
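Here is a minimal sketch of one such random path, borrowing the rates $\lambda_n = n^2$ from the question (i.e. $c = 1$) and one initial individual purely for concreteness: the birth times pile up below a finite limit, just like $1/2, 3/4, 7/8, \dots$ pile up below $1$.

```python
import random

random.seed(1)
t, n = 0.0, 1          # current time, current population
birth_times = []
for _ in range(10**5):                  # a hundred thousand births...
    t += random.expovariate(n * n)      # Exp(n^2) wait while the population is n
    n += 1
    birth_times.append(t)

print([round(b, 3) for b in birth_times[:6]])   # the first few birth times
print(round(birth_times[-1], 6))                # ...and the time of birth number 100000
# As with 1/2, 3/4, 7/8, ..., the birth times pile up below a finite (random) limit.
```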

Nate Eldredge
  • 97,710
1

The important thing to keep in mind is that the criterion for an explosive CTMC is that the number of transitions (in this case, births) is infinite within a finite time almost surely.

In that sense, we’re not really making any explicit statement about the population size, but rather about the number of transitions in the system—and only with probabilistic certainty at that!

To demonstrate this, here's an example of an explosive CTMC that always keeps a finite "population". Say you're evolving a random Markovian variable $Z(t) = \left(X(t),Y(t)\right)$, where $X(t) \in \{0,1\}$ denotes the presence of an individual in the system and $Y(t) \in \mathbb{Z}^*$ counts the number of previous transitions.

If we say that the only possible transitions are $(1,y) \to (0,y+1)$ and $(0,y) \to (1,y+1)$, each occurring at rate $\lambda y^2$, so that the holding time before the next transition has density

$$\lambda y^2 e^{-\lambda y^2 t},$$

and every other transition has rate $0$, then this chain is demonstrably explosive and will oscillate between $X = 0$ and $X = 1$ infinitely fast in finite time.
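Here is a minimal simulation sketch of this chain, with two choices not fixed above: $\lambda = 1$, and the counter started at $Y(0) = 1$ so that the first rate $\lambda Y(0)^2$ is positive.

```python
import random

random.seed(2)
lam = 1.0              # a convenient choice of lambda (not specified above)
x, y, t = 1, 1, 0.0    # X in {0, 1}; Y counts transitions, started at 1 so the rate is positive

for _ in range(10**5):
    t += random.expovariate(lam * y * y)  # Exp(lambda * y^2) holding time
    x, y = 1 - x, y + 1                   # flip X, increment the transition counter

print("X after 100000 transitions:", x)    # still 0 or 1 -- the "population" stays finite
print("total elapsed time:", round(t, 6))  # yet all those transitions fit in finite time
```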