
I have two Poisson processes. I have seen a mathematical proof that they are independent, and of course they must be independent since the proof appears in several textbooks. But intuitively I cannot understand why they are independent; to me it seems very natural that they should be dependent.

The processes are defined like this:

First we have a Poisson process $\{N(t), t \ge 0\}$ with parameter $\lambda$. We define two new processes as follows: each time an event occurs in the original process, we classify it as type 1 with probability $p$ and as type 2 with probability $1-p$. It can then be shown that:

$\{N_1(t),t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes, and it can be shown that their parameters are $\lambda p$ and $\lambda (1-p)$. But now comes the part I don't get: they are also independent!

I mean, let's say you have $p=0.5$ and you are at a point $t^*$ in time. Let's also say you are given that $N_2(t^*)=1000$. Why does this value of 1000 not change the probability $P(N_1(t^*) \ge 500)$? Isn't it natural to think that the more type 2 events have happened, the more likely it is that a lot of type 1 events have happened as well?
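
To make the puzzle concrete, here is a small simulation sketch (Python, with assumed numbers $\lambda = 2000$, $t^* = 1$, $p = 0.5$, so each thinned count has mean 1000 at $t^*$). Empirically, seeing unusually many type 2 events does not shift the distribution of $N_1(t^*)$ at all, which is exactly what I find counterintuitive:

```python
import numpy as np

# Monte Carlo sketch with assumed numbers (lambda = 2000, t* = 1, p = 0.5,
# so each thinned process has mean 1000 at time t*).  We thin one Poisson count
# and check whether seeing "many" type-2 events shifts the distribution of N1(t*).
rng = np.random.default_rng(0)
lam, t_star, p, runs = 2000.0, 1.0, 0.5, 200_000

n_total = rng.poisson(lam * t_star, size=runs)   # N(t*) in each run
n1 = rng.binomial(n_total, p)                    # type-1 events (the thinning)
n2 = n_total - n1                                # type-2 events

many_type2 = n2 >= 1050                          # runs with unusually many type-2 events
print("P(N1 >= 1000)              ~", np.mean(n1 >= 1000))
print("P(N1 >= 1000 | N2 >= 1050) ~", np.mean(n1[many_type2] >= 1000))
print("E[N1]                      ~", n1.mean())
print("E[N1 | N2 >= 1050]         ~", n1[many_type2].mean())
```

The conditional and unconditional estimates agree up to Monte Carlo noise, yet my intuition keeps telling me they shouldn't.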

user119615

3 Answers


Consider the following two scenarios:

1) You have two independent Poisson processes $N_1(t)$ and $N_2(t)$ with rates $\lambda p$ and $\lambda (1-p)$. $N(t) = N_1(t) + N_2(t)$ is their sum. This is a Poisson process with rate $\lambda$.

2) You have a Poisson process $N(t)$ with rate $\lambda$. Each time an event occurs, a coin is tossed: with probability $p$ the event is assigned type 1, otherwise it is assigned type 2. $N_1(t)$ and $N_2(t)$ are the numbers of type 1 and type 2 events occurring up to time $t$. These are Poisson processes of rates $\lambda p$ and $\lambda (1-p)$ respectively.

Do you see that the two scenarios are describing exactly the same situation?
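
If it helps, here is a quick simulation sketch of that claim (Python, with arbitrary assumed values $\lambda = 3$, $p = 0.3$, $t = 2$): the pair $(N_1(t), N_2(t))$ has the same joint distribution whether it is built as in scenario 1 or as in scenario 2.

```python
import numpy as np

# Simulation sketch of the two scenarios (assumed values: lambda = 3, p = 0.3, t = 2).
rng = np.random.default_rng(1)
lam, p, t, runs = 3.0, 0.3, 2.0, 500_000

# Scenario 1: two independent Poisson counts with rates lambda*p and lambda*(1-p).
s1_n1 = rng.poisson(lam * p * t, size=runs)
s1_n2 = rng.poisson(lam * (1 - p) * t, size=runs)

# Scenario 2: one Poisson count with rate lambda, each event marked type 1 with probability p.
n = rng.poisson(lam * t, size=runs)
s2_n1 = rng.binomial(n, p)
s2_n2 = n - s2_n1

# The joint distribution of (N1(t), N2(t)) agrees in both scenarios, up to Monte Carlo noise.
for k1, k2 in [(0, 0), (1, 2), (3, 1)]:
    p1 = np.mean((s1_n1 == k1) & (s1_n2 == k2))
    p2 = np.mean((s2_n1 == k1) & (s2_n2 == k2))
    print(f"P(N1={k1}, N2={k2}):  scenario 1 ~ {p1:.4f},  scenario 2 ~ {p2:.4f}")
```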

Robert Israel
  • Hello, I see that in both cases we get two Poisson processes, and that in 1) they are independent. But is it obvious that they are independent in 2)? I mean, if the coin gives type 1, then we know that at this moment type 2 did not happen. Why is this information not useful? – user119615 Mar 09 '14 at 21:56
  • In scenario (1), with probability $1$ the two processes never have an event at the same time. When an event of the combined process $N(t)$ does occur, it has probability $p$ of coming from process $1$ and $1-p$ of coming from process $2$. The processes in scenario 2 have exactly the same joint distribution as the processes in scenario 1. – Robert Israel Mar 10 '14 at 00:18

Hint: We get the independence result if we can show that $$\mathbb{E}\left[\mathrm e^{\mathrm i\left(aN_1(s)+bN_2(t)\right)} \right] = \mathbb{E}\left[\mathrm e^{\mathrm i aN_1(s)} \right]\mathbb{E}\left[\mathrm e^{\mathrm i bN_2(t)} \right], $$ for all $s\leq t$ and all $a,b$. Once the case $s=t$ is done, the case $s<t$ follows formally by conditioning and by the independent-increments property (of the Poisson processes themselves, parent and thinnings, in this case).

Case $s=t$ (I couldn't find a link to it) is based on the typical calculation: $$\mathbb{E}\left[\mathrm e^{\mathrm i\left(aN_1(t)+bN_2(t)\right)} \right]=\sum_{k=0}^{\infty} \mathbb{E}\left[\mathrm e^{\mathrm i\left(aN_1(t)+bN_2(t)\right)} | N(t)=k\right]P\left(N(t)=k\right)$$ $$ =\sum_{k=0}^{\infty} \mathbb{E}\left[\mathrm e^{\mathrm i(a-b)\sum_{i=1}^{k}Z_i}\right]e^{\mathrm i bk}\frac{(\lambda t)^k}{k!}\mathrm e^{-\lambda t}$$ $$= \sum_{k=0}^{\infty} \frac{\left(\left(1-p+p\mathrm e^{\mathrm i(a-b)}\right)\lambda te^{\mathrm i b}\right)^k}{k!}\mathrm e^{-\lambda t}$$ $$= \mathrm e^{\lambda t\left(\left(1-p+p\mathrm e^{\mathrm i(a-b)}\right)\mathrm e^{\mathrm i b}-1\right)}=\mathrm e^{p\lambda t\left(\mathrm e^{\mathrm i a}-1\right)}\mathrm e^{(1-p)\lambda t\left(\mathrm e^{\mathrm i b}-1\right)} =\mathbb{E}\left[\mathrm e^{\mathrm i aN_1(t)} \right]\mathbb{E}\left[\mathrm e^{\mathrm i bN_2(t)} \right],$$ where $Z_i$ are the iid $p$-Bernoulli variables defining the thinning: $$ N_1(t) = \sum_{i=1}^{N(t)} Z_i,$$ $$ N_2(t) = \sum_{i=1}^{N(t)} \left(1-Z_i\right).$$
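
As a quick numerical sanity check of the $s=t$ identity, here is a short Python sketch (the values $\lambda = 1.5$, $p = 0.4$, $t = 2$, $a = 0.7$, $b = -1.1$ are arbitrary assumptions): a Monte Carlo estimate of the left-hand side matches the product of the two Poisson characteristic functions on the right-hand side.

```python
import numpy as np

# Numerical spot check of the s = t identity (all parameter values are arbitrary assumptions).
rng = np.random.default_rng(2)
lam, p, t, a, b, runs = 1.5, 0.4, 2.0, 0.7, -1.1, 400_000

n = rng.poisson(lam * t, size=runs)   # N(t)
n1 = rng.binomial(n, p)               # N1(t), the p-thinning
n2 = n - n1                           # N2(t)

lhs = np.mean(np.exp(1j * (a * n1 + b * n2)))                 # Monte Carlo E[e^{i(a N1 + b N2)}]
rhs = (np.exp(p * lam * t * (np.exp(1j * a) - 1))
       * np.exp((1 - p) * lam * t * (np.exp(1j * b) - 1)))    # product of the two Poisson CFs
print("Monte Carlo LHS ~", lhs)
print("closed-form RHS =", rhs)
```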

ir7

The conditional probability of $N_1$ given $N_2$ is the joint probability of $N_1$ and $N_2$ divided by the probability of $N_2$. Since $N$ is Poisson with mean $\lambda t$, and given $N$ the count $N_2$ is binomial with success probability $1-p$, the marginal distribution of $N_2$ is Poisson with mean $(1-p)\lambda t$. The joint probability of $N_1$ and $N_2$ is the same as the joint probability of $N_1$ and $N=N_1+N_2$, or equivalently, the probability of $N=N_1+N_2$ multiplied by the conditional probability of $N_1$ given $N=N_1+N_2$, which is binomial with success probability $p$. With some algebra, you find

$$ P(N_1|N_2)=\frac{P_{Poi}(N_1+N_2;\lambda t)}{P_{Poi}(N_2;(1-p)\lambda t)}P_B(N_1;N_1+N_2,p)={P_{Poi}(N_1;p\lambda t)}=P(N_1) $$

showing that $N_1$ is independent of $N_2$. The same holds with the roles reversed, by swapping $N_1$ with $N_2$ and $p$ with $1-p$.
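
For anyone who wants to skip the algebra, here is a quick numerical check of the displayed identity using SciPy (the values of $\lambda$, $t$, $p$, $N_1$, $N_2$ below are arbitrary assumptions):

```python
from scipy.stats import binom, poisson

# Spot check of P(N1 | N2) = P(N1) for arbitrary assumed values.
lam, t, p = 3.0, 2.0, 0.4
n1, n2 = 5, 7                                  # an arbitrary pair of observed counts

lhs = (poisson.pmf(n1 + n2, lam * t)
       / poisson.pmf(n2, (1 - p) * lam * t)
       * binom.pmf(n1, n1 + n2, p))            # P(N1 | N2) as displayed above
rhs = poisson.pmf(n1, p * lam * t)             # P(N1)
print(lhs, rhs)                                # agree up to floating-point error
```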