The Poisson distribution is derived from the Binomial distribution by taking the limit as n, the number of trials, goes to infinity while demanding that the expected value np stays fixed (essentially shrinking the success probability like $\lambda/n$).
Intuitively, this means there is no fixed number of dichotomous (1 or 0) trials; the number of events that can occur is potentially infinite.
This page explains it nicely (cached version):
Denote the expected value of the binomial distribution, $E(X)=np$, by $\lambda$; then $p=\frac{\lambda}{n}$.
If you expand the binomial PMF with this substitution, you eventually arrive at:
$P(x) = \frac{n}{n} \cdot \frac{n-1}{n} \cdots \frac{n-x+1}{n} \cdot \frac{\lambda^x}{x!}\left( 1 - \frac{\lambda}{n} \right)^n \left( 1 - \frac{\lambda}{n} \right)^{-x}$
If you take the limit of this expression as n goes to $\infty$, the first x factors each tend to 1, and so does the last one; the second-to-last tends to $e^{-\lambda}$, and you are left with the Poisson distribution:
$\lim_{n \rightarrow \infty} P(x) = \frac{e^{-\lambda} \lambda^x}{x!}$
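As a quick sanity check (not part of the derivation itself), here is a minimal numerical sketch, assuming `scipy` is available, comparing the $\mathrm{Bin}(n, \lambda/n)$ PMF to the $\mathrm{Poisson}(\lambda)$ PMF as $n$ grows; the variable names and the choice $\lambda=3$ are illustrative only:

```python
# Numerical sketch: Binomial(n, lambda/n) PMF approaches Poisson(lambda) as n grows.
# Assumes scipy is installed; lambda = 3 and the k-range are arbitrary choices.
from scipy.stats import binom, poisson

lam = 3.0          # fixed expected value: n * p = lambda for every n
ks = range(10)     # values of k at which to compare the two PMFs

for n in (10, 100, 1000, 10000):
    p = lam / n    # success probability shrinks so that n * p stays at lambda
    max_gap = max(abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)) for k in ks)
    print(f"n={n:>6}: max |Binomial - Poisson| over k<10 = {max_gap:.2e}")
```

The printed gap should shrink toward 0 as n increases, mirroring the limit above.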
Edit: I'm not sure I understand the comments made to this answer. I'll try to write it more rigorously:
Suppose I have a sequence of random variables $X_n\sim \mathrm{Bin}(n, \frac{\lambda}{n})$. Under this setup the expected value $n \cdot \frac{\lambda}{n} = \lambda$ stays the same for every n. If I then look at the PMF of $X_n$ in the limit, I get:
$$ \lim_{n\to\infty} P_{X_n}(k) = \lim_{n\to\infty} {n \choose k}\left(\frac{\lambda}{n}\right)^k\left(1-\frac{\lambda}{n}\right)^{n-k} = \frac{\lambda^k}{k!} \lim_{n\to\infty} \frac{n!}{(n-k)!\,n^k} \left(1-\frac{\lambda}{n}\right)^{n} \left(1-\frac{\lambda}{n}\right)^{-k}
$$
The first and third factors go to 1, and the second goes to $e^{-\lambda}$, so we get exactly the Poisson PMF.
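Spelling the three factors out (the first is the same telescoping product as in the derivation above, with $k$ in place of $x$):

$$
\frac{n!}{(n-k)!\,n^k}=\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}\;\xrightarrow[n\to\infty]{}\;1,
\qquad
\left(1-\frac{\lambda}{n}\right)^{n}\xrightarrow[n\to\infty]{}e^{-\lambda},
\qquad
\left(1-\frac{\lambda}{n}\right)^{-k}\xrightarrow[n\to\infty]{}1.
$$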
Hope this clears it up for you guys.