H. M. Edwards's book is pretty good. I'm fairly sure you are referring to his remarks on p. 2.
He is making some general historical remarks about how we arrived at educated guesses for the correct statement of the Prime Number Theorem (PNT). If you think of $d\mu_{p}$ as the point measure which assigns mass $\frac{1}{p}$ to each prime $p$ and $0$ to everything else, then you can say that
$$\sum_{p \le x}\frac{1}{p} = \int_{1}^{x}\,d\mu_{p}.$$
This integral is like integrating the density $d\mu_{p}$ over the volume $[1,x]$ to get the total mass $\sum_{p \le x}\frac{1}{p}$. The problem is that we don't have a nice expression for $d\mu_{p}$ in terms of $x$! How sad...
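If you want to see this total-mass picture concretely, here is a minimal Python sketch (assuming `sympy` is available for generating primes) that just adds up the point masses $\frac{1}{p}$ for $p \le x$:

```python
# Minimal sketch: the "integral" of the point measure over [1, x] is just the
# finite sum of 1/p over the primes p <= x.  Requires sympy for primerange.
from sympy import primerange

def prime_reciprocal_sum(x):
    """Total mass that d(mu_p) assigns to [1, x], i.e. the sum of 1/p for primes p <= x."""
    return sum(1 / p for p in primerange(2, x + 1))

print(prime_reciprocal_sum(10**5))  # about 2.7 -- the sum grows very slowly
```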
But we can guess what it might be. The fact that (2') says
$$\sum_{p \le x}\frac{1}{p} \sim \log\log x$$
means that the ratio of the two sides tends to $1$ for large $x$, so that the relative error between $\sum_{p \le x}\frac{1}{p}$ and $\log\log x$ is small. But as Edwards notes,
$$\log\log x = \int_{1}^{\log x}\frac{du}{u} = \int_{e}^{x}\frac{1}{v}\frac{dv}{\log v},$$
(the second equality is just the substitution $u = \log v$), and putting these together we get
$$\sum_{p \le x}\frac{1}{p} \sim \int_{e}^{x}\frac{1}{v}\frac{dv}{\log v}.$$
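As a rough check on this "$\sim$", here is a small numerical comparison (just a sketch, assuming `sympy` and `scipy` are installed); the ratio of the two sides does drift toward $1$, though very slowly:

```python
# Compare the sum of 1/p over primes p <= x with the integral of dv/(v log v)
# from e to x.  The integral equals log(log x) (substitute u = log v), and the
# ratio of the two quantities slowly approaches 1 as x grows.
from math import e, log
from sympy import primerange
from scipy.integrate import quad

def prime_side(x):
    return sum(1 / p for p in primerange(2, x + 1))

def integral_side(x):
    value, _ = quad(lambda v: 1 / (v * log(v)), e, x)
    return value

for x in (10**3, 10**4, 10**5, 10**6):
    print(x, prime_side(x), integral_side(x), prime_side(x) / integral_side(x))
```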
The integral on the right-hand side is again sort of like integrating the density over the volume. Our volume is the interval $[e,x]$ (for large $x$), and the "$\sim$" means that the two masses on either side are approximately the same. In this case, though, we have weighted both sides: $\frac{1}{p}$ on the LHS and $\frac{1}{v}$ on the RHS. If you clear the weights (this isn't formally allowed) you get
$$\pi(x) = \sum_{p \le x}1 \sim \int_{e}^{x}\frac{1}{\log{v}}\,dv,$$
so that the density of the primes should be something like $\frac{1}{\log{x}}$ in this interval. This is good because it's in terms of $x$! Of course, this turns out to be true (this is PNT), but it takes quite a bit of work to prove. In addition, it's worth noting that the logarithmic integral
$$Li(x) = \int_{2}^{x}\frac{1}{\log t}\,dt,$$
is an extremely good approximation to the prime counting function $\pi(x)$. It's a result of de la Vallée Poussin that
$$\pi(x)-Li(x) = O(xe^{-a\sqrt{\log x}}),$$
for some constant $a$ so that these educated guesses do turn out to be the truth.
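Here is a quick way to see numerically how good that approximation is (again just a sketch, assuming `sympy` and `scipy` are installed):

```python
# Compare pi(x) with Li(x) = integral of dt/log(t) from 2 to x.
# The absolute error stays tiny relative to pi(x) itself.
from math import log
from sympy import primepi
from scipy.integrate import quad

def Li(x):
    value, _ = quad(lambda t: 1 / log(t), 2, x)
    return value

for x in (10**4, 10**5, 10**6):
    pi_x = int(primepi(x))
    print(x, pi_x, round(Li(x), 1), round(Li(x) - pi_x, 1))
```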