The quoted recommendation is generally considered obsolete in the context of RSA with secure parameters, and is either disregarded or replaced by asking that $\left|p-q\right|>2^{(n/2)-100}$, where $n$ is the number of binary digits of $N=pq$. This modern rule was in ANSI X9.31 (1998), and is still in FIPS 186-4 (2013), appendix B.3, criterion 2(d).
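For concreteness, that modern criterion is cheap to check at key-generation time. A minimal sketch (function name and interface are mine, not from either standard):

```python
def satisfies_distance_criterion(p: int, q: int, n: int) -> bool:
    """FIPS 186-4, appendix B.3, criterion 2(d): require |p - q| > 2^(n/2 - 100),
    where n is the bit length of the modulus N = p*q (assumed even here)."""
    return abs(p - q) > 2 ** (n // 2 - 100)
```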
This quoted recommendation, and its modern counterpart, are intended to make $p$ and $q$ different enough to guard against Fermat factorization and its improvements. The basic Fermat factorization technique enumerates integers $b$ from $0$ onward, stopping when $N+b^2$ is a square $a^2$, which reveals $p$ and $q$ as $a+b$ and $a-b$ (in some order). Numerous improvements exist that allow skipping some values of $b$ and reduce the cost of the squareness test.
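A minimal sketch of that basic method, assuming $N$ is odd and the product of two primes (my own code, for illustration only):

```python
from math import isqrt

def fermat_factor(N: int):
    """Basic Fermat factorization as described above: try b = 0, 1, 2, ...
    until N + b^2 is a perfect square a^2; then N = (a + b)(a - b).
    Takes about |p - q|/2 iterations for N = p*q."""
    b = 0
    while True:
        s = N + b * b
        a = isqrt(s)
        if a * a == s:
            return a + b, a - b
        b += 1
```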
However, with $p$ and $q$ of equal size and $n\ge512$, but $p$ and $q$ otherwise mostly random, all known improvements of Fermat factoring have utterly negligible odds of succeeding: the number of $b$ to enumerate by the basic Fermat factoring method is $\left|p-q\right|/2$, which exceeds $2^{212}$ with odds better than $1-2^{-40}$, and none of the known improvements lowers those $2^{212}$ steps to something workable. Notice that $n=512$ is so small that it provides only very limited security (see this), and that the pointlessness of trying Fermat factorization (compared to other factorization techniques) only grows as $n$ grows.
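As a back-of-envelope check on those numbers (my own model, not part of the original argument): for $n=512$, treat $p$ and $q$ as independent and roughly uniform over the 256-bit range $[2^{255},2^{256})$, an interval of length $2^{255}$. Then
$$\Pr\bigl[\left|p-q\right|<2^{213}\bigr]\;\le\;\frac{2\cdot2^{213}}{2^{255}}\;=\;2^{-41},$$
so $\left|p-q\right|/2>2^{212}$ except with probability at most about $2^{-41}$, consistent with the $1-2^{-40}$ figure above.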
The quoted recommendation only makes sense for toy examples of RSA, say when $p$ and $q$ are both about 9-10 decimal digits (thus $n\approx64$). In that case, "a few digits" can safely be taken as 2 digits, or even (in decimal) reduced to a single digit, as long as the leading digit of the larger prime is not 1.
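To illustrate at toy scale with the sketch above (primes chosen by me, both well-known 10-digit primes):

```python
# Two nearby 10-digit primes: Fermat factoring succeeds after just two values of b.
p, q = 1000000007, 1000000009
print(fermat_factor(p * q))   # -> (1000000009, 1000000007)

# If instead q were, say, 2 decimal digits shorter than p, |p - q|/2 would be on
# the order of 5*10^8 iterations: hopeless for the by-hand attacks that toy RSA
# guards against, though any modern computer factors a ~64-bit modulus instantly anyway.
```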
Rules in ANSI X9.31 and FIPS 186 prescribe (in the interest of interoperability of devices using RSA private keys they did not generate) that $p$ and $q$ are primes in the range $[2^{(n-1)/2},\,2^{n/2}]$. This implies $p$ and $q$ have the same number $\lceil n/2\rceil$ of binary digits (and that their numbers of decimal digits differ by at most one, if at all). This goes straight against the wording of the quoted recommendation.

According to my memories of Robert Silverman's account, the requirement that $\left|p-q\right|>2^{(n/2)-100}$ (for $n\ge1024$) was introduced, although technically not needed, to please the ANSI X9.31 standard's committee, sponsored by bankers, who wanted a simple rebuttal to a court argument along the lines of: "I claim that my client did not produce that signature! One simple explanation is that the modulus has been factored. An expert has testified that Fermat factoring, known since the 17th century, could potentially do that and allow such a forged signature. No precaution against it was taken! Whoever carelessly specified that signature system must bear the consequences!"
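Concretely, the X9.31/FIPS size constraint above amounts to the following check (helper of my own, not taken from either standard):

```python
from math import isqrt

def in_x931_prime_range(p: int, n: int) -> bool:
    """X9.31 / FIPS 186 size constraint on each prime factor of an n-bit modulus
    (n even): 2^((n-1)/2) <= p < 2^(n/2).  This forces p.bit_length() == n // 2,
    and also guarantees that N = p*q really has n bits."""
    # Smallest integer above sqrt(2) * 2^(n/2 - 1) = 2^((n-1)/2), irrational for even n:
    lo = isqrt(2 ** (n - 1)) + 1
    return lo <= p < 2 ** (n // 2)
```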
$p - q \approx p - p/1000 \approx 0.999\,p \approx p \approx \sqrt{N} \gg N^{1/4}$. Hence Fermat factorization based on $p$ being near $q$ won't work when $p$ and $q$ differ in length. That said, just randomly picking two $n/2$-bit primes will with high probability result in them being $O(\sqrt{N})$ apart. – dr jimbob May 05 '16 at 04:13
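A quick numeric illustration of that estimate (toy, non-prime magnitudes of my own choosing, just to compare orders of magnitude):

```python
from math import isqrt

# q about three decimal digits shorter than p, as in the comment's p - p/1000 step.
p = 10 ** 15
q = 10 ** 12
N = p * q
print(p - q)            # 999000000000000 ~ 1.0e15
print(isqrt(N))         # 31622776601683  ~ 3.2e13  (within a modest factor of p - q)
print(isqrt(isqrt(N)))  # 5623413         ~ 5.6e6   (N^(1/4), vastly smaller than p - q)
```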