Since the success of the $p - 1$ algorithm depends on $p - 1$ having only "small" prime factors, or at least none larger than a reasonable smoothness bound, and Williams' $p + 1$ method has the same constraint for $p + 1$, it seems to me that it would be extremely unlikely for both $p - 1$ and $p + 1$ to be non-smooth. So if we were to construct, say, a tough 500-bit (RSA-type) semiprime such as
$$n = pq = 2218295966162666629041316944231086113195058068148716022472522769336934503490317077398434252307822130896437753111872862038992461352826985916137012449841,$$
where $p$ and $q$ are both "safe" primes to guarantee worst-case performance for the $p - 1$ algorithm (that is, $(p - 1) / 2$ and $(q - 1) / 2$ are both Sophie Germain primes), would it make sense to attempt the $p + 1$ algorithm after $p - 1$ had bailed out? In other words, is there anything to be gained by trying that before ECM or bringing out the "big guns" of QS or NFS?
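For reference, here is roughly what I have in mind for the two stage-1 routines (just an illustrative Python sketch; the bound `B`, the seed, and the helper names are arbitrary choices of mine, not any particular library's API). Pollard's $p - 1$ raises a base to every prime power up to $B$ and takes a gcd; Williams' $p + 1$ does the analogous index walk in a Lucas sequence $V_k$:

```python
from math import gcd

def small_primes(bound):
    """Primes up to `bound` via a simple sieve of Eratosthenes."""
    sieve = bytearray(b"\x01") * (bound + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(bound ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = b"\x00" * len(range(i * i, bound + 1, i))
    return [i for i in range(2, bound + 1) if sieve[i]]

def pollard_p_minus_1(n, B=100_000):
    """Stage-1 Pollard p-1: finds p | n when p - 1 is B-power-smooth."""
    a = 2
    for q in small_primes(B):
        e = 1
        while q ** (e + 1) <= B:    # largest prime power q^e <= B
            e += 1
        a = pow(a, q ** e, n)
    g = gcd(a - 1, n)
    return g if 1 < g < n else None

def lucas_V(k, x, n):
    """V_k of the Lucas sequence with P = x, Q = 1, computed mod n by a ladder."""
    v0, v1 = 2 % n, x % n           # (V_0, V_1)
    for bit in bin(k)[2:]:
        if bit == "1":
            v0, v1 = (v0 * v1 - x) % n, (v1 * v1 - 2) % n
        else:
            v0, v1 = (v0 * v0 - 2) % n, (v0 * v1 - x) % n
    return v0

def williams_p_plus_1(n, B=100_000, seed=7):
    """Stage-1 Williams p+1: finds p | n when p + 1 is B-power-smooth,
    provided seed^2 - 4 is a quadratic non-residue mod p (otherwise it
    degenerates into a p - 1 test for that prime)."""
    v = seed % n
    for q in small_primes(B):
        e = 1
        while q ** (e + 1) <= B:
            e += 1
        v = lucas_V(q ** e, v, n)   # V_{mk}(x) = V_k(V_m(x)) multiplies the index
    g = gcd(v - 2, n)
    return g if 1 < g < n else None

# Toy check: n = 101 * 103; q - 1 = 102 = 2 * 3 * 17 is 17-smooth,
# so stage 1 with B = 20 already pulls out 103.
print(pollard_p_minus_1(101 * 103, B=20))   # -> 103
```

The $p + 1$ ladder costs only a small constant factor more per bit than the modular exponentiation inside $p - 1$, which is part of why I'm wondering whether it's a cheap thing to throw at $n$ before escalating.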
Oh, and I also generated the above semiprime with $p/q$ large enough to thwart factoring via Fermat's difference-of-squares method:
$$\left|p - \sqrt{n}\,\right| > \sqrt[4]{4n}.$$
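For what it's worth, here is the bare-bones Fermat search that condition is guarding against (again only a sketch; `max_steps` is an arbitrary cap of mine). Starting from $a = \lceil\sqrt{n}\,\rceil$, it succeeds exactly when $a$ reaches $(p + q)/2$, so the work grows with the distance of $p$ and $q$ from $\sqrt{n}$:

```python
from math import isqrt

def fermat_factor(n, max_steps=1_000_000):
    """Fermat's difference of squares: look for a with a^2 - n a perfect
    square, so n = (a - b)(a + b). Fast only when p and q are near sqrt(n)."""
    a = isqrt(n)
    if a * a < n:
        a += 1                      # a = ceil(sqrt(n))
    for _ in range(max_steps):
        b2 = a * a - n
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b     # hit a = (p + q) / 2
        a += 1
    return None                     # gave up: factors too far from sqrt(n)

print(fermat_factor(101 * 103))     # -> (101, 103), immediately, since 101 and 103 straddle sqrt(n)
```

Since the number of increments needed for a semiprime is $(p + q)/2 - \lceil\sqrt{n}\,\rceil$, the gap condition above pushes that count far beyond anything feasible.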