Whose pages are you scraping, and for what reason?
You're talking about roughly thirty page requests per second, or 2 MB/s. You say it's not saturating the bandwidth of "the" server - is that your server or theirs? Saturating the bandwidth of somebody else's server is generally considered to be a denial of service attack.
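For a sense of scale, here's a quick back-of-envelope calculation (assuming the thirty requests per second and 2 MB/s you quoted are sustained averages; those two figures come from your question, the rest is just arithmetic):

```python
# Rough scale of ~30 requests/second at ~2 MB/s, sustained around the clock.
REQS_PER_SEC = 30
BYTES_PER_SEC = 2 * 1024 * 1024  # 2 MB/s

avg_page_kb = BYTES_PER_SEC / REQS_PER_SEC / 1024
daily_requests = REQS_PER_SEC * 86_400
daily_transfer_gb = BYTES_PER_SEC * 86_400 / 1024 ** 3

print(f"~{avg_page_kb:.0f} KB per page")        # ~68 KB
print(f"{daily_requests:,} requests per day")   # 2,592,000
print(f"~{daily_transfer_gb:.0f} GB per day")   # ~169 GB
```

That's millions of requests and on the order of 170 GB of someone's transfer every single day.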
Two megabytes a second is a lot of somebody else's bandwidth to be using. I couldn't do that from home; my broadband connection simply isn't that fast. Some web hosting plans also have bandwidth caps, so sustained traffic at that rate could knock somebody's site offline until their allowance resets.
If the pages you're scraping belong to somebody big enough not to mind this, check to see if they've got an API you can use instead. If they don't, consider cutting back so you're not perceived as an attacker.
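If you do keep fetching the pages directly, at least space the requests out. Here's a minimal throttling sketch, assuming Python and the third-party requests library; the URLs, the contact address in the User-Agent, and the one-request-per-second figure are placeholders, not anything the site in question has agreed to:

```python
# Minimal polite scraper: fixed delay between requests instead of 30/second.
import time
import requests

MIN_INTERVAL = 1.0  # seconds between requests -- pick a rate the site owner won't mind

def polite_get(url, session, last_request_time):
    """Fetch a URL, sleeping first so requests stay at most one per MIN_INTERVAL."""
    elapsed = time.monotonic() - last_request_time
    if elapsed < MIN_INTERVAL:
        time.sleep(MIN_INTERVAL - elapsed)
    response = session.get(
        url,
        headers={"User-Agent": "my-scraper (contact: me@example.com)"},  # placeholder contact
    )
    return response, time.monotonic()

if __name__ == "__main__":
    urls = ["https://example.com/page1", "https://example.com/page2"]  # placeholders
    session = requests.Session()
    last = 0.0
    for url in urls:
        response, last = polite_get(url, session, last)
        print(url, response.status_code, len(response.content))
```

A fixed sleep is crude, but it's the simplest way to drop from thirty requests a second to something a site owner is unlikely to notice; anything fancier (token buckets, honouring robots.txt crawl delays, backing off on errors) is a refinement of the same idea.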