I once implemented this for MMOs.
One weekend, an attempt to install WOW for my daughter took 48 elapsed hours (patches, download errors, etc.), so I decided to build a better solution of my own.
A game like that typically needs around 10 GB of data before it will run. Not all of those files are actually needed right away, but games used to wait until every file was present locally. My solution was to let the game run locally as normal, while the EXE was tricked (using a Windows file minifilter) into thinking all files were already present. When a requested file was not local, the filter downloaded it on demand and saved it, so the game was gradually copied to disk as it was used. Whenever there was spare bandwidth, a background loader trickled in the files that were not yet needed.
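Here is a minimal user-mode sketch of that idea in Python (the real thing lived in a kernel minifilter driver, not user space; the class, the manifest format, and the `fetch_from_cdn` callback are hypothetical names I'm using for illustration):

```python
import os
import queue
import threading

class OnDemandLoader:
    """Sketch: serve file opens locally, fetch missing files on first access,
    and trickle the rest in on a background thread with spare bandwidth."""

    def __init__(self, install_dir, manifest, fetch_from_cdn):
        self.install_dir = install_dir        # local game folder
        self.fetch_from_cdn = fetch_from_cdn  # hypothetical downloader callback
        self.background_queue = queue.Queue()
        for relpath in manifest:              # every file in the full install
            self.background_queue.put(relpath)
        threading.Thread(target=self._trickle, daemon=True).start()

    def open(self, relpath):
        """Called for every file the game opens (the minifilter's job)."""
        local = os.path.join(self.install_dir, relpath)
        if not os.path.exists(local):
            self._fetch(relpath)              # blocking fetch on a cache miss
        return open(local, "rb")

    def _fetch(self, relpath):
        local = os.path.join(self.install_dir, relpath)
        os.makedirs(os.path.dirname(local), exist_ok=True)
        data = self.fetch_from_cdn(relpath)
        with open(local, "wb") as f:
            f.write(data)

    def _trickle(self):
        """Background loader: pull down not-yet-needed files as bandwidth allows."""
        while True:
            relpath = self.background_queue.get()
            if not os.path.exists(os.path.join(self.install_dir, relpath)):
                self._fetch(relpath)
```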
This worked with ALL games, with no recompilation needed, because my minifilter driver fetched files whenever they were requested.
The worst drawback was latency. My solution was to build a small Markov chain model that predicted which file was likely to be needed next, and to prioritize the background loader accordingly. This worked like a charm, and our MMO was able to run almost IMMEDIATELY after just the EXE and a few loading screen files were local (~20 MB). We were able to click a link on a webpage and run our 10+ GB game in about 30 seconds. We had a 99% hit rate, meaning that when the game needed a file for the first time, it was usually already there!
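Roughly, a first-order Markov predictor for this looks like the sketch below: count how often file B was opened right after file A in recorded play sessions, then whenever the game opens a file, bump its most likely successors to the front of the background download queue. (This is my own minimal illustration, trained on hypothetical access traces; it is not the original model.)

```python
from collections import defaultdict

class MarkovPrefetcher:
    """First-order Markov model over file accesses: track which file tends
    to follow which, and prefetch the most likely successors first."""

    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def record_access(self, path):
        """Train on an observed access stream (e.g. from play-testing runs)."""
        if self.prev is not None:
            self.transitions[self.prev][path] += 1
        self.prev = path

    def predict_next(self, path, k=3):
        """Return up to k files most likely to be requested after `path`."""
        followers = self.transitions.get(path, {})
        ranked = sorted(followers.items(), key=lambda kv: kv[1], reverse=True)
        return [p for p, _ in ranked[:k]]

# Usage idea: on every file the game actually opens, push its predicted
# successors to the front of the background loader's queue so they arrive
# before the game asks for them. (prefetch_soon is a hypothetical hook.)
#
#   for nxt in prefetcher.predict_next(just_opened_path):
#       background_loader.prefetch_soon(nxt)
```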
I'd be happy to help anyone else implement this.