
The sheer size of this constantly changing data file interferes with my backup concept, so I would like to ask whether there is a way to split the file into many smaller pieces. My problem is not the size as such, but the amount of change.

The background: I do my backups with rsnapshot, which hard-links files that haven't changed on the target, while changed files are copied in full.
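To illustrate why an ever-growing file defeats this, here is a minimal sketch of the hard-link trick rsnapshot relies on (the directory and file names are made up; rsnapshot manages its snapshot directories itself):

```python
# Demonstrate hard-link deduplication between two snapshot directories.
import os
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "backup.0"))
os.makedirs(os.path.join(root, "backup.1"))
old = os.path.join(root, "backup.0", "blk0001.dat")
new = os.path.join(root, "backup.1", "blk0001.dat")
with open(old, "wb") as f:
    f.write(b"unchanged block data")

# An unchanged file in the next snapshot becomes a hard link,
# so both snapshots share a single copy on disk. A file that
# changed even by one byte would be copied in full instead.
os.link(old, new)
```

Both names now point at the same inode, so the data is stored only once; a 763 MB file that changes every day is copied in full every day.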

That's why I would like to split the currently roughly 763 MB blk0001.dat into two files:

  • blk0001.dat having maybe 700 MB, and not changing (frequently) in the future.
  • blk0002.dat holding the rest.

As the second file grows steadily, I would, once it reaches a certain size, move more of its data into blk0001.dat, eventually stop changing that file at all, and start a third one.

Has anyone done this already?

glglgl

3 Answers


This is actually already implemented: blk0001.dat contains the first two gigabytes of blockchain data, blk0002.dat contains the second two gigabytes, and so on. We just haven't hit two gigabytes of data yet.
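To make that scheme concrete, here is a tiny sketch of the two-gigabyte chunking (the helper function and its name are made up for illustration; the client's actual code differs):

```python
# Each blkNNNN.dat file holds up to two gigabytes of blockchain data;
# once a file fills up, the client moves on to the next one.
CHUNK = 2 * 1024 ** 3  # 2 GiB per block file

def blk_filename(byte_offset):
    """Which blkNNNN.dat file a given byte offset of the chain lands in."""
    return "blk%04d.dat" % (byte_offset // CHUNK + 1)
```

Under this scheme, a finished blk0001.dat stops changing and hard-links nicely across snapshots; only the last, still-growing file gets recopied.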

Pieter Wuille
  • one gigabyte is a lot of data... – glglgl Nov 22 '11 at 17:49
    @glglgl You can request a feature for the Main Client to offer variable size of .dat dumps here: https://github.com/bitcoin/bitcoin , although I don't think it will be a high priority, unless it turns out that a lot of other people want the same. – ThePiachu Nov 22 '11 at 21:34

If you want custom file handling, you'll need to alter the client code, or write an application that reads the dat file and splits it accordingly. I haven't heard of any existing offline tools for alternative handling of these files.
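Such an offline splitter could look roughly like this (a hypothetical sketch: the chunk size and output naming are assumptions, not anything the client does; you'd also need to rejoin the pieces before the client could use them again):

```python
# Split a large dat file into fixed-size chunks so that finished
# chunks never change again and back up well with rsnapshot.
import os

def split_file(src, dst_dir, chunk_size=700 * 1024 * 1024):
    """Write src into dst_dir/blkNNNN.dat chunks; return the chunk count."""
    index = 0
    with open(src, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            index += 1
            out_path = os.path.join(dst_dir, "blk%04d.dat" % index)
            with open(out_path, "wb") as out:
                out.write(chunk)
    return index
```

All chunks except the last are exactly chunk_size bytes, so as long as the source file only grows by appending, rerunning the splitter leaves the earlier chunks byte-identical.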

ThePiachu

In https://github.com/bitcoin/bitcoin/issues/1400#issuecomment-6028443, Gregory Maxwell explains that blk0001.dat will grow to closer to 2 GB of blockchain data. He also explains that a gambling site and a miner with an inefficient payout system are responsible for the fast growth of the file.

Dave Scotese