My theory
take a file (example size: 3 GB)
convert said file to a number
divide that number by a large prime number, for example the largest known prime 2^82,589,933 - 1 (discovered 2018 Dec 07)
get the quotient and the remainder
if the remainder is still large, divide it again by the next prime to get another quotient and remainder, repeating until the remainder is as small as preferred
build the compressed data (a sketch of the whole scheme follows below):

FFFF - first division quotient
EEEE - second division quotient
....
LLLL - final remainder length (hex)
RRRR - final remainder
ZZZZ - file name

Example: 0000ABC60000EF53....000468438FB1412e62696e
Layout:  FFFFFFFFEEEEEEEE....LLLLRRRRRRRRZZZZZZZZZZ
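Here is a rough sketch of what I mean, in Python for its built-in big integers. Tiny real Mersenne primes stand in for the 82-million-bit ones so it runs instantly, I'm reading "do the math again" as dividing the remainder by the next prime, and the compress/decompress names are just for illustration:

```python
# Real Mersenne prime exponents, largest first; the real scheme would
# use the 20 largest known exponents (82,589,933 and so on down).
MERSENNE_EXPONENTS = [1279, 607, 521, 127, 107]
PRIMES = [(1 << e) - 1 for e in MERSENNE_EXPONENTS]  # p = 2^e - 1

def compress(data: bytes):
    """Convert the file to a number, then divide by each prime in turn,
    keeping the quotient and carrying the remainder into the next step."""
    n = int.from_bytes(data, "big")
    quotients = []
    for p in PRIMES:
        q, n = divmod(n, p)    # n becomes the remainder for the next division
        quotients.append(q)
    return quotients, n        # all the quotients, plus the final remainder

def decompress(quotients, remainder, length):
    """Rebuild the number from the inside out (n = q * p + r),
    then turn it back into bytes."""
    n = remainder
    for q, p in zip(reversed(quotients), reversed(PRIMES)):
        n = q * p + n
    return n.to_bytes(length, "big")

data = b"any file contents would do here" * 64
quotients, remainder = compress(data)
assert decompress(quotients, remainder, len(data)) == data  # lossless round trip
print([q.bit_length() for q in quotients], remainder.bit_length())
```

The print at the end shows how the original file's bits are distributed across the stored pieces; almost all of them end up in the first quotient, which is what the fixed 4-byte fields in the layout above would have to hold.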
Because there are only 51 confirmed Mersenne primes (as of 2018 Dec 07), and we would only need the largest 20 of them to shrink a file to the maximum useful size, we would only need 80 bytes plus the remainder and the file name.
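A quick check of those numbers, using the layout above (each quotient field in the example is 8 hex digits, i.e. 4 bytes), plus a decode of the example's ZZZZ tail:

```python
print(20 * 4)                                       # 20 division fields * 4 bytes -> 80 bytes
print(bytes.fromhex("412e62696e").decode("ascii"))  # ZZZZ field from the example -> 'A.bin'
```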
For a 108 PB file (by my calculations), this could bring the file size down to a few kilobytes at most.
Does this even sound feasible?