Frank Wessels
That would be great 👍
Thanks for the bug report, I will look into that. Rolling hash deduplication is not supported at the moment; here is a pointer to a package (https://github.com/restic/chunker) that should not be...
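For illustration, here is a minimal sketch of content-defined chunking with the restic/chunker package; the polynomial, buffer size, and input file name are just placeholders, not anything s3git actually uses:

```
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"os"

	"github.com/restic/chunker"
)

func main() {
	f, err := os.Open("testdata.bin") // placeholder input file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// A fixed polynomial keeps chunk boundaries stable across runs;
	// chunker.RandomPolynomial() can generate a fresh one instead.
	pol := chunker.Pol(0x3DA3358B4DC173)

	chnkr := chunker.New(f, pol)
	buf := make([]byte, chunker.MaxSize)

	for {
		chunk, err := chnkr.Next(buf)
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
		// Identical chunks produce identical digests, which is what makes
		// rolling-hash deduplication possible.
		fmt.Printf("offset %8d  length %7d  sha256 %x\n",
			chunk.Start, chunk.Length, sha256.Sum256(chunk.Data))
	}
}
```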
Below are the results of a mock-up test that prepends data at the start of the input in order to see the effect on the BLAKE2 Tree mode: ``` franks-mbp:rolling-hash frankw$ ./rolling-hash <...
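For illustration, a minimal, simplified sketch of the effect such a test measures: hashing fixed-size chunks (plain BLAKE2b-256 here, not the actual tree mode or the rolling-hash tool used above) before and after prepending a few bytes. The data and chunk size are made up:

```
package main

import (
	"fmt"
	"math/rand"

	"golang.org/x/crypto/blake2b"
)

// hashFixedChunks splits data into fixed-size chunks and returns the
// BLAKE2b-256 digest of each chunk.
func hashFixedChunks(data []byte, size int) [][32]byte {
	var sums [][32]byte
	for len(data) > 0 {
		n := size
		if n > len(data) {
			n = len(data)
		}
		sums = append(sums, blake2b.Sum256(data[:n]))
		data = data[n:]
	}
	return sums
}

func main() {
	// Deterministic pseudo-random "file" contents.
	original := make([]byte, 1<<20)
	rand.New(rand.NewSource(1)).Read(original)

	// Prepend a small amount of data at the front.
	prepended := append([]byte("a few extra bytes"), original...)

	const chunkSize = 64 * 1024 // arbitrary fixed chunk size for the illustration

	a := hashFixedChunks(original, chunkSize)
	b := hashFixedChunks(prepended, chunkSize)

	same := 0
	for i := range a {
		if i < len(b) && a[i] == b[i] {
			same++
		}
	}
	// With fixed-size chunks every boundary shifts, so essentially none of the
	// chunk digests match; content-defined (rolling hash) chunking would
	// re-find the old boundaries instead.
	fmt.Printf("%d of %d fixed-size chunk hashes unchanged after prepending\n", same, len(a))
}
```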
@vsivsi Great to hear that you found something that fits your needs. Restic is a nice project that is actively being developed; please give our regards to @fd0.
Great idea, we'll do that. Do you have any pointers on how best to do this?
Thanks, we will look into it
Yes, s3git can create really large repos. I think what you may be looking for is the following: it is possible to do an `s3git snapshot checkout --dedupe` (after cloning...
So you mean a filter of some sort for 'sparse' checkouts, such as described in http://jasonkarns.com/blog/subdirectory-checkouts-with-git-sparse-checkout/? Something like this shouldn't be too difficult. Note that there is also...
That is a good point; there are no plans at the moment to add compression after deduping the data into chunks. I guess that for some content such as text...
@klauspost thanks! Do you mean you store the content as 'stored', as in (I believe) mode '0' of the zip format? And you detect whether or not to compress...
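For illustration, a minimal sketch of that kind of 'store vs. compress' detection (not klauspost's actual code or s3git's storage format; the 5% threshold is an arbitrary choice):

```
package main

import (
	"bytes"
	"compress/flate"
	"fmt"
	"math/rand"
)

// Storage methods mirroring the zip spec: 0 = stored (raw), 8 = deflated.
const (
	methodStored   = 0
	methodDeflated = 8
)

// encodeChunk deflates a chunk and keeps the compressed form only if it is
// meaningfully smaller than the original; otherwise the chunk is kept as-is
// ('stored'). The 5% threshold is an illustrative assumption.
func encodeChunk(chunk []byte) (method int, data []byte, err error) {
	var buf bytes.Buffer
	w, err := flate.NewWriter(&buf, flate.DefaultCompression)
	if err != nil {
		return 0, nil, err
	}
	if _, err := w.Write(chunk); err != nil {
		return 0, nil, err
	}
	if err := w.Close(); err != nil {
		return 0, nil, err
	}
	if buf.Len() >= len(chunk)*95/100 {
		return methodStored, chunk, nil // compression did not pay off
	}
	return methodDeflated, buf.Bytes(), nil
}

func main() {
	text := bytes.Repeat([]byte("highly compressible text "), 1000)
	random := make([]byte, 25000)
	rand.New(rand.NewSource(2)).Read(random) // incompressible pseudo-random data

	for name, chunk := range map[string][]byte{"text": text, "random": random} {
		method, data, err := encodeChunk(chunk)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%-6s method=%d  %d -> %d bytes\n", name, method, len(chunk), len(data))
	}
}
```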