s4cmd
Sync command re-downloads existing files
Hey everyone!
s4cmd looks really cool, and I love how fast the s4cmd du command scales with huge buckets.
Well done!
I did notice that s4cmd sync s3://... /local/directory re-downloads files that are already present in the local destination directory. Is there any way to prevent that?
Thanks, Elad
The first run took 6 hours on a large bucket. Is there any tool that will only download files that differ in size and/or timestamp?
Yes! awscli works that way. I wrote a blog post about it: https://eladnava.com/backing-up-your-amazon-s3-buckets-to-ec2/
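For reference, a plain awscli sync only copies objects whose size or last-modified time differs from the local copy, so repeat runs skip files that are already present. The bucket name and local path below are placeholders:

aws s3 sync s3://your-bucket /local/directory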
I believe this is fixed; the current documentation provides a flag for skipping files that are already in sync:
-s/--sync-check: check md5 hash to avoid syncing the same content.
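As a rough sketch (the bucket name and local path are placeholders, not taken from this thread), a sync run that skips files with a matching MD5 should look something like:

s4cmd sync -s s3://your-bucket /local/directory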
This can probably be closed.