s3cmd
Autoset chunk size if file is bigger than 150000 MB
While trying to put a 900 GB file, I got this error after two hours of MD5 checking:

ERROR: Parameter problem: Chunk size 15 MB results in more than 10000 chunks. Please increase --multipart-chunk-size-mb

Maybe s3cmd could check the file size before computing the MD5 checksum?
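For reference, here is a rough sketch of the workaround I would expect (the bucket and file names are placeholders, but `--multipart-chunk-size-mb` is the real s3cmd option named in the error): compute the smallest chunk size that keeps a 900 GB upload under 10,000 parts, then pass it explicitly.

```shell
# 900 GB = 921600 MB; divided across at most 10,000 parts,
# each chunk must be at least ceil(921600 / 10000) = 93 MB.
CHUNK_MB=$(( (900 * 1024 + 9999) / 10000 ))   # ceiling division
echo "$CHUNK_MB"                               # prints 93

# Placeholder invocation -- substitute your own file and bucket:
# s3cmd put --multipart-chunk-size-mb="$CHUNK_MB" bigfile s3://bucket/
```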
Update: it seems the upload must not exceed 1000 parts:

ERROR: S3 error: 400 (InvalidArgument): Part number must be an integer between 1 and 1000, inclusive
I had exactly the same problem last night! I also think it would be great if the program performed a chunk-size check before calculating the checksum. Even better would be a heuristic mode that picks the correct chunk size automatically. As for the maximum number of parts, it is 10,000, not 1,000: https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
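The suggested auto-sizing heuristic could look something like the sketch below. This is not s3cmd code, just an illustration of the idea: pick the smallest chunk size that fits within S3's 10,000-part limit, clamped to the 5 MB minimum part size from the AWS docs linked above.

```shell
#!/bin/sh
# Hypothetical heuristic for an "auto chunk size" mode.
# S3 limits (per the AWS multipart upload docs): at most 10,000 parts,
# each part between 5 MB and 5 GB.
auto_chunk_mb() {
    file_mb=$1
    max_parts=10000
    min_chunk=5
    # Ceiling division: smallest chunk size that stays within max_parts.
    chunk=$(( (file_mb + max_parts - 1) / max_parts ))
    if [ "$chunk" -lt "$min_chunk" ]; then
        chunk=$min_chunk
    fi
    echo "$chunk"
}

auto_chunk_mb $((900 * 1024))   # 900 GB file -> 93 MB chunks
auto_chunk_mb 100               # small file  -> 5 MB (S3 minimum)
```

With this in place, s3cmd could warn (or adjust automatically) before spending hours on the MD5 pass.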