backwpup
fix S3 multipart uploads failing on large files
Amazon S3 backups fail in BackWPup v3.6.10 - v3.8.0 when the backup file is very large (e.g. several gigabytes). Previously reported on the wp.org support forum.
A sample error looks like this:
PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 5242912 bytes) in wp-content/plugins/backwpup/inc/class-destination-s3.php on line 773
This PR fixes the problem by replacing the custom multipart code with the AWS SDK's recommended approach, using the MultipartUploader class to manage the upload.
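For reference, this is roughly what the MultipartUploader-based upload looks like with the AWS SDK for PHP v3. It is a minimal sketch, not the actual patch: the bucket, key, region, file path, and part size below are placeholder values, and credential setup is omitted. The key point is that the SDK streams the file in fixed-size parts instead of buffering it, so memory stays flat regardless of backup size.

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// Placeholder client config; BackWPup builds its client from the job settings.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// The uploader streams the file in parts rather than loading it into memory.
$uploader = new MultipartUploader($s3, '/path/to/backup.tar.gz', [
    'bucket'    => 'my-backup-bucket',          // placeholder bucket name
    'key'       => 'backups/backup.tar.gz',     // placeholder object key
    'part_size' => 25 * 1024 * 1024,            // 25 MB parts (example value)
]);

try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    // The SDK aborts the multipart upload and reports which part failed.
    echo 'Upload failed: ' . $e->getMessage() . "\n";
}
```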
@webaware Thank you for the PR. I really appreciate it ❤️ About the issue, I think this will be fixed by overriding class-destination-s3.php in backwpup/inc/ with the one here: https://gist.github.com/cuongdcdev/6a751d4312f8e6ac056294ee638cb71b Could you please help me test the patch? Thank you so much! 🙏
G'day Cuong, yes, the forced garbage collection does resolve the problem -- memory stays at a stable level during the upload. Memory usage is a little higher than with AWS' MultipartUploader, and it also does more work by extracting temporary files to upload, so AWS' MultipartUploader is still the preferable method.
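For comparison, the forced-garbage-collection workaround amounts to something like the following inside the per-part upload loop. This is a hedged sketch of the general idea, not the code from the gist: variable names (`$file`, `$part_size`, `$upload_id`, `$parts`) are illustrative, and the surrounding multipart bookkeeping is elided.

```php
<?php
// Simplified per-part loop: read one chunk, upload it, then explicitly
// free the buffer and force collection so memory does not accumulate.
$part_number = 1;
$parts = [];

while (!feof($file)) {
    $chunk = fread($file, $part_size);

    $result = $s3->uploadPart([
        'Bucket'     => $bucket,
        'Key'        => $key,
        'UploadId'   => $upload_id,
        'PartNumber' => $part_number,
        'Body'       => $chunk,
    ]);

    $parts[] = [
        'PartNumber' => $part_number++,
        'ETag'       => $result['ETag'],
    ];

    unset($chunk, $result); // drop references to the part buffer
    gc_collect_cycles();    // force PHP's collector to reclaim it now
}
```

This keeps memory bounded, but as noted above the SDK's MultipartUploader achieves the same result with less hand-written bookkeeping.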
Good day @webaware, thank you for your help! I've forwarded the PR to our dev :D