
backblazeb2 S3 errors and bad reporting

Open planetahuevo opened this issue 3 years ago • 6 comments

Hi, I am testing backblazeb2 and it is giving me some errors.

It seems to be the same error twice. I have a backup config file with 3 backups; 2 of them failed and the 3rd one worked. The config is the same. This is the error:

PHP Warning:  Error executing "PutObject" on "https://s3.[regionedited].backblazeb2.com/folder/myfile.sql.gz"; AWS HTTP error: Client error: `PUT https://s3.[regionedited].backblazeb2.com/folder/myfile.sql.gz` resulted in a `400 Bad Request` response:
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Error>
        <Code>InvalidRequest</Code>
        <Message>Missing req (truncated...)
     InvalidRequest (client): Missing required header for this request: Content-MD5 - <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Error>
        <Code>InvalidRequest</Code>
        <Message>Missing required header for this request: Content-MD5</Message>
    </Error>
     in phar:///usr/local/bin/phpbu/lib/aws-sdk/S3/StreamWrapper.php on line 746

And the second one is similar:

PHP Warning:  Error executing "PutObject" on "https://s3.[regionedited].backblazeb2.com/folder/myfile.gz"; AWS HTTP error: Client error: `PUT https://s3.[regionedited].backblazeb2.com/folder/myfile.gz` resulted in a `400 Bad Request` response:
    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Error>
        <Code>InvalidRequest</Code>
        <Message>Missing req (truncated...)
     InvalidRequest (client): Missing required header for this request: Content-MD5 - <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <Error>
        <Code>InvalidRequest</Code>
        <Message>Missing required header for this request: Content-MD5</Message>
    </Error>
     in phar:///usr/local/bin/phpbu/lib/aws-sdk/S3/StreamWrapper.php on line 746

My config is standard and it is the same for all 3 backups:

    <sync type="backblazes3">
        <option name="key" value="xxxx"/>
        <option name="secret" value="xxx"/>
        <option name="bucket" value="xxxx"/>
        <option name="region" value="region"/>
        <option name="path" value="/folder"/>
        <option name="useMultiPartUpload" value="true"/>
    </sync>

Also, the backup run was reported as "completed" with no errors, but the files did not end up on B2; only the one without errors did. It seems the 400 response was not recognised:

    "status":0,"timestamp":1660476934,"duration":14.2698,"backupCount":3,"backupFailed":0,"errorCount":0,"errors":[],
    "backups":[
        {"name":"db","status":0,"checks":{"executed":1,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":1,"skipped":0,"failed":0},"cleanup":{"executed":1,"skipped":0,"failed":0}},
        {"name":"files","status":0,"checks":{"executed":1,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":1,"skipped":0,"failed":0},"cleanup":{"executed":1,"skipped":0,"failed":0}},
        {"name":"uploads","status":0,"checks":{"executed":1,"failed":0},"crypt":{"executed":0,"skipped":0,"failed":0},"syncs":{"executed":1,"skipped":0,"failed":0},"cleanup":{"executed":1,"skipped":0,"failed":0}}
    ],

Here are their docs on large files (https://www.backblaze.com/b2/docs/large_files.html) and on their S3-compatible API (https://www.backblaze.com/b2/docs/s3_compatible_api.html).

planetahuevo commented Aug 14 '22 12:08

It seems the problem is Backblaze's object-lock option: "When attempting to back up to a bucket on Backblaze B2 that has object-lock enabled (I set the default for 30 days), the upload fails (log below). Without object-lock enabled appears to work fine." https://wordpress.org/support/topic/missing-required-header-for-this-request-content-md5-object-lock-b2/

Object lock is an important feature, so I hope you can add support for it. I am reporting this to B2 too.
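
If it helps, a minimal sketch of how the missing Content-MD5 header could be supplied manually with the AWS SDK for PHP that phpbu bundles; the region, endpoint, bucket, and file paths are placeholders, and I have not verified this against an object-locked bucket:

    <?php
    // Sketch: supply the Content-MD5 header via the SDK's ContentMD5
    // parameter. S3 expects the base64 of the raw (binary) MD5 digest.
    // All names below are placeholders, not phpbu internals.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $client = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-west-004',                            // placeholder
        'endpoint'    => 'https://s3.us-west-004.backblazeb2.com', // placeholder
        'credentials' => ['key' => 'xxxx', 'secret' => 'xxx'],
    ]);

    $file = '/tmp/myfile.sql.gz';

    $client->putObject([
        'Bucket'     => 'my-bucket',
        'Key'        => 'folder/myfile.sql.gz',
        'SourceFile' => $file,
        'ContentMD5' => base64_encode(md5_file($file, true)),
    ]);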

planetahuevo commented Aug 14 '22 12:08

More info: this only happens on 2 of the 3 files, because those 2 were too small to qualify as large files while multipart upload was set to true. That somehow breaks the upload. From B2's limits:

"Large files can range in size from 5MB to 10TB. Each large file must consist of at least 2 parts, and all of the parts except the last one must be at least 5MB in size. The last part must contain at least one byte."

planetahuevo commented Aug 14 '22 12:08

I have reported this to B2, to see if they can fix it. For now I have disabled the lock on B2, until someone fixes this on either end :) Would it be possible to detect when a file is less than 5MB and send it with multipart upload deactivated, or is it best that they fix this on the B2 end?
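
In case it is useful, a rough sketch of what I mean, using the bundled SDK's MultipartUploader; uploadToB2() and all the names in it are made up for illustration, not phpbu's actual sync code:

    <?php
    // Sketch: fall back to a plain PutObject below B2's 5MB minimum
    // part size, and only go multipart for larger files.
    require 'vendor/autoload.php';

    use Aws\S3\MultipartUploader;
    use Aws\S3\S3Client;

    const MIN_MULTIPART_BYTES = 5 * 1024 * 1024; // every part except the last must be >= 5MB

    function uploadToB2(S3Client $client, string $file, string $bucket, string $key): void
    {
        if (filesize($file) < MIN_MULTIPART_BYTES) {
            // too small for a B2 "large file": single PutObject
            $client->putObject([
                'Bucket'     => $bucket,
                'Key'        => $key,
                'SourceFile' => $file,
            ]);
            return;
        }

        // big enough: multipart upload, as with useMultiPartUpload="true"
        $uploader = new MultipartUploader($client, $file, [
            'bucket' => $bucket,
            'key'    => $key,
        ]);
        $uploader->upload();
    }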

planetahuevo commented Aug 14 '22 12:08

Thanks for the detailed investigation.

Yes, I would argue that this should be fixed on B2's side. Since it only breaks when a specific setting is enabled on B2's side, I'm not sure I can fix it on mine, especially since the B2 sync is basically just the AWS implementation with some setup differences.

sebastianfeldmann commented Aug 20 '22 08:08

B2's team is doing great work testing everything, but they said that, so far, they cannot reproduce it. I am going to do more testing on my side and give them a test server so they can see the problem. I will let you know when we find the origin of the issue. :)

planetahuevo commented Aug 20 '22 10:08

@sebastianfeldmann I was able to replicate it again. The problem is when multipart upload is OFF and the file is less than 100MB, because then

A bug in phpbu for you to fix: phpbu is reporting everything as synced and OK, which is not correct. The 400 error only surfaces as a PHP warning, when it should mark the backup as failed. I am finding a lot of backups that fail, but phpbu does not catch the error and reports them as successful.

That makes phpbu a little less reliable for backups... and the slow development over the past months is making me consider finding an alternative... :(
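
To illustrate the reporting gap, a minimal sketch of how the warning could be turned into a catchable failure; the paths and bucket are placeholders and this is not phpbu's actual code:

    <?php
    // Sketch: the SDK's S3 stream wrapper emits a PHP warning on a 400
    // instead of throwing, so the failure never reaches the sync's error
    // handling. Converting warnings to exceptions around the upload would
    // let it be counted as a failed sync.
    set_error_handler(function (int $severity, string $message, string $file, int $line): bool {
        throw new ErrorException($message, 0, $severity, $file, $line);
    });

    try {
        // assumes $client->registerStreamWrapper() was called beforehand
        if (!copy('/tmp/myfile.sql.gz', 's3://my-bucket/folder/myfile.sql.gz')) {
            throw new RuntimeException('upload returned false');
        }
        echo 'sync ok' . PHP_EOL;
    } catch (Throwable $e) {
        // this is where the sync should be marked as failed, not success
        echo 'sync failed: ' . $e->getMessage() . PHP_EOL;
    } finally {
        restore_error_handler();
    }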

planetahuevo commented Sep 26 '22 19:09