copy-file-by-id fails on large files
copy-file-by-id doesn't seem to use the b2_copy_part API, and as such it can only copy files up to 5 GB in size.
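To illustrate the limitation (this is a sketch, not the actual b2sdk implementation): a single server-side copy call is capped at 5 GB, so larger files have to be copied as a "large file" assembled from byte ranges, one b2_copy_part call per range. The helper `plan_copy_parts` below is hypothetical; the 100 MB part size is just a typical choice, not something mandated by the API.

```python
# Why copies above 5 GB need b2_copy_part: a single-call copy is capped
# at 5 GB, so bigger files must be split into byte ranges and copied
# part by part, then assembled server-side as a large file.

FIVE_GB = 5 * 10**9          # server-side cap for a single-call copy
DEFAULT_PART_SIZE = 10**8    # 100 MB -- an illustrative part size

def plan_copy_parts(file_size: int, part_size: int = DEFAULT_PART_SIZE):
    """Return (start, end_exclusive) byte ranges, one per b2_copy_part
    call, or None if a plain single-call copy suffices."""
    if file_size <= FIVE_GB:
        return None  # a single copy call can handle it
    ranges = []
    start = 0
    while start < file_size:
        end = min(start + part_size, file_size)
        ranges.append((start, end))
        start = end
    return ranges

if __name__ == "__main__":
    parts = plan_copy_parts(6 * 10**9)  # a 6 GB file -> 60 parts
    print(len(parts), parts[0], parts[-1])
```

Under this scheme a 6 GB file becomes 60 ranges of 100 MB each, which is roughly what a fixed copy-file-by-id has to orchestrate instead of issuing one oversized copy request.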
I think you are correct - the current copy-file-by-id was implemented back when b2-sdk-python didn't yet support the b2_copy_part API. Support was added later and is available here; it's just that the CLI code hasn't switched to the new backend yet. Thanks for the report!
No. It's b2sdk.v2.Bucket.copy that is not fully switched to the new interface.
Fixed it on a branch, but I'm unclear about one detail; I'll try to merge as soon as I get feedback from another developer.
It is likely that the patch can be improved with a server-side change. @NilsIrl do you need the large file copy prototype soon? I could release the temporary implementation and then fix it when a new version of the server is deployed, allowing cleanup of the error handler on the client side.
I used rclone and it worked, so I don't particularly need this at the moment (or in the future).
If you have that prototype version available for Linux, I'd love to give it a shot. I'm re-organizing some files and have a good number that are above the 5GB limit.
This has already been fixed in the new b2-sdk-python; you can run pip install -U b2sdk without even updating the CLI and it should work.