django-storages
(boto3) An error occurred (InvalidArgument) when calling the CreateMultipartUpload operation
When I use boto3 (DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'), I get the error below. But when I use boto2 (DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'), everything works fine. Is there a bug in boto3?
Traceback:
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/core/handlers/base.py" in get_response
132. response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/views/decorators/csrf.py" in wrapped_view
58. return view_func(*args, **kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/viewsets.py" in view
83. return self.dispatch(request, *args, **kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/views.py" in dispatch
483. response = self.handle_exception(exc)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/views.py" in handle_exception
443. self.raise_uncaught_exception(exc)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/views.py" in dispatch
480. response = handler(request, *args, **kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/mixins.py" in create
21. self.perform_create(serializer)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/mixins.py" in perform_create
26. serializer.save()
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/serializers.py" in save
214. self.instance = self.create(validated_data)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/rest_framework/serializers.py" in create
906. instance = ModelClass.objects.create(**validated_data)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/manager.py" in manager_method
127. return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/query.py" in create
348. obj.save(force_insert=True, using=self.db)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/base.py" in save
734. force_update=force_update, update_fields=update_fields)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/base.py" in save_base
762. updated = self._save_table(raw, cls, force_insert, force_update, using, update_fields)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/base.py" in _save_table
846. result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/base.py" in _do_insert
885. using=using, raw=raw)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/manager.py" in manager_method
127. return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/query.py" in _insert
920. return query.get_compiler(using=using).execute_sql(return_id)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/sql/compiler.py" in execute_sql
973. for sql, params in self.as_sql():
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/sql/compiler.py" in as_sql
931. for obj in self.query.objs
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/fields/files.py" in pre_save
314. file.save(file.name, file, save=False)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/db/models/fields/files.py" in save
93. self.name = self.storage.save(name, content, max_length=self.field.max_length)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/django/core/files/storage.py" in save
63. name = self._save(name, content)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/storages/backends/s3boto3.py" in _save
452. self._save_content(obj, content, parameters=parameters)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/storages/backends/s3boto3.py" in _save_content
467. obj.upload_fileobj(content, ExtraArgs=put_parameters)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/boto3/s3/inject.py" in object_upload_fileobj
509. ExtraArgs=ExtraArgs, Callback=Callback, Config=Config)
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/boto3/s3/inject.py" in upload_fileobj
427. return future.result()
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/s3transfer/futures.py" in result
73. return self._coordinator.result()
File "/home/yzxn1709/git-lab/top_cpt/env/local/lib/python2.7/site-packages/s3transfer/futures.py" in result
233. raise self._exception
Exception Type: ClientError at /v1/api/package
Exception Value: An error occurred (InvalidArgument) when calling the CreateMultipartUpload operation: Unknown
I had this when I set default_acl to None instead of its default of 'public-read' (or the value I ended up setting, 'private').
I also had this when I set AWS_DEFAULT_ACL = 'public-read'.
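In settings terms, the reports above correspond to something like this (a sketch; which ACL value is safe depends on your bucket and provider):

# settings.py
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_DEFAULT_ACL = 'private'  # 'private' worked for the commenter above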
This happened to me because I was using S3BotoStorage like FileSystemStorage and passing the location as the first positional argument, e.g. S3BotoStorage(LOCATION). It turns out that acl is actually the first argument, so the location was being sent as an ACL, which causes the InvalidArgument error.
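For anyone hitting the same mistake, a minimal sketch of the fix, assuming the constructor accepts the prefix via a location keyword (check your version's signature):

from storages.backends.s3boto import S3BotoStorage

LOCATION = 'uploads'  # example key prefix

# Wrong: the first positional parameter is acl, so the prefix is
# sent to S3 as a bogus canned ACL and the upload is rejected.
# storage = S3BotoStorage(LOCATION)

# Right: pass the prefix by keyword instead.
storage = S3BotoStorage(location=LOCATION)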
I am running into this: small files upload successfully, but bigger files fail with this error.
I found this thread about the same issue happening in the AWS CLI: https://github.com/aws/aws-cli/issues/1674. However, it says the issue was fixed, and I still have the problem.
The specific error from the DEBUG log:
2019-11-16 19:32:01,760 DEBUG: CompleteMultipartUploadTask(transfer_id=0, {'bucket': 'rpz-staging-experiments', 'key': '6089694926ecaeb1347463d50b375e06ddd01895a02c8d2e90ccadfeb5d6d509', 'extra_args': {}}) about to wait for <s3transfer.futures.ExecutorFuture object at 0x7f2f5934bfd0>
2019-11-16 19:32:01,764 DEBUG: Response headers: {'X-GUploader-UploadID': 'AEnB2UqDCgJB6KnzoqiGRvBDz03dBmqe9jQnlj--nGFvBq6mgq9sSd_-GfnRHayamMPgMaUg2w3aDhO596zFcXp4-t0wnwIhm8fUIpMkFPHp0QtoHe7Uf7c', 'Content-Type': 'application/xml; charset=UTF-8', 'Content-Length': '188', 'Vary': 'Origin', 'Date': 'Sat, 16 Nov 2019 19:32:01 GMT', 'Server': 'UploadServer'}
2019-11-16 19:32:01,764 DEBUG: Response body: b"<?xml version='1.0' encoding='UTF-8'?><Error><Code>InvalidArgument</Code><Message>Invalid argument.</Message><Details>POST object expects Content-Type multipart/form-data</Details></Error>"
Thanks for the error log. I'm still not sure what is going on. Which way are you using it: via the File or via the Storage? I'm curious whether one will work where the other fails. It's strange, because all of the other issues I see with the same message appear to be either fixed or Google Cloud issues.
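To make that distinction concrete, the two upload paths look roughly like this (a sketch; the file name and size are made up):

from django.core.files.base import ContentFile
from storages.backends.s3boto3 import S3Boto3Storage

big = ContentFile(b'\0' * (50 * 1024 ** 2), name='big.bin')  # ~50 MB payload

# Via the Storage directly:
storage = S3Boto3Storage()
storage.save('big.bin', big)

# Via a File field on a saved model instance:
# instance.attachment.save('big.bin', big, save=True)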
I'm filing this particular bug against boto3 directly, since the problem doesn't seem to be with django-storages itself: https://github.com/boto/boto3/issues/2207
Probably a Google Cloud incompatibility :cold_sweat: I'm not hopeful we'll see a fix.
For future reference, it's possible to work around this issue by patching S3Boto3Storage._save and changing the upload_fileobj call to use a boto3.s3.transfer.TransferConfig, e.g.:
from boto3.s3.transfer import TransferConfig

MB = 1024 ** 2
obj.upload_fileobj(content, ExtraArgs=params,
                   Config=TransferConfig(multipart_threshold=100 * MB))
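If you'd rather not monkey-patch, the same change fits in a subclass. A sketch against the django-storages version shown in the traceback above, where _save_content is the upload hook (newer releases moved the upload into _save, so the method to override may differ); LargeThresholdS3Storage is a hypothetical name:

from boto3.s3.transfer import TransferConfig
from storages.backends.s3boto3 import S3Boto3Storage

MB = 1024 ** 2

class LargeThresholdS3Storage(S3Boto3Storage):
    # Raise the multipart threshold so files under 100 MB go out as a
    # single PUT instead of triggering CreateMultipartUpload.
    def _save_content(self, obj, content, parameters):
        obj.upload_fileobj(
            content,
            ExtraArgs=parameters,
            Config=TransferConfig(multipart_threshold=100 * MB),
        )

Point DEFAULT_FILE_STORAGE at the subclass to use it.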
Could this maybe be added as a setting? It might help people switching providers to GCP while keeping the S3 API usage.
It's a bit unclear to me what the downsides of forcing non-multipart uploads for bigger files are. The only one I see is that a broken big upload has to be retried from the start (as opposed to retrying from the failed chunk), but I may be overlooking a bigger issue.
This is no longer relevant due to the GCP backend. Also, TransferConfig was added as a setting anyway.
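For later readers, the setting-based equivalent looks roughly like this; a sketch assuming the AWS_S3_TRANSFER_CONFIG setting name of recent django-storages releases (check your version's docs):

# settings.py
from boto3.s3.transfer import TransferConfig

AWS_S3_TRANSFER_CONFIG = TransferConfig(multipart_threshold=100 * 1024 ** 2)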