mongodb_consistent_backup
Optional AWS access_key and secret_key parameters
When using upload.method s3, the access_key and secret_key parameters seem to be mandatory, as shown by this traceback:
mongodb-consistent-backup --host localhost --port 27017 --upload.s3.bucket_name backup-bucket --upload.s3.region us-east-1 --upload.s3.bucket_prefix backup/mongodb --upload.method s3
Traceback (most recent call last):
File "/root/.pex/install/mongodb_consistent_backup-1.2.0-py2-none-any.whl.0ca97fff9d52c9fcb103096a08a99409e2177e0b/mongodb_consistent_backup-1.2.0-py2-none-any.whl/mongodb_consistent_backup/Main.py", line 251, in run
self.backup_directory
File "/root/.pex/install/mongodb_consistent_backup-1.2.0-py2-none-any.whl.0ca97fff9d52c9fcb103096a08a99409e2177e0b/mongodb_consistent_backup-1.2.0-py2-none-any.whl/mongodb_consistent_backup/Upload/Upload.py", line 11, in __init__
self.init()
File "/root/.pex/install/mongodb_consistent_backup-1.2.0-py2-none-any.whl.0ca97fff9d52c9fcb103096a08a99409e2177e0b/mongodb_consistent_backup-1.2.0-py2-none-any.whl/mongodb_consistent_backup/Pipeline/Stage.py", line 45, in init
**self.args
File "/root/.pex/install/mongodb_consistent_backup-1.2.0-py2-none-any.whl.0ca97fff9d52c9fcb103096a08a99409e2177e0b/mongodb_consistent_backup-1.2.0-py2-none-any.whl/mongodb_consistent_backup/Upload/S3/S3.py", line 37, in __init__
self.access_key = self.config.upload.s3.access_key
File "/root/.pex/install/yconf-0.3.4-py2-none-any.whl.4c32acf64a3f2408d71fe4642e2daf2f77807381/yconf-0.3.4-py2-none-any.whl/yconf/util.py", line 52, in __getattr__
raise AttributeError(e)
AttributeError: 'access_key'
In order to use IAM instance profiles, those parameters should be optional.
Thanks @Wiston999,
Forgive my lack of AWS S3 knowledge: are you attempting to upload to a bucket that does not require credentials for uploading?
I assumed the S3 API required an access+secret key, but if Python Boto 2.x supports credential-less connections, I can modify the code to suit your use case once I understand it a bit more. If Boto 2.x doesn't support this, we'll need to ask the Boto project (which we use to do S3 uploads) to add this functionality.
Hi, thanks for your answer. When you use boto to connect to AWS services, there are several ways of authenticating with the AWS API:
- You can use IAM Roles, which are attached to EC2 instances and give the instance the ability to perform actions against the AWS API without explicit credentials (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html).
- You can also use environment variables, as described in http://boto.cloudhackers.com/en/latest/s3_tut.html#creating-a-connection.
- And you can pass explicit credentials (access & secret key) to the connection constructor.
In all of these cases you are authenticating with the AWS API.
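The resolution order above can be sketched in plain Python. This is a simplified illustration of how boto 2.x's fallback behaves, not its actual implementation; resolve_credentials is a hypothetical helper, not part of boto or of this project:

```python
import os

def resolve_credentials(access_key=None, secret_key=None, instance_profile=None):
    """Simplified sketch of boto 2.x credential resolution: explicit
    arguments win, then environment variables, then the IAM instance
    profile (a (key, secret) pair here, standing in for the role
    credentials boto would fetch from the EC2 metadata service)."""
    if access_key and secret_key:
        return ("explicit", access_key, secret_key)
    env_key = os.environ.get("AWS_ACCESS_KEY_ID")
    env_secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if env_key and env_secret:
        return ("environment", env_key, env_secret)
    if instance_profile:
        # On EC2, boto would query the instance metadata service here.
        return ("iam-role",) + instance_profile
    raise RuntimeError("no AWS credentials found")
```

The point is that explicit keys short-circuit the chain, so when they are hard-required by the tool the IAM role is never consulted; when both are omitted (or None), an instance profile alone is enough.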
My use case involves IAM Roles, so I don't have to distribute AWS credentials across instances, and only the instances I choose can upload files to S3.
Thanks @Wiston999, I think that makes sense.
As making the access+secret key optional would create a few other side-effects/problems, would it be acceptable to implement a flag such as 'upload.s3.useIAMRoles: true' that allows those parameters to be omitted?
Hi, I think using the flag would be acceptable if it makes things easier. In any case, in my own code I usually default those two variables to None, because passing None to the constructor is enough to trigger an auth method other than explicit credentials.
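The default-to-None approach maps cleanly onto the traceback above: yconf raises AttributeError for unset keys, so the uploader could catch that and fall back to None. A minimal sketch, where Section and config_value are hypothetical stand-ins rather than the project's actual classes:

```python
class Section(object):
    """Stand-in for a yconf config section: like yconf, it raises
    AttributeError for keys that were never set (mirroring the
    traceback above)."""
    def __init__(self, **kwargs):
        self._data = kwargs

    def __getattr__(self, name):
        try:
            return self._data[name]
        except KeyError as e:
            raise AttributeError(e)

def config_value(section, name, default=None):
    """Return section.<name>, falling back to a default when unset."""
    try:
        return getattr(section, name)
    except AttributeError:
        return default

s3 = Section(bucket_name="backup-bucket", region="us-east-1")
access_key = config_value(s3, "access_key")  # None -> non-explicit auth
```

With that fallback in place, a missing access_key/secret_key simply becomes None, which is exactly what boto needs to try environment variables or the IAM instance profile instead.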
I ran into this problem too; an access key is still required to upload to S3. If the script runs on EC2 with an IAM Role, boto should handle the credentials.