aws-sdk-pandas

Exception ignored in: <function _S3ObjectBase.__del__ at 0x7f8076cadab0>

Open • AlkaSaliss opened this issue on Sep 27, 2024 • 0 comments

Describe the bug

When trying to write a dataframe to an S3 path for which we don't have write permission, the AccessDenied error is caught, but a second exception is reported as ignored, as shown in the stack trace below:

Exception ignored in: <function _S3ObjectBase.__del__ at 0x7f8076cadab0>
Traceback (most recent call last):
  File "/home/ec2-user/SageMaker/.venv/lib/python3.10/site-packages/awswrangler/s3/_fs.py", line 253, in __del__
    self.close()
  File "/home/ec2-user/SageMaker/.venv/lib/python3.10/site-packages/awswrangler/s3/_fs.py", line 483, in close
    _utils.try_it(
  File "/home/ec2-user/SageMaker/.venv/lib/python3.10/site-packages/awswrangler/_utils.py", line 789, in try_it
    return f(*args, **kwargs)
  File "/home/ec2-user/SageMaker/.venv/lib/python3.10/site-packages/botocore/client.py", line 553, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/ec2-user/SageMaker/.venv/lib/python3.10/site-packages/botocore/client.py", line 1009, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: User: arn:aws:sts::123456789:assumed-role/my-role/SageMaker is not authorized to perform: s3:PutObject on resource: "arn:aws:s3:::some-bucket-i-dont-have-write-access/path1/path2/file.csv" because no identity-based policy allows the s3:PutObject action
Caught exception is: ---  An error occurred (AccessDenied) when calling the PutObject operation: User: arn:aws:sts::123456789:assumed-role/my-role/SageMaker is not authorized to perform: s3:PutObject on resource: "arn:aws:s3:::some-bucket-i-dont-have-write-access/path1/path2/file.csv" because no identity-based policy allows the s3:PutObject action
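
For context, CPython does not propagate exceptions raised inside __del__; it reports them on stderr as "Exception ignored in: ..." instead. The minimal sketch below (a made-up Resource class, not aws-sdk-pandas code) reproduces that interpreter behaviour and mirrors what appears to happen here when _S3ObjectBase is garbage-collected and its close() retries the failing PutObject:

# Minimal sketch with a hypothetical Resource class; not aws-sdk-pandas code.
class Resource:
    def close(self):
        # Stands in for the retried PutObject call that fails again inside close()
        raise RuntimeError("close failed")

    def __del__(self):
        self.close()


try:
    r = Resource()
    raise ValueError("primary error")  # stands in for the AccessDenied raised to user code
except Exception as e:
    print("Caught exception is: ---", e)

del r  # __del__ raises; CPython prints "Exception ignored in: <function Resource.__del__ ...>"

So the second traceback in the report is printed by the interpreter during object finalisation rather than raised into the calling code.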

How to Reproduce

import awswrangler as wr
import pandas as pd


try:
    # Write to a bucket where the assumed role lacks s3:PutObject permission
    wr.s3.to_csv(
        pd.DataFrame({"col1": [1, 2, 3]}),
        "s3://some-bucket-i-dont-have-write-access/path1/path2/file.csv",
    )
except Exception as e:
    print("Caught exception is: ---", e)

Expected behavior

All exceptions raised by the call should be caught in the except block, but the second traceback bypasses it and is reported as ignored, as shown in the stack trace above.
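
Because exceptions raised in __del__ never reach the except block, one possible stop-gap on the caller's side (an assumption, not an official aws-sdk-pandas API) is to install a sys.unraisablehook (Python 3.8+) that silences these reports while the underlying behaviour is addressed:

# Workaround sketch, assuming the extra traceback is only cosmetic:
# filter out ClientError reports raised during finalisation instead of printing them.
import sys
import botocore.exceptions

_default_hook = sys.unraisablehook

def _quiet_s3_del(unraisable):
    # unraisable.object is the callable that raised (e.g. _S3ObjectBase.__del__)
    if isinstance(unraisable.exc_value, botocore.exceptions.ClientError):
        return  # drop the "Exception ignored in ..." report for ClientError
    _default_hook(unraisable)

sys.unraisablehook = _quiet_s3_del

A proper fix would presumably be for _S3ObjectBase to avoid retrying the upload in close()/__del__ (or to swallow the ClientError) once the initial write has already failed, but that is a maintainer decision.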

Your project

No response

Screenshots

No response

OS

Linux

Python version

3.10.12

AWS SDK for pandas version

3.9.1

Additional context

The code is run on AWS SageMaker. The boto versions used are boto3==1.34.48 and botocore==1.34.48.
