
Issues putting an object with a non-standard Storage Class

Open alexwlchan opened this issue 6 years ago • 2 comments

Bug report information

Description

If I try to upload an object into a scality-managed bucket with a storage class other than “STANDARD”, the storage class is silently discarded.

I’m using scality/S3 as a mock S3 for some automated tests, and I want to test that my code is correctly uploading certain objects with the correct storage class. This bug makes it harder to do so.

Steps to reproduce the issue

This Python script is a minimal reproduction of the bug. It starts a scality/s3server container, creates a new bucket, uploads an object with the STANDARD_IA storage class, then inspects the object it’s just created.

#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pprint import pprint
import subprocess

import boto3

BUCKET = 'bukkit'
KEY = 'example.txt'

# Start a Docker container running scality/s3server
subprocess.check_call([
    'docker', 'run',
    '--detach',
    '--publish', '8000:8000',
    'scality/s3server:mem-latest'
])

# Create an S3 client that is authenticated against scality/s3,
# and uses the appropriate endpoint.
s3 = boto3.client(
    's3',
    aws_access_key_id='accessKey1',
    aws_secret_access_key='verySecretKey1',
    endpoint_url='http://localhost:8000'
)

# Create a new, empty, bucket, and then upload a single object.
s3.create_bucket(Bucket=BUCKET)
s3.put_object(
    Bucket=BUCKET,
    Key=KEY,
    Body=b'hello world',
    StorageClass='STANDARD_IA'
)

# List the contents of the bucket, and print the metadata about
# the (single) object to stdout.
pprint(s3.list_objects(Bucket=BUCKET)['Contents'][0])

If you don’t have Python installed locally, you can run this script with Docker. Save it to a file called scality_example.py, then run the following command:

docker run \
    --net host \
    --volume /var/run/docker.sock:/var/run/docker.sock \
    --volume $(pwd):/code \
    wellcome/build_tooling python3 /code/scality_example.py

Actual result

The ListObjects call returns the following output:

{'ETag': '"5eb63bbbe01eeed093cb22bb8f5acdc3"',
 'Key': 'example.txt',
 'LastModified': datetime.datetime(2018, 4, 10, 8, 1, 55, 267000, tzinfo=tzlocal()),
 'Owner': {'DisplayName': 'Bart',
           'ID': '79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be'},
 'Size': 11,
 'StorageClass': 'STANDARD'}

Note the object has been created with the Standard storage class, not Standard-IA.

Expected result

The object is created with the Standard-IA storage class.

Additional information

I’m running Docker on macOS 10.12.6:

$ docker --version
Docker version 17.12.0-ce, build c97c6d6

$ docker images | grep scality
scality/s3server                                mem-latest                                 306ca6508aa1        4 months ago        301MB

alexwlchan avatar Apr 10 '18 08:04 alexwlchan

@alexwlchan Thanks for the bug report! We will provide a patch soon.

rahulreddy avatar Apr 10 '18 16:04 rahulreddy

@rahulreddy any news on that patch? 🙂

c24w avatar May 05 '21 17:05 c24w