
Add support for upload streams

subnix opened this issue 6 years ago · 2 comments

Sometimes we need to upload large files into storage or pass a stream along to another function. Previously the only option was to write the file to the filesystem and then pass a file descriptor to depot, but that requires free space on the filesystem and increases processing time. I've added a create_stream method to the interface and implemented it for the local storage and GridFS backends.
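To make the proposal concrete, here is a rough usage sketch. The create_stream name comes from the patch, but the signature, return value, and close() behaviour shown below are assumptions for illustration, not the actual implementation:

```python
from depot.manager import DepotManager

DepotManager.configure('default', {'depot.storage_path': '/tmp/depot'})
depot = DepotManager.get()


def produce_chunks():
    """Hypothetical producer: stand-in for data generated on the fly
    or arriving from a network connection."""
    yield b'col_a,col_b\n'
    yield b'1,2\n'


# Assumed API: create_stream() returns a writable file-like object and
# close() finalizes the upload, returning the stored file id.
stream = depot.create_stream(filename='report.csv', content_type='text/csv')
for chunk in produce_chunks():
    stream.write(chunk)      # data goes straight to storage, no temp file
file_id = stream.close()
```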

subnix · Dec 18 '17

I understand the need for this patch, but one of the foundations of depot is to provide the exact same interface and features on all of its backends (switching from one backend to another shouldn't involve any code change), since the purpose is to allow using in-memory storage for tests, local storage for development, S3 in production, and so on.
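For illustration, a configuration-only backend switch might look like the sketch below; the backend paths and option names follow depot's documented configuration style, but bucket names and paths are placeholders and should be checked against the docs:

```python
from depot.manager import DepotManager

# Tests: keep everything in memory.
DepotManager.configure('default',
                       {'depot.backend': 'depot.io.memory.MemoryFileStorage'})

# Development: local filesystem.
# DepotManager.configure('default',
#                        {'depot.backend': 'depot.io.local.LocalFileStorage',
#                         'depot.storage_path': '/tmp/depot'})

# Production: S3 (bucket and credentials are placeholders).
# DepotManager.configure('default',
#                        {'depot.backend': 'depot.io.boto3.S3Storage',
#                         'depot.bucket': 'my-bucket'})

# Application code is identical regardless of the configured backend.
depot = DepotManager.get()
file_id = depot.create(b'hello world', filename='hello.txt',
                       content_type='text/plain')
```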

I don't think there is a way to provide a similar feature on the boto and boto3 storages: they can read from a stream, but they can't provide a stream you can write to.

amol- · Dec 19 '17

What about multipart upload in S3? We could split the input stream into parts and upload them through the S3 low-level API.
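For reference, a rough sketch of what that could look like with boto3's low-level multipart calls; bucket and key names are placeholders, and abort/error handling is omitted for brevity:

```python
import boto3


def upload_stream_multipart(stream, bucket, key, part_size=8 * 1024 * 1024):
    """Sketch: stream an arbitrary file-like object to S3 using the
    low-level multipart upload API. S3 requires parts of at least 5 MB
    (except the last one), and this sketch assumes the stream yields
    at least one chunk; abort handling is left out."""
    s3 = boto3.client('s3')
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)

    parts = []
    part_number = 1
    while True:
        chunk = stream.read(part_size)
        if not chunk:
            break
        response = s3.upload_part(Bucket=bucket, Key=key,
                                  PartNumber=part_number,
                                  UploadId=upload['UploadId'],
                                  Body=chunk)
        parts.append({'ETag': response['ETag'], 'PartNumber': part_number})
        part_number += 1

    s3.complete_multipart_upload(Bucket=bucket, Key=key,
                                 UploadId=upload['UploadId'],
                                 MultipartUpload={'Parts': parts})
```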

subnix · Dec 19 '17

Closing this one for the moment; it's still uncertain whether streamed uploads can be implemented easily in all backends, and we want to ensure that all storages supported by depot guarantee the same minimum set of functionality.

amol- · May 03 '23