
Python pathlib-style classes for cloud storage services such as Amazon S3, Azure Blob Storage, and Google Cloud Storage.

Results 164 cloudpathlib issues

It came up in code review that calling `_list_dir` and then filtering the results introduces extra network calls to check whether each entry is a file. We can likely do...
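A minimal sketch of the idea, assuming a hypothetical backend listing that returns `(path, is_file)` pairs (the name `_list_dir` and its return shape here are illustrative, not cloudpathlib's actual internals): one listing call carries enough metadata that filtering needs no per-entry checks.

```python
# Hypothetical: a single listing call returns (path, is_file) pairs,
# so filtering files requires no per-entry existence checks.

def _list_dir(prefix):
    # Simulated result of one LIST request; a real backend would get
    # this metadata from the service in the same response.
    return [
        ("data/a.txt", True),
        ("data/sub/", False),
        ("data/b.csv", True),
    ]

def list_files(prefix):
    # One network call total: filter on the is_file flag already returned.
    return [path for path, is_file in _list_dir(prefix) if is_file]

print(list_files("data/"))  # → ['data/a.txt', 'data/b.csv']
```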

Hi, thanks for your awesome library! I had been searching for a way to use both local and cloud paths seamlessly, and here it is :D One particular problem I...

design decision

S3 has an interesting situation with folders. Like other object stores such as Azure, it has a flat structure; when you upload a file to `a/b/c.txt`, for example, it creates...

help wanted
S3
design decision
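A small illustration of the flat-namespace point, in pure Python (the key set and helper are made up for demonstration): uploading `a/b/c.txt` creates only that one key, and any "folders" must be inferred from key prefixes by the client.

```python
# Sketch of S3's flat key space: uploading "a/b/c.txt" creates no folder
# objects, only the single key. "Directories" exist only as key prefixes.
keys = {"a/b/c.txt"}  # what actually exists after the upload

def implied_dirs(keys):
    # Derive the pseudo-folders a client must infer from prefixes.
    dirs = set()
    for key in keys:
        parts = key.split("/")[:-1]
        for i in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:i]) + "/")
    return dirs

print(sorted(implied_dirs(keys)))  # → ['a/', 'a/b/']
```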

Currently, our test suite installs all of the possible dependencies for every cloud provider. This means that errors in the code that occur when only one of the cloud provider...

tests

Hello, I recently started using the cloudpathlib library to upload entire directories to S3, but I noticed a weird behavior: we got a high error rate at S3 (according...

bug
S3
caching
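One common way to ride out transient error spikes from S3 is retrying with exponential backoff. This is a hedged sketch, not cloudpathlib's behavior: `upload_with_retries`, the `upload` callable, and the error type are all stand-ins.

```python
import time

# Sketch: retry an upload with exponential backoff on transient errors.
# `upload` and OSError are stand-ins for a real transfer call and its
# throttling/5xx failure mode.

def upload_with_retries(upload, key, max_attempts=4, base_delay=0.01):
    for attempt in range(max_attempts):
        try:
            return upload(key)
        except OSError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, 0.04s, ...

# Demo: a flaky upload that fails twice, then succeeds.
calls = {"n": 0}
def flaky_upload(key):
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("503 Slow Down")
    return f"uploaded {key}"

print(upload_with_retries(flaky_upload, "a/b/c.txt"))  # → uploaded a/b/c.txt
```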

Hi, thank you for the efforts on this package - I appreciate your work! I've got a feature proposal because I stumbled over this today and don't think it is implemented....

We probably want to be able to do things like downloading many files in parallel. Async may help (#28), but some backends may be able to do things like...

enhancement
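A thread-pool sketch of the parallel-download idea (not cloudpathlib's API; `download_one` is a stand-in for a per-file transfer):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: fetch many files concurrently with a thread pool.
# `download_one` stands in for a real per-file network transfer.

def download_one(key):
    return f"contents of {key}"  # real code would fetch bytes here

def download_many(keys, max_workers=8):
    # map preserves input order even though transfers run concurrently
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(download_one, keys))

print(download_many(["a.txt", "b.txt", "c.txt"]))
```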

We should document alternatives to cloudpathlib. The only one I've seen is https://github.com/liormizr/s3path One common way this is done is to have an `## Alternatives` section towards the bottom of...

documentation

The `is_dir` check is fairly expensive, but at least for S3 and Azure when the entries were created as a result of the client's `_list_dir` method, you can tell for...

caching
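A rough sketch of the caching idea under the assumption above (class and method names here are illustrative, not cloudpathlib internals): record the file/directory flag at listing time, so a later `is_dir` check on a listed entry is a dictionary lookup instead of a network call.

```python
# Sketch: when a directory listing already reports whether each entry is
# a file, cache that flag so later is_dir checks skip the network.

class ListingCache:
    def __init__(self):
        self._is_file = {}

    def record_listing(self, entries):
        # entries: (path, is_file) pairs as returned by one listing call
        for path, is_file in entries:
            self._is_file[path] = is_file

    def is_dir(self, path):
        if path in self._is_file:  # cache hit: no network call needed
            return not self._is_file[path]
        raise LookupError("would need a network call here")

cache = ListingCache()
cache.record_listing([("bucket/a.txt", True), ("bucket/sub/", False)])
print(cache.is_dir("bucket/sub/"))  # → True
```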

The existing 'az://' pseudo-URL scheme violates the principle of URLs in that it is not universal and unique. The 's3://' bucket syntax is valid because S3 bucket names are universally unique....

help wanted
Azure
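The ambiguity can be seen by parsing such a URL (the helper below is purely illustrative): the netloc carries only the container name, so the storage account must come from out-of-band configuration, whereas an `s3://` URL fully identifies the bucket.

```python
from urllib.parse import urlparse

# Illustration: "az://container/key" omits the storage account, so the
# same URL can refer to different blobs under different accounts, unlike
# "s3://bucket/key" where bucket names are globally unique.

def parse_az(url):
    parts = urlparse(url)
    # netloc is only the container; the account is missing, which is
    # what makes the scheme non-universal.
    return {"container": parts.netloc, "blob": parts.path.lstrip("/")}

print(parse_az("az://mycontainer/folder/file.txt"))
# → {'container': 'mycontainer', 'blob': 'folder/file.txt'}
```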