
fsspec-compatible Azure Datalake and Azure Blob Storage access
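
A minimal usage sketch of the package's fsspec interface (the account name, key, container, and blob path below are placeholders, not values taken from any issue):

```
from adlfs import AzureBlobFileSystem

# Placeholder credentials and container -- substitute your own.
fs = AzureBlobFileSystem(account_name="myaccount", account_key="mykey")

# List a container and read a blob through the standard fsspec file API.
print(fs.ls("mycontainer"))
with fs.open("mycontainer/some/path.txt", "r") as f:
    print(f.read())
```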

108 adlfs issues, sorted by recently updated

**What happened**: I first observed it when getting a file with the `recursive=True` argument. For example, I have the following file on Azure Datalake Gen2: `data/tmp/1.txt`, and I want to download it...
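
A hypothetical reproduction sketch of the call being described, assuming the file lives in a container named `container` and that credentials are supplied as in the other snippets on this page:

```
from adlfs import AzureBlobFileSystem

fs = AzureBlobFileSystem(account_name=_NAME, account_key=_KEY)

# Download a single nested file with fsspec's recursive flag; the expectation
# is that only data/tmp/1.txt ends up under ./local/.
fs.get("container/data/tmp/1.txt", "./local/", recursive=True)
```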

When running the code below, with the conda env at the bottom (sorry, I cannot attach the YAML as a file!), it results in the exception described below. First...

`ls` is not showing the actual directories: it does not behave as expected when reading directories, but works as expected if we pass it the path to a file. I...
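
An illustrative sketch of the two cases being compared (the container and paths are placeholders):

```
from adlfs import AzureBlobFileSystem

fs = AzureBlobFileSystem(account_name=_NAME, account_key=_KEY)

# Listing a directory prefix -- the case reported as misbehaving.
print(fs.ls("container/some/directory/"))

# Listing a concrete file path -- reported to work as expected.
print(fs.ls("container/some/directory/file.csv"))
```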

Ideally we wouldn't have a top-level import from this private module: if `azure.storage.blob` removed it, adlfs would fail at import time. We should see why that's needed and...
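
One illustrative way to avoid a hard, import-time dependency on a private module is to defer the import into the function that uses it. This is only a sketch of the pattern, not the actual adlfs code, and `_private_module` / `helper` are placeholder names:

```
def _do_work(data):
    # Deferred (function-local) import: `import adlfs` itself no longer fails
    # if the private module moves or disappears; only this call site does.
    from azure.storage.blob import _private_module  # hypothetical name
    return _private_module.helper(data)
```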

```
import secrets
from adlfs import AzureBlobFileSystem
from pprint import pprint

BUCKET = "/name/of/the/bucket"
BASE_PATH = BUCKET + "/" + secrets.token_hex(12)
EMPTY_DIR = BASE_PATH + "/empty_dir/"
fs = AzureBlobFileSystem(account_name=_NAME, account_key=_KEY)
...
```

```
import secrets
from adlfs import AzureBlobFileSystem

BUCKET = 'test-8/'
FOLDER = f'test-8/{secrets.token_hex(12)}/'
fs = AzureBlobFileSystem(account_name=_NAME, account_key=_KEY)
with fs.open(FOLDER + 'foo', 'w') as stream:
    stream.write('x')
with fs.open(FOLDER + 'bar', 'w')...
```

The README on this repo seems to be the only documentation for this package, and I *think* there's a lot of functionality missing (e.g. there's a reference at the bottom...

Due to changes in Travis CI billing, the Dask org is migrating from Travis CI to GitHub Actions. This repo appears to use Azure Pipelines. As we are putting in...

I have a partitioned parquet file in Blob Storage. I can copy it to local using a wildcard with `get`. Is there syntax to grab the folder without having to...
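
A sketch of the two approaches, using a placeholder container and dataset name: the wildcard copy described in the issue, and a whole-folder copy using fsspec's `recursive` flag:

```
from adlfs import AzureBlobFileSystem

fs = AzureBlobFileSystem(account_name=_NAME, account_key=_KEY)

# Wildcard copy, as described above.
fs.get("container/dataset.parquet/*", "./dataset.parquet/")

# Whole-folder copy without a wildcard.
fs.get("container/dataset.parquet/", "./dataset.parquet/", recursive=True)
```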

Doing some development work, I noticed that running `py.test adlfs/tests` is very slow because it creates multiple azurite docker images. Should it be pinned to latest?

```
(base) ray@ray-MS-7B43:~$ docker...
```