elementary
Add file chunk uploading for Azure Blob Storage integration
**Is your feature request related to a problem? Please describe.**
Uploading large files (>50 MB) to blob storage fails.

**Describe the solution you'd like**
Use file chunking in the Azure `client.py`:
```python
import uuid

from azure.storage.blob import BlobBlock

block_list = []
chunk_size = 4 * 1024 * 1024  # stage the file in 4 MiB blocks

with open(local_html_file_path, "rb") as data:
    # iter() with a sentinel stops as soon as read() returns empty bytes (EOF)
    for read_data in iter(lambda: data.read(chunk_size), b""):
        blk_id = str(uuid.uuid4())
        blob_handle.stage_block(block_id=blk_id, data=read_data)
        block_list.append(BlobBlock(block_id=blk_id))

blob_handle.commit_block_list(block_list)
```
I have tested the above by updating `client.py` locally. The code above still needs review and some love before merging. <3
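As a review aid, the chunking loop from the snippet above can be isolated into a small generator that is easy to unit-test without any Azure credentials. The function name `iter_blocks` and the 4 MiB default are illustrative, not part of the existing `client.py`; the caller would stage each yielded pair with `stage_block` exactly as in the snippet:

```python
import uuid
from typing import BinaryIO, Iterator, Tuple


def iter_blocks(
    stream: BinaryIO, chunk_size: int = 4 * 1024 * 1024
) -> Iterator[Tuple[str, bytes]]:
    """Yield (block_id, chunk) pairs until the stream is exhausted.

    Each block gets a fresh UUID, matching the staging scheme used in
    the snippet above.
    """
    # iter() with a b"" sentinel ends the loop at EOF, avoiding while True
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        yield str(uuid.uuid4()), chunk
```

Keeping the I/O-free chunking logic separate from the Azure calls means the block-size math and termination condition can be covered by plain unit tests, while the `stage_block`/`commit_block_list` calls stay thin.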
**Describe alternatives you've considered**
The only alternative I've considered is performing the upload outside of elementary.

**Additional context**
Nope.

**Would you be willing to contribute this feature?**
I would, if I had time. I'll leave it in your capable hands.
Hi @bdstout, thanks for opening this issue, and apologies for the late reply! This makes a lot of sense. However, due to a significant backlog, I don't believe the Elementary team will be able to get to it in the near future, so I'm adding the "Open to Contribution" label.
If you are willing to add this as a contribution, I would love to provide further guidance.
Thanks, Itamar