
Feature request: Azure blob storage offline collector upload.

Open ch0wm3in opened this issue 4 years ago • 3 comments

Hi,

I think it would be a great addition to the offline collector, and to the VQL plugins in general, to be able to upload to Azure blob storage. The only way to use Azure blob storage from Velociraptor right now is via s3proxy (https://github.com/gaul/s3proxy), which translates the S3 API calls to Azure blob storage.

But the way it works is that you provide it with 'access keys' from Azure, which are not granular in rights and grant full administrative access. The key could therefore be extracted from the client and used for malicious purposes, e.g. downloading all your forensic collections.

Azure blob storage instead provides SAS - Shared Access Signatures (the S3 equivalent would be a 'presigned URL') - which let you assign 'write-only' access. This is not supported in the software above AFAIK, and I think this would be the way to support blob storage in Velociraptor. There is a 'preview' Go SDK for it here: https://github.com/Azure/azure-storage-blob-go
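
To illustrate, here is a rough, untested sketch of what an upload through a write-only SAS URL could look like with that preview SDK; the account, container and file names are just placeholders:

```go
// Untested sketch: upload a collection through a write-only SAS URL using
// github.com/Azure/azure-storage-blob-go. All names below are placeholders.
package main

import (
	"context"
	"log"
	"net/url"
	"os"

	"github.com/Azure/azure-storage-blob-go/azblob"
)

func main() {
	// Hypothetical SAS URL for the destination blob; the token in the query
	// string only needs create/write permissions, not read or list.
	sasURL := "https://myaccount.blob.core.windows.net/collections/Collection-host1.zip?sv=...&sig=..."

	u, err := url.Parse(sasURL)
	if err != nil {
		log.Fatal(err)
	}

	// The SAS token in the URL carries the authorization, so an anonymous
	// credential is enough on the client side.
	p := azblob.NewPipeline(azblob.NewAnonymousCredential(), azblob.PipelineOptions{})
	blobURL := azblob.NewBlockBlobURL(*u, p)

	f, err := os.Open("Collection-host1.zip")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Upload the file as a block blob in chunks.
	_, err = azblob.UploadFileToBlockBlob(context.Background(), f, blobURL,
		azblob.UploadToBlockBlobOptions{})
	if err != nil {
		log.Fatal(err)
	}
}
```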

Appreciate the product, keep the good work up! 👍

ch0wm3in avatar Jun 29 '21 18:06 ch0wm3in

The issue we have with signed URLs is that the URL is single use: it is fine for creating a single collector, but if you run the collector on multiple systems, the first upload will work and the second might overwrite the first one or fail, depending on configuration. This is why we need a bucket and make the object name from the hostname and timestamp, so all the collections end up in the same place.
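
Roughly the kind of naming I mean (a sketch only; the exact format and helper here are illustrative, not Velociraptor's actual code):

```go
package main

import (
	"fmt"
	"os"
	"time"
)

// collectionKey builds a per-host, per-run object name so that several
// collectors uploading into the same container/bucket cannot clobber
// each other's collections.
func collectionKey() string {
	hostname, err := os.Hostname()
	if err != nil {
		hostname = "unknown-host"
	}
	ts := time.Now().UTC().Format("2006-01-02T15_04_05Z")
	return fmt.Sprintf("Collection-%s-%s.zip", hostname, ts)
}

func main() {
	// e.g. Collection-host1-2021-06-30T06_00_00Z.zip
	fmt.Println(collectionKey())
}
```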

Does Azure have a storage service with bucket-specific ACLs like AWS and GCP? I guess they must? Maybe we can add an Azure storage backend for it.

Alternatively maybe you can use an sftp gateway like this one?

https://github.com/Azure-Samples/sftp-creation-template/tree/master/

scudette avatar Jun 30 '21 06:06 scudette

Well, I did not know that a presigned URL at S3 was single use. A SAS at Azure is not: it is time based, and they do not even support single use AFAIK: https://feedback.azure.com/forums/217298-storage/suggestions/6070592-one-time-use-sas-tokens

The signature could be valid for years.
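
For example, a rough, untested sketch of minting a long-lived, write-only container SAS with the same preview SDK (account name, key and container are placeholders):

```go
// Untested sketch: mint a long-lived, write-only container SAS using
// github.com/Azure/azure-storage-blob-go. All names are placeholders.
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/Azure/azure-storage-blob-go/azblob"
)

func main() {
	credential, err := azblob.NewSharedKeyCredential("myaccount", "<base64-account-key>")
	if err != nil {
		log.Fatal(err)
	}

	// Create/write only and valid for one year: a token extracted from a
	// collector cannot be used to read or list existing collections.
	sasParams, err := azblob.BlobSASSignatureValues{
		Protocol:      azblob.SASProtocolHTTPS,
		ExpiryTime:    time.Now().UTC().AddDate(1, 0, 0),
		ContainerName: "collections",
		Permissions:   azblob.ContainerSASPermissions{Create: true, Write: true}.String(),
	}.NewSASQueryParameters(credential)
	if err != nil {
		log.Fatal(err)
	}

	// Append the encoded SAS to the container URL and embed that in the collector.
	fmt.Printf("https://myaccount.blob.core.windows.net/collections?%s\n", sasParams.Encode())
}
```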

Azure blob storage under a storage account is their direct equivalent to S3, I'm afraid.

The SFTP gateway would still only make this about as attractive as the S3 proxy solution: less simple and more overhead.

ch0wm3in avatar Jun 30 '21 22:06 ch0wm3in

The main alternate workarounds:

- SFTP: https://docs.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-support (issue: SFTP is more likely to be network-restricted than HTTPS)
- download/use/embed azcopy: https://github.com/Azure/azure-storage-azcopy, which is Go-based but does not seem to be usable as a library per https://github.com/Azure/azure-storage-azcopy/issues/722
- the azblob library seems to be the other option: https://docs.microsoft.com/en-us/samples/azure-samples/storage-blobs-go-quickstart/storage-blobs-go-quickstart/ (rough sketch below)
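
A rough, untested sketch of the azblob option, following the quickstart above (account, key, container and file names are placeholders; note it still uses the full account key, which has the downside discussed earlier):

```go
// Untested sketch of the azblob quickstart pattern: chunked upload of a
// collection to a container using shared key credentials. Placeholders only.
package main

import (
	"context"
	"fmt"
	"log"
	"net/url"
	"os"

	"github.com/Azure/azure-storage-blob-go/azblob"
)

func main() {
	accountName, accountKey := "myaccount", "<base64-account-key>"

	credential, err := azblob.NewSharedKeyCredential(accountName, accountKey)
	if err != nil {
		log.Fatal(err)
	}
	p := azblob.NewPipeline(credential, azblob.PipelineOptions{})

	// Container URL, e.g. https://myaccount.blob.core.windows.net/collections
	u, err := url.Parse(fmt.Sprintf("https://%s.blob.core.windows.net/collections", accountName))
	if err != nil {
		log.Fatal(err)
	}
	containerURL := azblob.NewContainerURL(*u, p)
	blobURL := containerURL.NewBlockBlobURL("Collection-host1.zip")

	f, err := os.Open("Collection-host1.zip")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Chunked upload with some parallelism, as in the quickstart.
	_, err = azblob.UploadFileToBlockBlob(context.Background(), f, blobURL,
		azblob.UploadToBlockBlobOptions{
			BlockSize:   4 * 1024 * 1024,
			Parallelism: 16,
		})
	if err != nil {
		log.Fatal(err)
	}
}
```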

juju4 avatar Feb 19 '22 14:02 juju4

This is now implemented natively in #2616

scudette avatar Apr 25 '23 05:04 scudette