
Is there any way to set per-bucket credentials?


Our Spark deployment is a long-running Spark session, but we need to access different buckets over its lifetime, and our permission setup requires different credentials for different buckets. It seems the GCS Hadoop connector can only be configured with a single set of credentials for everything, whereas the S3 and Azure connectors both allow fine-grained access control at the bucket/container level. Can we do the same for GCS buckets?
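For comparison, here is a minimal sketch of the asymmetry being described, using Spark's `spark.hadoop.*` passthrough. The S3A side uses Hadoop's documented per-bucket override pattern (`fs.s3a.bucket.<bucket>.*`); the GCS side uses the connector's single, global service-account keyfile property (property names vary by connector version; the legacy `google.cloud.auth.*` names are assumed here). Bucket names, key values, and the keyfile path are placeholders.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("per-bucket-credentials-sketch")
  // S3A: credentials can be scoped to an individual bucket via the
  // fs.s3a.bucket.<bucket-name>.* property pattern.
  .config("spark.hadoop.fs.s3a.bucket.team-a-data.access.key", "TEAM_A_ACCESS_KEY")
  .config("spark.hadoop.fs.s3a.bucket.team-a-data.secret.key", "TEAM_A_SECRET_KEY")
  .config("spark.hadoop.fs.s3a.bucket.team-b-data.access.key", "TEAM_B_ACCESS_KEY")
  .config("spark.hadoop.fs.s3a.bucket.team-b-data.secret.key", "TEAM_B_SECRET_KEY")
  // GCS: only one global service-account keyfile can be configured,
  // so every gs:// bucket is accessed with the same identity.
  .config("spark.hadoop.google.cloud.auth.service.account.enable", "true")
  .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile",
    "/secrets/single-sa-key.json")
  .getOrCreate()
```

There is no per-bucket analogue of the `fs.s3a.bucket.<bucket>.*` pattern for `gs://` paths in the connector's configuration, which is the gap this request is about.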

ruiyang2015 — Sep 25 '21, 19:09

Are there any plans to implement this?

gegose — Nov 25 '22, 23:11

I'm facing the same challenge with the GCS connector; I also opened an issue: https://github.com/GoogleCloudDataproc/hadoop-connectors/issues/1009

josecsotomorales — Jun 02 '23, 16:06