hadoop-connectors
Is there any way to set per-bucket credentials?
Our Spark session is long-running, and we need to access different buckets over its lifespan. Our permission setup requires different credentials for different buckets, but the GCS Hadoop connector seems to accept only a single set of credentials for everything. The S3 and Azure connectors both allow fine-grained access control at the bucket/container level; can we do the same for GCS buckets?
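For comparison, here is a minimal sketch of what per-bucket credentials look like with the S3A connector in a Spark session: any `fs.s3a.*` option can be overridden per bucket via `fs.s3a.bucket.<bucket-name>.<option>` (documented Hadoop S3A behavior). The bucket name `finance-data`, the environment variable names, and the keyfile path below are made up for illustration. The GCS connector, by contrast, only exposes global credential keys such as `google.cloud.auth.service.account.json.keyfile`:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("per-bucket-credentials-sketch")
  // Global/default S3 credentials
  .config("spark.hadoop.fs.s3a.access.key", sys.env("DEFAULT_ACCESS_KEY"))
  .config("spark.hadoop.fs.s3a.secret.key", sys.env("DEFAULT_SECRET_KEY"))
  // Per-bucket override: applies only to s3a://finance-data/
  .config("spark.hadoop.fs.s3a.bucket.finance-data.access.key", sys.env("FINANCE_ACCESS_KEY"))
  .config("spark.hadoop.fs.s3a.bucket.finance-data.secret.key", sys.env("FINANCE_SECRET_KEY"))
  // GCS: one global credential for every gs:// bucket, no per-bucket variant
  .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile", "/path/to/key.json")
  .getOrCreate()
```

Until an equivalent override mechanism exists for `fs.gs.*`, the usual workarounds seem to be granting a single service account access to all the buckets, or constructing separate Hadoop `Configuration`/`FileSystem` instances with different keyfiles.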
Is there any plan to implement this?
I'm facing the same challenge with the GCS connector. I also created an issue: https://github.com/GoogleCloudDataproc/hadoop-connectors/issues/1009