spark-bigquery-connector
allow setting temporaryGcsBucket creds in the DataFrameWriter's option
It seems we have to set the global `spark.hadoop.google.cloud.auth.service.account.json.keyfile` property for the temporary GCS bucket. We would like to be able to pass the keyfile for the temporary bucket as an option on the DataFrameWriter instead of setting a global Spark Hadoop config.
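For context, a minimal sketch of the current workaround in Scala: the GCS credentials are set once on the SparkSession via the Hadoop config, and the write only carries the `temporaryGcsBucket` option. The bucket, table, keyfile path, and the commented-out per-write credential option are placeholders/assumptions, not existing connector options.

```scala
import org.apache.spark.sql.SparkSession

object BigQueryWriteExample {
  def main(args: Array[String]): Unit = {
    // Today the keyfile for the temporary GCS bucket has to be set globally
    // on the session (or via --conf), affecting every job in the application.
    val spark = SparkSession.builder()
      .appName("bigquery-write")
      .config("spark.hadoop.google.cloud.auth.service.account.json.keyfile",
              "/path/to/keyfile.json") // global Hadoop config, not per-write
      .getOrCreate()

    val df = spark.read.parquet("gs://some-bucket/input") // placeholder input

    df.write
      .format("bigquery")
      .option("temporaryGcsBucket", "my-temp-bucket") // staging bucket for the indirect write
      // Desired (hypothetical) per-write option requested in this issue,
      // instead of the global spark.hadoop.* config above:
      // .option("temporaryGcsBucketKeyFile", "/path/to/keyfile.json")
      .save("my_dataset.my_table")
  }
}
```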