Using browser authorization url but shows "401 Error: deleted_client"
I tried to use browser authorization but kept getting the 401.

```python
from gcsfs import GCSFileSystem

fs = GCSFileSystem(token="browser")
```
system or package info:
- macOS: 13.0.1
- Python: 3.8.13
- fsspec: 2022.10.0, 2022.3.0, 2022.2.0
- gcsfs: 2022.10.0, 2022.3.0, 2022.2.0
I've tried several gcsfs versions but still get the 401.
Can you please try with the following?
```diff
--- a/gcsfs/credentials.py
+++ b/gcsfs/credentials.py
@@ -22,9 +22,9 @@ logger = logging.getLogger("gcsfs.credentials")
 tfile = os.path.join(os.path.expanduser("~"), ".gcs_tokens")

 not_secret = {
-    "client_id": "586241054156-9kst7ltfj66svc342pcn43vp6ta3idin"
-    ".apps.googleusercontent.com",
-    "client_secret": "xto0LIFYX35mmHF9T1R2QBqT",
+    "client_id": "586241054156-1kkkvurv2ubnpnmmebs1bj4tc8j8"
+    "nica.apps.googleusercontent.com",
+    "client_secret": "vWcs6cG8VdCBt1yd9viU23_s",
 }
```
(or you can dynamically update gcsfs.credentials.client_config["installed"] if you prefer)
Thanks @martindurant , I updated the client config dynamically as follows:
```python
import gcsfs
from gcsfs import GCSFileSystem

gcsfs.credentials.client_config["installed"] = {
    "client_id": "586241054156-1kkkvurv2ubnpnmmebs1bj4tc8j8"
    "nica.apps.googleusercontent.com",
    "client_secret": "vWcs6cG8VdCBt1yd9viU23_s",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
}
fs = GCSFileSystem(token="browser")
```
The new client_id and client_secret are working. I can see the login page.

However, after I logged in, the next page showed "this app is blocked".

- Q1: It worked after changing the client_id and client_secret. Does that mean Google's OAuth2 server side changed, or were the old client_id and client_secret simply invalidated?
- Q2: How can I avoid the "this app is blocked" page when using browser authorization in my environment, such as a Jupyter notebook instance on Kubeflow?
Hello, I'm facing the same issue; the workaround didn't work for me:

```python
import gcsfs
from gcsfs import GCSFileSystem

gcsfs.credentials.client_config["installed"] = {
    "client_id": "586241054156-1kkkvurv2ubnpnmmebs1bj4tc8j8"
    "nica.apps.googleusercontent.com",
    "client_secret": "vWcs6cG8VdCBt1yd9viU23_s",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
}
fs = GCSFileSystem(token="browser")
```
Does anyone here have a contact at Google? The client used in this package is no longer allowed for browser auth by Google, and there is no way I know of to register a new one for a non-website-based flow. Another package, gdrivefs, uses pydata_google_auth to make credentials, and this works fine with gdrive (i.e., personal) scopes, but "exposes sensitive information" for GCS. Can anyone help?
@magloirend, in practice the only workaround for now is to download a JSON token file from the Google Cloud console, and/or use the gcloud CLI locally and whatever magic it performs to get a local credentials file.
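For anyone arriving at this thread, a minimal sketch of that workaround, assuming gcsfs is installed and you have either downloaded a service-account key file from the Google Cloud console or run `gcloud auth application-default login` (the key-file path below is a placeholder, not a real file):

```python
def gcs_filesystem_from_key(key_path="/path/to/service-account-key.json"):
    """Build a GCSFileSystem from a downloaded JSON key file.

    The default path is a placeholder -- substitute the key file
    you downloaded from the Google Cloud console.
    """
    # Imported lazily so this sketch can be read/loaded without gcsfs.
    from gcsfs import GCSFileSystem

    # gcsfs accepts a path to a credentials JSON file as the token.
    return GCSFileSystem(token=key_path)


# Alternatively, after `gcloud auth application-default login`,
# gcsfs can pick up the application-default credentials:
#     fs = GCSFileSystem(token="google_default")
```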
Alright, thank you for your answer, but I also hit the bug inside a Cloud Function when I use token='cloud'. Am I the only one?