[Bug]: Vertex AI service account JSON - unable to resolve 'environment_id' field
What happened?
When using the litellm SDK version 1.48.7 like this:
from litellm import completion
import json

## GET CREDENTIALS
file_path = 'PATH_TO_JSON'

# Load the JSON file
with open(file_path, 'r') as file:
    vertex_credentials = json.load(file)

# Convert to JSON string
vertex_credentials_json = json.dumps(vertex_credentials)

response = completion(
    model="vertex_ai/gemini-pro",
    messages=[
        {"content": "You are a good bot.", "role": "system"},
        {"content": "tell me poem on pasta", "role": "user"},
    ],
    vertex_credentials=vertex_credentials_json,
    vertex_project="my_project_id",
    vertex_location="us-central1",
)
we are seeing this error with the following stack trace:
Traceback (most recent call last):
  File "/home/appuser/.local/lib/python3.11/site-packages/litellm/main.py", line 2280, in completion
    model_response = vertex_chat_completion.completion(  # type: ignore
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/vertex_and_google_ai_studio_gemini.py", line 1208, in completion
    _auth_header, vertex_project = self._ensure_access_token(
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/vertex_llm_base.py", line 137, in _ensure_access_token
    self._credentials, cred_project_id = self.load_auth(
                                         ^^^^^^^^^^^^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/vertex_llm_base.py", line 79, in load_auth
    creds = identity_pool.Credentials.from_info(json_obj)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/google/auth/identity_pool.py", line 425, in from_info
    return super(Credentials, cls).from_info(info, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/google/auth/external_account.py", line 591, in from_info
    return cls(
           ^^^^
  File "/home/appuser/.local/lib/python3.11/site-packages/google/auth/identity_pool.py", line 273, in __init__
    raise exceptions.MalformedError(
google.auth.exceptions.MalformedError: Invalid Identity Pool credential_source field 'environment_id'
Whereas when I use the Vertex AI SDK itself with the same service account credentials file, like this:
import os

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "PATH_TO_JSON"
os.environ["VERTEXAI_LOCATION"] = "us-central1"
os.environ["VERTEXAI_PROJECT"] = "my_project_id"

import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my_project_id", location="us-central1")
model = GenerativeModel("gemini-pro")
response = model.generate_content("tell me poem on pasta")
print(response.text)
Then it works fine.
My service account file looks like this:
{
  "type": "external_account",
  "audience": "....",
  "subject_token_type": "....",
  "service_account_impersonation_url": "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/.....iam.gserviceaccount.com:generateAccessToken",
  "token_url": "https://sts.googleapis.com/v1/token",
  "credential_source": {
    "environment_id": "aws1",
    "region_url": "....",
    "url": "....",
    "regional_cred_verification_url": "...."
  }
}
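Note: the credential_source.environment_id = "aws1" field marks this file as an AWS workload identity federation config. google-auth handles that type with google.auth.aws.Credentials, while the traceback above shows litellm constructing identity_pool.Credentials directly, which rejects the field. A minimal sketch of loading the same file through google-auth's generic loader, which dispatches on the credential type (assumes google-auth >= 2.6; PATH_TO_JSON is the same placeholder as above):

import json
import google.auth

# Load the same external_account JSON shown above.
with open('PATH_TO_JSON', 'r') as f:
    info = json.load(f)

# load_credentials_from_dict() inspects "type" and credential_source and
# returns the matching credential class (google.auth.aws.Credentials when
# environment_id is "aws1"), instead of assuming identity_pool.Credentials.
credentials, project_id = google.auth.load_credentials_from_dict(
    info, scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
print(type(credentials).__name__, project_id)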
google.auth.exceptions.MalformedError: Invalid Identity Pool credential_source field 'environment_id'
This error is coming from the Google SDK, not litellm.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"]="PATH_TO_JSON"
Try using the env var for litellm instead, and see if that works @vaghelarahul94
Hi @krrishdholakia, thanks for your response! I appreciate it.
I updated the code to use the os.environ["GOOGLE_APPLICATION_CREDENTIALS"]="PATH_TO_JSON" environment variable. Additionally, I commented out a few other sections.
response = completion(
    model="vertex_ai/gemini-pro",
    messages=[
        {"content": "You are a good bot.", "role": "system"},
        {"content": "tell me poem on pasta", "role": "user"},
    ],
    # vertex_credentials=vertex_credentials_json,
    # vertex_project="my_project_id",
    # vertex_location="us-central1"
)
I am now seeing that the litellm SDK is asking for the serviceUsageConsumer role, whereas the Vertex AI SDK didn't ask for it. Why is this happening only when going through the litellm SDK?
litellm.exceptions.BadRequestError: litellm.BadRequestError: VertexAIException BadRequestError - ('Unable to acquire impersonated credentials', '{\n "error": {\n "code": 403,\n "message": "Caller does not have required permission to use project my_project_id. Grant the caller the roles/serviceusage.serviceUsageConsumer role, or a custom role with the serviceusage.services.use permission, by visiting https://console.developers.google.com/iam-admin/iam/project?project=my_project_id and then retry. Propagation of the new permission may take a few minutes.",\n "status": "PERMISSION_DENIED",\n "details": [\n {\n "@type": "type.googleapis.com/google.rpc.Help",\n "links": [\n {\n "description": "Google developer console IAM admin",\n "url": "https://console.developers.google.com/iam-admin/iam/project?project=my_project_id"\n }\n ]\n },\n {\n "@type": "type.googleapis.com/google.rpc.ErrorInfo",\n "reason": "USER_PROJECT_DENIED",\n "domain": "googleapis.com",\n "metadata": {\n "service": "iamcredentials.googleapis.com",\n "consumer": "projects/my_project_id"\n }\n }\n ]\n }\n}\n')
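For reference, the 403 above spells out the missing grant itself. A hedged example of the equivalent gcloud command (SA_EMAIL is a placeholder for the impersonated service account; adjust to your setup):

# Grant the role the error message asks for on the target project
gcloud projects add-iam-policy-binding my_project_id \
    --member="serviceAccount:SA_EMAIL" \
    --role="roles/serviceusage.serviceUsageConsumer"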
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Facing the same issue
Also have this issue; GCP and the SA are correctly configured.
Unable to repro when using a Vertex AI service account uploaded via the litellm UI (this does the same thing of converting to a JSON str).
{
  "type": "service_account",
  "project_id": "my-id",
  "private_key_id": "",
  "private_key": "",
  "client_email": "",
  "client_id": "",
  "auth_uri": "",
  "token_uri": "",
  "auth_provider_x509_cert_url": "",
  "client_x509_cert_url": "",
  "universe_domain": "googleapis.com"
}
This is my service account structure.
How can I create a service account similar to yours? @sammcj @vaghelarahul94
Sure thing, here's my json:
{
  "type": "service_account",
  "project_id": "redacted",
  "private_key_id": "redacted",
  "private_key": "redacted",
  "client_email": "[email protected]",
  "client_id": "redacted",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/samm-vertex-sa%40redacted.iam.gserviceaccount.com",
  "universe_domain": "googleapis.com"
}
The SA has access to the usual list of VertexAI services.
litellm config:
...
- model_name: gemini-2.0-pro-exp-02-05
  litellm_params:
    model: vertex_ai/gemini-2.0-pro-exp-02-05
    vertex_project: redacted
    vertex_location: us-east-5
@sammcj
I struggled with similar problems.
What worked for me is direct injection; I think litellm_params is not working for vertex_ai:
from litellm import completion
import json

# Load the service account file and convert it to a JSON string
with open('PATH_TO_JSON', 'r') as file:
    vertex_credentials_json = json.dumps(json.load(file))

response = completion(
    model="vertex_ai/gemini-2.5-pro-preview-03-25",
    messages=[
        {"content": "You are a good bot.", "role": "system"},
        {"content": "tell me poem on pasta", "role": "user"},
    ],
    vertex_credentials=vertex_credentials_json,
)
print(response)
I'm able to reproduce consistently
Test
curl "http://localhost:4000/v1/chat/completions" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $LITELLM_API_KEY" \
-d '{
"model": "vertex_ai/gemini-2.0-flash",
"messages": [
{
"role": "user",
"content": "Write a one-sentence bedtime story about a unicorn."
}
]
}'
Steps to reproduce:
- Remove all docker images related to litellm
- Clone the litellm repo
- Add the necessary .env:
  LITELLM_MASTER_KEY="foo"
  LITELLM_SALT_KEY="bar"
- Update docker-compose.yml (optional):
  environment:
    LITELLM_LOG: "DEBUG"  # Add this
    DATABASE_URL: "postgresql://llmproxy:dbpassword9090@db:5432/litellm"
    STORE_MODEL_IN_DB: "True"  # allows adding models to proxy via UI
- docker-compose up
- Open http://localhost:4000/ui/?login=success&page=models
- Add Model (tab):
  - Provider: Vertex AI
  - LiteLLM Model Name(s): All Vertex_AI Models (Wildcard)
  - Vertex Project: (...)
  - Vertex Location: (...)
  - Vertex Credentials: (...)
- Test Connect
- Add Model
- Create an API key in LiteLLM
- Execute the test command
- FAILS
This check in litellm:

if (
    project_id is not None
    and credential_project_id
    and credential_project_id != project_id
):
    raise ValueError(
        "Could not resolve project_id. Credential project_id: {} does not match requested project_id: {}".format(
            _credentials.quota_project_id, project_id
        )
    )

But the code execution doesn't stop. It continues to make the API call to Vertex AI with this error as a param, due to which the Vertex API throws the error:
... environment_id ...

Then:
- docker-compose down
- docker-compose up
- Execute the test command
- SUCCESS
Probable Cause
_credentials_project_mapping is not updated after adding?
self._credentials_project_mapping: Dict[
    Tuple[Optional[VERTEX_CREDENTIALS_TYPES], Optional[str]],
    GoogleCredentialsObject,
] = {}
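A minimal sketch of that hypothesis (hypothetical names, not litellm's actual code): if entries keyed by (credentials, project) are never invalidated, a value cached before the model/credentials were added via the UI keeps being served until the process restarts, which would match the docker-compose down/up workaround above.

from typing import Dict, Optional, Tuple

class CredentialCache:
    """Illustrative only: caches one loaded credential per (creds_json, project)."""

    def __init__(self) -> None:
        self._credentials_project_mapping: Dict[
            Tuple[Optional[str], Optional[str]], object
        ] = {}

    def get_or_load(self, creds_json: Optional[str], project_id: Optional[str]) -> object:
        key = (creds_json, project_id)
        if key not in self._credentials_project_mapping:
            # Nothing ever removes or refreshes an entry, so whatever gets
            # stored here (including state captured before a UI update) is
            # reused for every later request until the process restarts.
            self._credentials_project_mapping[key] = self._load(creds_json, project_id)
        return self._credentials_project_mapping[key]

    def _load(self, creds_json: Optional[str], project_id: Optional[str]) -> object:
        # Stand-in for litellm's load_auth(); details omitted.
        return object()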
Also experiencing this issue when using a credential config that is meant to give AWS resources access to GCP through identity federation.
To create an AWS-based credential configuration for your project, run:
$ gcloud iam workload-identity-pools create-cred-config \
    projects/$PROJECT_NUMBER/locations/$REGION/workloadIdentityPools/$WORKLOAD_POOL_ID/providers/$PROVIDER_ID \
    --service-account=$EMAIL --aws --enable-imdsv2 \
    --output-file=credentials.json
This will create a credentials.json with a structure similar to the one in the original post, which LiteLLM throws an error on:
"credential_source": {
"environment_id": "aws1",
"region_url": "....",
"url": "....",
"regional_cred_verification_url": "...."
}
litellm.InternalServerError: VertexAIException InternalServerError - Invalid Identity Pool credential_source field 'environment_id'.
However, working directly with the Vertex SDK as mentioned above works fine.
I am also getting this exact error. My credential file works as expected with gcloud and the google genai python api.
litellm.APIConnectionError: Invalid Identity Pool credential_source field 'environment_id'
I was able to fix the issue by not passing credentials to litellm.completion as shown in the docs, and instead setting the following two environment variables:

GOOGLE_CLOUD_PROJECT=your-project
GOOGLE_APPLICATION_CREDENTIALS=path/to/credentials.json
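A minimal sketch of that working setup (same placeholders as above; the env vars must be set before the call so google-auth's Application Default Credentials flow resolves the external_account file itself):

import os

# Let ADC resolve the workload identity federation file instead of
# passing vertex_credentials through litellm.
os.environ["GOOGLE_CLOUD_PROJECT"] = "your-project"
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path/to/credentials.json"

from litellm import completion

response = completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "tell me poem on pasta"}],
)
print(response)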