[BUG] 400 Reasoning Engine Execution failed on LLM Auditor
Description of issue
Querying a successfully deployed Vertex AI Agent Engine resource fails immediately with a FAILED_PRECONDITION (400) error, indicating an internal runtime crash ("Cannot send a request, as the client has been closed") during session creation.
pip show google-cloud-aiplatform
Name: google-cloud-aiplatform
Version: 1.125.0
The agent runs successfully when tested locally using InMemoryRunner, and the required IAM role (Vertex AI User) has been granted to the Agent Engine's service account. The SDK version (1.125.0) is up to date. The problem therefore appears to be a missing configuration detail in the managed cloud environment.
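For context, the local test that passes looks roughly like the sketch below; the `root_agent` import path and the sample prompt are assumptions based on the llm-auditor sample layout rather than copied from the repo.

```python
# Local smoke test with InMemoryRunner (sketch; import path and prompt are assumed).
import asyncio

from google.adk.runners import InMemoryRunner
from google.genai import types

from llm_auditor.agent import root_agent  # assumed module path for the sample agent


async def main() -> None:
    runner = InMemoryRunner(agent=root_agent, app_name="llm_auditor")

    # Sessions are created through the runner's in-memory session service.
    session = await runner.session_service.create_session(
        app_name="llm_auditor", user_id="new_user"
    )

    message = types.Content(
        role="user",
        parts=[types.Part(text="Earth is further away from the Sun than Mars.")],
    )

    # Stream events from the agent; this completes without errors locally.
    async for event in runner.run_async(
        user_id="new_user", session_id=session.id, new_message=message
    ):
        if event.content and event.content.parts and event.content.parts[0].text:
            print(event.content.parts[0].text)


if __name__ == "__main__":
    asyncio.run(main())
```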
Error log
---
Traceback (most recent call last):
File ".../venv/lib/python3.13/site-packages/google/api_core/grpc_helpers.py", line 75, in error_remapped_callable
return callable_(*args, **kwargs)
... (gRPC internal traces removed for clarity) ...
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.FAILED_PRECONDITION
details = "Reasoning Engine Execution failed.
Please refer to our documentation (https://cloud.google.com/vertex-ai/generative-ai/docs/agent-engine/troubleshooting/use) for checking logs and other troubleshooting tips.
Error Details: {"detail":"Agent Engine Error: An error occurred during invocation. Exception: Cannot send a request, as the client has been closed.\nRequest Data: {'user_id': 'new_user'}"}"
debug_error_string = "UNKNOWN:Error received from peer ipv4:142.250.189.234:443 {grpc_message:"Reasoning Engine Execution failed.\nPlease refer to our documentation (https://cloud.google.com/vertex-ai/generative-ai/docs/agent-engine/troubleshooting/use) for checking logs and other troubleshooting tips.\nError Details: {\"detail\":\"Agent Engine Error: An error occurred during invocation. Exception: Cannot send a request, as the client has been closed.\\nRequest Data: {\'user_id\': \'new_user\'}\"}", grpc_status:9}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File ".../llm_audit_test.py", line 9, in <module>
session = agent_engine.create_session(user_id="new_user")
File ".../venv/lib/python3.13/site-packages/vertexai/agent_engines/_agent_engines.py", line 1553, in _method
response = self.execution_api_client.query_reasoning_engine(
request=aip_types.QueryReasoningEngineRequest(
),
)
File ".../venv/lib/python3.13/site-packages/google/cloud/aiplatform_v1/services/reasoning_engine_execution_service/client.py", line 866, in query_reasoning_engine
response = rpc(
request,
metadata=metadata,
)
File ".../venv/lib/python3.13/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
return wrapped_func(*args, **kwargs)
File ".../venv/lib/python3.13/site-packages/google/api_core/grpc_helpers.py", line 77, in error_remapped_callable
raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.FailedPrecondition: 400 Reasoning Engine Execution failed.
Please refer to our documentation (https://cloud.google.com/vertex-ai/generative-ai/docs/agent-engine/troubleshooting/use) for checking logs and other troubleshooting tips.
Error Details: {"detail":"Agent Engine Error: An error occurred during invocation. Exception: Cannot send a request, as the client has been closed.\nRequest Data: {'user_id': 'new_user'}"}
Reproduction steps or code
- Deploy the LLM Auditor sample agent to Vertex AI Agent Engine.
- Interact with the deployed agent programmatically in Python, using the sample code at https://github.com/google/adk-samples/tree/main/python/agents/llm-auditor (a minimal version of the failing client script is sketched below).
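The client script referenced in the traceback as llm_audit_test.py boils down to the following sketch; the project, location, and resource ID are placeholders.

```python
# Minimal reproduction of the remote call that fails (placeholders for project,
# location, and the deployed reasoning engine ID).
import vertexai
from vertexai import agent_engines

vertexai.init(project="PROJECT_ID", location="LOCATION")

# Look up the already-deployed Agent Engine resource.
agent_engine = agent_engines.get(
    "projects/PROJECT_ID/locations/LOCATION/reasoningEngines/RESOURCE_ID"
)

# This is the call that raises FailedPrecondition (400), even though the same
# flow works locally.
session = agent_engine.create_session(user_id="new_user")
print(session)
```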
Hi, can I handle this issue?
Hi, can I handle this issue? @marktech0813, do you have a solution for this?
Having the same issue. Any help would be appreciated. Thanks.
Sure, I will handle this issue quickly. Thanks.
I’m going to update the llm_auditor deployment to use modern, compatible dependency versions for the managed Agent Engine runtime, which likely addresses the httpx “client has been closed” crash coming from older libraries.
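For reference, a minimal sketch of what that redeployment could look like, assuming the sample is deployed through the Vertex AI SDK's agent_engines.create(); the version pins, import path, and package path below are illustrative placeholders rather than the final fix.

```python
# Redeployment sketch with explicit dependency pins (all pins are placeholders).
import vertexai
from vertexai import agent_engines
from vertexai.preview import reasoning_engines

from llm_auditor.agent import root_agent  # assumed module path for the sample agent

vertexai.init(
    project="PROJECT_ID",
    location="LOCATION",
    staging_bucket="gs://STAGING_BUCKET",
)

app = reasoning_engines.AdkApp(agent=root_agent, enable_tracing=True)

remote_app = agent_engines.create(
    agent_engine=app,
    display_name="llm-auditor",
    # Pin the stack the managed runtime installs so it does not resolve to older
    # libraries that can reuse a closed httpx client.
    requirements=[
        "google-cloud-aiplatform[adk,agent_engines]>=1.125.0",  # placeholder pin
        "google-adk>=1.0.0",    # placeholder pin
        "google-genai>=1.0.0",  # placeholder pin
    ],
    extra_packages=["./llm_auditor"],  # ship the agent package with the deployment
)
print(remote_app.resource_name)
```

Once the exact versions are settled, the same pins would also need to land in the sample's dependency manifest.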