[Bug]: Gemini 3.0 requires a thought_signature for tool calls
What happened?
Gemini 3.0 requires a thought_signature for tool calls, but litellm doesn't preserve this signature in streaming mode, causing errors on subsequent turns.
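For context, a minimal sketch of the behavior the Gemini API expects per the linked docs (https://ai.google.dev/gemini-api/docs/thought-signatures): when the model streams back a `functionCall` part it may carry a `thoughtSignature`, and the client must echo that signature verbatim when the part is replayed in the conversation history on the next turn. The chunk shapes and signature value below are made up for illustration; this is not litellm's actual internal code.

```python
# Hypothetical sketch: accumulate a streamed functionCall part while
# preserving its thoughtSignature, instead of dropping it (the bug here).

def merge_streamed_function_call(chunks):
    """Merge partial streamed "part" dicts into one functionCall part.

    The exact chunk shape is an assumption for illustration; the key point
    is that thoughtSignature must survive the merge.
    """
    merged = {"functionCall": {"name": None, "args": {}}}
    for chunk in chunks:
        fc = chunk.get("functionCall", {})
        if "name" in fc:
            merged["functionCall"]["name"] = fc["name"]
        merged["functionCall"]["args"].update(fc.get("args", {}))
        # The fix in question: carry the signature through.
        if "thoughtSignature" in chunk:
            merged["thoughtSignature"] = chunk["thoughtSignature"]
    return merged

# Simulated stream for a tool call (values invented for illustration).
stream = [
    {"functionCall": {"name": "get_latest_news_summary", "args": {}},
     "thoughtSignature": "CuYBAXr..."},
    {"functionCall": {"args": {"topic": "ai"}}},
]

part = merge_streamed_function_call(stream)

# On the next turn, the model-authored part goes back into the history
# as-is; omitting thoughtSignature here is what triggers the
# 400 INVALID_ARGUMENT shown in the log output below.
history_turn = {"role": "model", "parts": [part]}
```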
Relevant log output
[STREAM] Processing error: litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError -
{
  "error": {
    "code": 400,
    "message": "Function call is missing a thought_signature in functionCall parts. This is required for tools to work correctly, and missing thought_signature may lead to degraded model performance. Additional data, function call `default_api:get_latest_news_summary`, position 2. Please refer to https://ai.google.dev/gemini-api/docs/thought-signatures for more details.",
    "status": "INVALID_ARGUMENT"
  }
}
Original exception: BadRequestError: litellm.BadRequestError: Vertex_ai_betaException BadRequestError - (same 400 INVALID_ARGUMENT payload as above)
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.80.0
Twitter / LinkedIn details
No response
Hey, looks like there is a PR open for this fix: https://github.com/BerriAI/litellm/pull/16895
@renning22 This should be fixed in the latest release. Can you check? I can reopen the issue if it still doesn't work
Thanks, confirmed fixed!