
google/gemini-3-pro returns 400 INVALID_ARGUMENT error on tool use

Open · ldriss opened this issue 1 month ago · 14 comments

Hi team,

All models work, but when I test the recent Gemini 3 on tool use it fails.

Model: google/gemini-3-pro
SDK: AI SDK 6 (via @openrouter/ai-sdk-provider)
Status Code: 400

Error Response:

{ "error": { "code": 400, "message": "Request contains an invalid argument.", "status": "INVALID_ARGUMENT" }}

Request Config:

Tools: multiple tool definitions
Streaming: streamText()
Reasoning config: { effort: 'medium' } (for Gemini models with tools)
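
For reference, roughly this setup reproduces it (a minimal sketch; the tool name, schema, and prompt are hypothetical, and exactly how the reasoning option is passed may differ by provider version):

import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText, tool } from 'ai';
import { z } from 'zod';

const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const result = streamText({
  model: openrouter('google/gemini-3-pro', {
    // assumption: reasoning settings are accepted here; depending on the
    // provider version this may instead go through extraBody
    reasoning: { effort: 'medium' },
  }),
  tools: {
    // hypothetical tool, for illustration only
    readFile: tool({
      description: 'Read a file from the workspace',
      inputSchema: z.object({ path: z.string() }),
      execute: async ({ path }) => `contents of ${path}`,
    }),
  },
  prompt: 'Read package.json and summarize it.',
});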

This seems to be related to the following update in the docs: https://ai.google.dev/gemini-api/docs/thought-signatures

Thanks

ldriss commented Nov 19 '25 08:11

Yeah, I am also experiencing this.

Error log:

{
  "error": {
    "message": "Provider returned error",
    "code": 400,
    "metadata": {
      "raw": "Gemini models require OpenRouter reasoning details to be preserved in each request. Please refer to our docs: https://openrouter.ai/docs/use-cases/reasoning-tokens#preserving-reasoning-blocks. Upstream error: {\n  \"error\": {\n    \"code\": 400,\n    \"message\": \"Function call is missing a thought_signature in functionCall parts. This is required for tools to work correctly and missing thought_signature may lead to degraded model performance. Additional data function call `default_api:[redacted]` position 2. please refer to https\",\n    \"status\": \"INVALID_ARGUMENT\"\n  }\n}\n",
      "provider_name": "Google AI Studio"
    }
  }
}

I tried looking at the documentation, but I am unsure how to ensure that the reasoning_details / thought_signature are preserved.
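
My reading of the linked docs (a sketch under that assumption; the tool definitions and the tool result below are placeholders) is that the assistant message has to be echoed back verbatim on the next turn, reasoning_details included:

const headers = {
  Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
  'Content-Type': 'application/json',
};
const messages: any[] = [{ role: 'user', content: 'Read package.json' }];
const tools = [/* same tool definitions as in the failing request */];

const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers,
  body: JSON.stringify({ model: 'google/gemini-3-pro', messages, tools }),
});
const assistant = (await res.json()).choices[0].message;

// The step the error message is about: push the assistant message back
// untouched, so its reasoning_details (which carry the thought_signature)
// survive into the next request.
messages.push(assistant);
messages.push({
  role: 'tool',
  tool_call_id: assistant.tool_calls[0].id,
  content: '"placeholder tool result"',
});

await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers,
  body: JSON.stringify({ model: 'google/gemini-3-pro', messages, tools }),
});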

This might be upstream-related though: https://github.com/vercel/ai/issues/10344

seannetlife commented Nov 19 '25 09:11

@seannetlife good catch on the upstream issue. Yeah, I got this error at first, then added the reasoning effort setting per https://openrouter.ai/docs/use-cases/reasoning-tokens, and then got this new error.

The model itself works fine, but as soon as it starts using tools it fails.

ldriss commented Nov 19 '25 14:11

The upstream fix has been merged 🎉

seannetlife commented Nov 19 '25 17:11

On the latest ai-sdk-provider and still seeing 400 Request contains an invalid argument. with no other details.

xvvvyz commented Nov 19 '25 21:11

@xvvvyz I guess because the merge was for the Gemini SDK provider itself, not the OpenRouter provider.

ldriss commented Nov 19 '25 22:11

Looks like it's being addressed https://github.com/OpenRouterTeam/ai-sdk-provider/pull/240

xvvvyz commented Nov 19 '25 23:11

That has been merged, but the release workflow didn't publish because the version already exists:

https://github.com/OpenRouterTeam/ai-sdk-provider/actions/runs/19523456268/job/55891295794#step:9:104

It might be that there is a secondary manual step in the release process that needs to be triggered, though.

@mattapperson can you confirm?

seannetlife commented Nov 20 '25 04:11

Facing the same issue.

AtiqGauri commented Nov 20 '25 15:11

Both the upstream fix and https://github.com/OpenRouterTeam/ai-sdk-provider/pull/240 seem to have been merged, and 30 minutes ago there was a release for 1.2.4, but I am still facing the same error. Is it working for anyone else on the new versions?

I can get a few tool calls through, but it fails on the 3rd or 4th tool call with the same issue.

niloyc commented Nov 20 '25 16:11

@niloyc you're lucky to even get to the 4th call; I can't make the 1st tool call, haha.

ldriss commented Nov 20 '25 17:11

@ldriss it may have been a fluke or something, because now I can't get a single tool call through, same as before.

I can see that the messages converted from UIMessage to ModelMessage have providerOptions with reasoning on the tool calls, but it still fails.
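
For anyone who wants to check the same thing, a quick way to dump what the conversion produces (a sketch; convertToModelMessages is the AI SDK helper in question, and the function name here is made up):

import { convertToModelMessages, type UIMessage } from 'ai';

// Dump the converted history so you can inspect whether reasoning metadata
// (providerOptions on the tool-call parts) survives the conversion.
function debugConvertedMessages(uiMessages: UIMessage[]) {
  const modelMessages = convertToModelMessages(uiMessages);
  console.dir(modelMessages, { depth: null });
  return modelMessages;
}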

niloyc commented Nov 20 '25 17:11

v1.2.5 seems to be working for me!

xvvvyz commented Nov 20 '25 20:11

@niloyc @xvvvyz yes, it's working. @mattapperson thanks, much appreciated!

ldriss commented Nov 20 '25 22:11

I am still having this issue with "@openrouter/ai-sdk-provider": "^1.2.8".

This is the response I got:

      "message": "Provider returned error",
      "code": 400,
      "metadata": {
        "raw": "Gemini models require OpenRouter reasoning details to be preserved in each request. Please refer to our docs: https://openrouter.ai/docs/guides/best-practices/reasoning-tokens#preserving-reasoning-blocks. Upstream error: {\n  \"error\": {\n    \"code\": 400,\n    \"message\": \"Unable to submit request because function call `default_api:readFile` in the 2. content block is missing a `thought_signature`. Learn more: https://docs.cloud.google.com/vertex-ai/generative-ai/docs/thought-signatures\",\n    \"status\": \"INVALID_ARGUMENT\"\n  }\n}\n",
        "provider_name": "Google"
      }
    },

From the previous iteration, I can see the model making the tool call with an encrypted reasoning detail:

"reasoning_details": [
            {
              "id": "tool_readFile_SOME_ID",
              "type": "reasoning.encrypted",
              "data": "SOME BASE 64 DATA",
              "format": "google-gemini-v1",
              "index": 0
            }
          ],

The failing request has this:

{
        "role": "assistant",
        "content": "",
        "tool_calls": [
          {
            "id": "tool_readFile_SOME_ID",
            "type": "function",
            "function": {
              "name": "readFile",
              "arguments": "{\"path\":[\"packages/core/src/workflow/agent.workflow.ts\",\"packages/core/src/workflow/agent.workflow.test.ts\"],\"includeIgnored\":false}"
            }
          }
        ],
        "reasoning": "**Investigating the Workflow**\n\nI've located `agent.workflow` within `packages/core/src/workflow/agent.workflow.ts`. Tests exist in a parallel file, `packages/core/src/workflow/agent.workflow.test`. I am now focusing on the typescript file.\n\n\n**Understanding the Code**\n\nI'm now diving into the `agent.workflow.ts` file to comprehend its core functionality.  Simultaneously, I'm reviewing the associated test file to grasp its behavior and how it's validated. My goal is to understand the code's purpose and identify any potential areas for optimization or concern, keeping in mind the user's role.\n\n\n[REDACTED]",
        "reasoning_details": [
          {
            "type": "reasoning.text",
            "text": "**Investigating the Workflow**\n\nI've located `agent.workflow` within `packages/core/src/workflow/agent.workflow.ts`. Tests exist in a parallel file, `packages/core/src/workflow/agent.workflow.test`. I am now focusing on the typescript file.\n\n\n**Understanding the Code**\n\nI'm now diving into the `agent.workflow.ts` file to comprehend its core functionality.  Simultaneously, I'm reviewing the associated test file to grasp its behavior and how it's validated. My goal is to understand the code's purpose and identify any potential areas for optimization or concern, keeping in mind the user's role.\n\n\n[REDACTED]"
          }
        ]
      },

So I guess the encrypted reasoning caused some issue? Comparing the two payloads, the model originally returned a reasoning.encrypted detail with the tool call, but the failing request's assistant message only carries reasoning.text, so the encrypted entry seems to get dropped when the history is rebuilt. I am also unsure how exactly the thought_signature was provided, because I can't see it locally; it must be stored on OpenRouter's server side?
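
If that's what's happening, a small check like this (a hypothetical sketch; the message shape is inferred from the payloads above) would flag the point where the encrypted entry disappears:

// Flag assistant tool-call messages whose reasoning_details lost their
// encrypted entry before the history is resent.
type ReasoningDetail = { type: string; text?: string; data?: string };

interface HistoryMessage {
  role: string;
  tool_calls?: unknown[];
  reasoning_details?: ReasoningDetail[];
}

function missingEncryptedReasoning(msg: HistoryMessage): boolean {
  // Only assistant messages that make tool calls need the signature-bearing block.
  if (msg.role !== 'assistant' || !msg.tool_calls?.length) return false;
  return !(msg.reasoning_details ?? []).some((d) => d.type === 'reasoning.encrypted');
}

// The failing request above would be flagged, since its assistant message
// only carries a reasoning.text detail.
const history: HistoryMessage[] = [
  {
    role: 'assistant',
    tool_calls: [{ id: 'tool_readFile_SOME_ID' }],
    reasoning_details: [{ type: 'reasoning.text', text: '[REDACTED]' }],
  },
];
console.log(history.filter(missingEncryptedReasoning));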

Note that some simple use cases (where the model does not produce encrypted reasoning) work fine for me.

xlc commented Nov 29 '25 01:11