Reasoning is not returned
Unlike other AI SDK providers that support reasoning, the reasoning property is not being returned when using the OpenRouter provider.
To clarify relative to #22 - we're seeing this issue on generateObject, not streamText.
I can confirm that when using plain generateText() there is no reasoning content in the response, unlike when using the API directly (same model -> deepseek/deepseek-r1-0528:free). I cannot test streamText() as it is currently broken -> #106
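For reference, the result below came from a call roughly like this (a minimal sketch of my setup; the API-key wiring and logging are just illustrative):

```ts
import { generateText } from 'ai'; // AI SDK v5 beta
import { createOpenRouter } from '@openrouter/ai-sdk-provider'; // 1.0.0-beta*

// Assumes OPENROUTER_API_KEY is set in the environment (illustrative setup).
const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const result = await generateText({
  model: openrouter('deepseek/deepseek-r1-0528:free'),
  prompt: 'Solve this step by step: What is 15 * 23 + 7?',
});

// Dump the whole result: no reasoning parts show up anywhere in it.
console.log(JSON.stringify(result, null, 2));
```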
Here is an example result without reasoning:
```json
{
  "steps": [
    {
      "content": [
        {
          "type": "text",
          "text": "To solve \\(15 \\times 23 + 7\\), follow the order of operations (PEMDAS/BODMAS), which dictates that multiplication and division are performed before addition and subtraction.\n\n1. **Perform the multiplication**: Calculate \\(15 \\times 23\\).\n - Break it down: \\(15 \\times 23 = (10 \\times 23) + (5 \\times 23)\\).\n - \\(10 \\times 23 = 230\\)\n - \\(5 \\times 23 = 115\\)\n - Add the results: \\(230 + 115 = 345\\).\n - Thus, \\(15 \\times 23 = 345\\).\n\n2. **Perform the addition**: Add 7 to the result from step 1.\n - \\(345 + 7 = 352\\).\n\n**Final Answer:** \n\\[ \\boxed{352} \\]"
        }
      ],
      "finishReason": "stop",
      "usage": {
        "inputTokens": 23,
        "outputTokens": 531,
        "totalTokens": 554,
        "reasoningTokens": 0,
        "cachedInputTokens": 0
      },
      "warnings": [],
      "request": {
        "body": {
          "model": "deepseek/deepseek-r1-0528:free",
          "messages": [
            {
              "role": "user",
              "content": "Solve this step by step: What is 15 * 23 + 7?"
            }
          ]
        }
      },
      "response": {
        "id": "gen-1752580221-Fy67T72CCGkfzOkdzunL",
        "timestamp": "2025-07-15T11:50:34.615Z",
        "modelId": "deepseek/deepseek-r1-0528:free",
        "headers": {
          "access-control-allow-origin": "*",
          "cf-ray": "95f90aae49ca4541-TXL",
          "connection": "keep-alive",
          "content-encoding": "gzip",
          "content-type": "application/json",
          "date": "Tue, 15 Jul 2025 11:50:25 GMT",
          "permissions-policy": "payment=(self \"https://checkout.stripe.com\" \"https://connect-js.stripe.com\" \"https://js.stripe.com\" \"https://*.js.stripe.com\" \"https://hooks.stripe.com\")",
          "referrer-policy": "no-referrer, strict-origin-when-cross-origin",
          "server": "cloudflare",
          "transfer-encoding": "chunked",
          "vary": "Accept-Encoding",
          "x-content-type-options": "nosniff"
        },
        "messages": [
          {
            "role": "assistant",
            "content": [
              {
                "type": "text",
                "text": "To solve \\(15 \\times 23 + 7\\), follow the order of operations (PEMDAS/BODMAS), which dictates that multiplication and division are performed before addition and subtraction.\n\n1. **Perform the multiplication**: Calculate \\(15 \\times 23\\).\n - Break it down: \\(15 \\times 23 = (10 \\times 23) + (5 \\times 23)\\).\n - \\(10 \\times 23 = 230\\)\n - \\(5 \\times 23 = 115\\)\n - Add the results: \\(230 + 115 = 345\\).\n - Thus, \\(15 \\times 23 = 345\\).\n\n2. **Perform the addition**: Add 7 to the result from step 1.\n - \\(345 + 7 = 352\\).\n\n**Final Answer:** \n\\[ \\boxed{352} \\]"
              }
            ]
          }
        ]
      },
      "providerMetadata": {
        "openrouter": {
          "usage": {
            "promptTokens": 23,
            "completionTokens": 531,
            "totalTokens": 554,
            "promptTokensDetails": {
              "cachedTokens": 0
            },
            "completionTokensDetails": {
              "reasoningTokens": 0
            }
          }
        }
      }
    }
  ]
}
```
Looks like I may have responded to an unrelated issue, sorry for that.
The older release works perfectly fine and returns reasoning content as expected:
"@openrouter/ai-sdk-provider": "^0.7.2",
"ai": "^4.3.16",
The current 1.0.0-beta* release, which uses the v5 beta of the Vercel AI SDK, seems to be the culprit.
The problem is that 1.0.0-beta* is currently the default version, so it is what gets installed if you follow the README.md instructions.
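As a workaround until the beta is fixed, pinning the older pair and reading the reasoning off the generateText result works; roughly (a sketch, assuming the v4-style result where reasoning is exposed as a string):

```ts
import { generateText } from 'ai'; // ^4.3.16
import { createOpenRouter } from '@openrouter/ai-sdk-provider'; // ^0.7.2

const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const { text, reasoning } = await generateText({
  model: openrouter('deepseek/deepseek-r1-0528:free'),
  prompt: 'Solve this step by step: What is 15 * 23 + 7?',
});

// On 0.7.2 + ai@4 the reasoning string is populated for this model;
// on 1.0.0-beta* no reasoning content comes back (see the dump above).
console.log(reasoning);
console.log(text);
```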
Oh oops
I think this can now be closed as fixed in the current release? @louisgv
I think so, please feel free to reopen if there are still issues