### Description
When I use @ai-sdk/openai to access OpenAI models that only support the Responses API (like gpt-5.1-codex), and I set `store` to `false` and `"include": ["reasoning.encrypted_content"]`, the second request fails...
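A minimal reproduction sketch, assuming AI SDK v5, the `@ai-sdk/openai` Responses model, and the provider option names shown; the prompts and follow-up turn are illustrative only:

```ts
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const providerOptions = {
  openai: {
    // Stateless mode: nothing is stored server-side, so the encrypted
    // reasoning content has to be round-tripped by the client.
    store: false,
    include: ['reasoning.encrypted_content'],
  },
};

// First request succeeds.
const first = await generateText({
  model: openai.responses('gpt-5.1-codex'),
  prompt: 'Summarize this repository.',
  providerOptions,
});

// Second request, continuing with the previous response messages,
// is where the failure described above shows up.
const second = await generateText({
  model: openai.responses('gpt-5.1-codex'),
  messages: [
    { role: 'user', content: 'Summarize this repository.' },
    ...first.response.messages,
    { role: 'user', content: 'Now list the open issues.' },
  ],
  providerOptions,
});
```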
## Environment
- Platform (select one):
  - [ ] Anthropic API
  - [x] AWS Bedrock
  - [ ] Google Vertex AI
  - [ ] Other:
- Claude CLI version: 1.0.102...
### Description
When I use @ai-sdk/openai-compatible to create a provider and use a GPT-5.x model, the request body contains `max_tokens` rather than `max_completion_tokens`, so the LLM returned an...
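A minimal sketch of the setup that hits this; the provider name, base URL, model id, and token limit are placeholders, and `maxOutputTokens` assumes AI SDK v5:

```ts
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

// Placeholder endpoint and credentials, for illustration only.
const provider = createOpenAICompatible({
  name: 'my-openai-compatible',
  baseURL: 'https://my-gateway.example.com/v1',
  apiKey: process.env.MY_API_KEY,
});

// With a GPT-5.x model, the serialized request body carries `max_tokens`
// instead of `max_completion_tokens`, which the upstream API rejects.
const result = await generateText({
  model: provider('gpt-5.1'),
  prompt: 'Hello',
  maxOutputTokens: 1024,
});
```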
Fixes Azure reasoning models (GPT-5, o1) by using `max_completion_tokens` instead of `max_tokens`.
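A sketch of the kind of request-body mapping this implies; the helper names and the model-id check are illustrative, not the provider's actual code:

```ts
// Hypothetical helper: reasoning model families (GPT-5, o1, ...) reject
// `max_tokens` and expect `max_completion_tokens` instead.
function isReasoningModel(modelId: string): boolean {
  return /^(gpt-5|o1|o3|o4)/i.test(modelId);
}

// Hypothetical mapping step: put the output-token limit under the
// parameter name the target model actually accepts.
function applyOutputTokenLimit(
  body: Record<string, unknown>,
  modelId: string,
  limit: number | undefined,
): Record<string, unknown> {
  if (limit === undefined) return body;
  return isReasoningModel(modelId)
    ? { ...body, max_completion_tokens: limit }
    : { ...body, max_tokens: limit };
}
```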
Currently, the default max output tokens for opencode is 32,000; however, the latest models such as GPT, Gemini, and Claude now support max output tokens exceeding 64,000, so the current...