feat(provider): add Responses API support for custom OpenAI-compatible providers
Summary
- Register `@opencode-ai/openai-compatible` as a bundled provider with Responses API support
- Add an `omitMaxOutputTokens` option for providers that don't support the `max_output_tokens` parameter
- Default to the `responses()` API, with an option to fall back to `chat()` via `useResponsesApi: false`
Motivation
Many OpenAI-compatible API proxies (OpenRouter, NewAPI, etc.) now support OpenAI's `/responses` endpoint. However, `@ai-sdk/openai-compatible` only supports the Chat Completions API.
This PR exposes the existing internal Responses API implementation (currently used only for GitHub Copilot) to custom providers.
Related: vercel/ai#9723 (stale, 795 commits behind)
Usage
```json
{
  "provider": {
    "my-provider": {
      "npm": "@opencode-ai/openai-compatible",
      "api": "https://my-api.com/v1",
      "env": ["MY_API_KEY"],
      "options": {
        "omitMaxOutputTokens": true
      },
      "models": {
        "gpt-5.2": {}
      }
    }
  }
}
```
Options
| Option | Type | Default | Description |
|---|---|---|---|
| `omitMaxOutputTokens` | boolean | `false` | Don't send the `max_output_tokens` parameter (some proxies don't support it) |
| `useResponsesApi` | boolean | `true` | Use the Responses API; set to `false` to use the Chat Completions API |
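For a proxy whose `/responses` endpoint is unreliable, both options can be combined to opt back into Chat Completions. A sketch (the provider name, URL, and model ID below are placeholders, not from this PR):

```json
{
  "provider": {
    "legacy-proxy": {
      "npm": "@opencode-ai/openai-compatible",
      "api": "https://legacy-proxy.example.com/v1",
      "env": ["LEGACY_PROXY_API_KEY"],
      "options": {
        "useResponsesApi": false,
        "omitMaxOutputTokens": true
      },
      "models": {
        "some-model": {}
      }
    }
  }
}
```

With `useResponsesApi: false`, requests go to the Chat Completions endpoint exactly as with the stock `@ai-sdk/openai-compatible` behavior.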