
feat(provider): add Responses API support for custom OpenAI-compatible providers

Open GreyElaina opened this issue 4 days ago • 1 comment

Summary

  • Register @opencode-ai/openai-compatible as a bundled provider with Responses API support
  • Add an omitMaxOutputTokens option for providers that don't support the max_output_tokens parameter
  • Default to the responses() API, with the option to fall back to chat() via useResponsesApi: false

Motivation

Many OpenAI-compatible API proxies (OpenRouter, NewAPI, etc.) now support OpenAI's /responses endpoint. However, @ai-sdk/openai-compatible only supports the Chat Completions API.

This PR exposes the existing internal Responses API implementation (currently used only for GitHub Copilot) to custom providers.

Related: vercel/ai#9723 (stale, 795 commits behind)

Usage

{
  "provider": {
    "my-provider": {
      "npm": "@opencode-ai/openai-compatible",
      "api": "https://my-api.com/v1",
      "env": ["MY_API_KEY"],
      "options": {
        "omitMaxOutputTokens": true
      },
      "models": {
        "gpt-5.2": {}
      }
    }
  }
}
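To illustrate the effect of omitMaxOutputTokens, here is a minimal sketch of how a provider could shape a Responses API request body. The buildRequestBody function and its shape are hypothetical illustrations, not opencode internals; only the option names mirror the config above.

```typescript
// Hypothetical sketch: how omitMaxOutputTokens could gate the
// max_output_tokens field in a Responses API request body.
interface ProviderOptions {
  omitMaxOutputTokens?: boolean; // default false
  useResponsesApi?: boolean;     // default true
}

function buildRequestBody(
  model: string,
  input: string,
  maxOutputTokens: number,
  options: ProviderOptions = {},
): Record<string, unknown> {
  const body: Record<string, unknown> = { model, input };
  // Some proxies reject max_output_tokens, so omit it when the flag is set.
  if (!options.omitMaxOutputTokens) {
    body.max_output_tokens = maxOutputTokens;
  }
  return body;
}
```

With the flag set, the parameter is simply never serialized, so proxies that would otherwise return a 400 for an unknown field accept the request.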

Options

| Option              | Type    | Default | Description |
|---------------------|---------|---------|-------------|
| omitMaxOutputTokens | boolean | false   | Don't send the max_output_tokens parameter (some proxies don't support it) |
| useResponsesApi     | boolean | true    | Use the Responses API; set to false to use the Chat Completions API |
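For proxies that only expose Chat Completions, the same provider block can opt out of the Responses API via useResponsesApi. This is a sketch based on the option described above; the endpoint URL and environment variable are placeholders.

```json
{
  "provider": {
    "my-provider": {
      "npm": "@opencode-ai/openai-compatible",
      "api": "https://my-api.com/v1",
      "env": ["MY_API_KEY"],
      "options": {
        "useResponsesApi": false
      },
      "models": {
        "gpt-5.2": {}
      }
    }
  }
}
```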

GreyElaina avatar Jan 08 '26 09:01 GreyElaina