[Feature Request] Support traditional Chat Completions API for OpenAI provider
### Feature hasn't been suggested before.
- [x] I have verified this feature I'm about to request hasn't been suggested before.
### Describe the enhancement you want to request

#### Description
The OpenAI provider in opencode uses the new Responses API format (SSE event stream) instead of the traditional Chat Completions API. Most OpenAI-compatible proxies only implement the Chat Completions API, causing compatibility issues.
#### Current Behavior
When configuring the OpenAI provider to point to an OpenAI-compatible proxy (e.g., Antigravity, LM Studio, Ollama, etc.), the proxy returns standard Chat Completions format, but opencode expects Responses API format, resulting in parse failures.
#### Error Message

```text
AI_TypeValidationError: Type validation failed
Invalid input: expected "response.output_text.delta"
```
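For context, the two streaming formats differ roughly as follows (simplified payloads, fields abbreviated). A Chat Completions proxy streams `chat.completion.chunk` objects, whereas opencode's OpenAI provider expects Responses API events such as `response.output_text.delta`:

```jsonc
// Chat Completions streaming chunk (what most proxies send) - simplified
{
  "object": "chat.completion.chunk",
  "choices": [
    { "index": 0, "delta": { "content": "Hello" }, "finish_reason": null }
  ]
}

// Responses API streaming event (what opencode expects) - simplified
{
  "type": "response.output_text.delta",
  "delta": "Hello"
}
```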
#### Reproduction Steps
- Configure the OpenAI provider to point to an OpenAI-compatible proxy:

  ```json
  {
    "provider": {
      "openai": {
        "options": {
          "apiKey": "your-api-key",
          "baseURL": "http://localhost:8045/v1"
        }
      }
    }
  }
  ```

- Run any opencode command
- A parse error occurs
#### Environment
- OS: Windows 11
- opencode version: v1.0.150+
- oh-my-opencode version: v2.14.0
#### Root Cause

opencode's OpenAI provider speaks OpenAI's newer Responses API rather than the widely adopted Chat Completions API, while most OpenAI-compatible proxies implement only the Chat Completions API.
#### Workaround

Use the Anthropic provider instead. If your proxy supports the Anthropic API format (`/v1/messages`):
```json
{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "your-api-key",
        "baseURL": "http://localhost:8045/v1"
      }
    }
  }
}
```
#### Expected Behavior
Ideally, opencode should support both:
- Traditional Chat Completions API (for compatibility with most proxies)
- New Responses API (for official OpenAI)
Or at least provide a configuration option to choose between them.
#### Proposed Solutions

- Add a `useResponsesApi` or `apiVersion` option to the OpenAI provider config (see the sketch after this list)
- Fall back to Chat Completions if the Responses API is not supported
- Support both formats based on provider configuration
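A hypothetical sketch of the first option, assuming a new `useResponsesApi` key (this key does not exist in opencode today; it only illustrates the proposal):

```jsonc
{
  "provider": {
    "openai": {
      "options": {
        "apiKey": "your-api-key",
        "baseURL": "http://localhost:8045/v1",
        // proposed flag, not currently supported by opencode:
        // false would make opencode call /v1/chat/completions instead of /v1/responses
        "useResponsesApi": false
      }
    }
  }
}
```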
#### Additional Notes
This is a design limitation, not a bug per se. However, it significantly reduces compatibility with OpenAI-compatible proxies, which are widely used by developers.