feat: add litellmProxy provider option for explicit LiteLLM compatibility
## Summary
This PR builds on #8497 and addresses @rekram1-node's feedback from #8248:
> "There are a LOT of providers that don't have this issue... ideally this would be strictly limited to litellm"
Instead of injecting the `_noop` tool unconditionally for all providers, this change restricts the injection to LiteLLM-backed providers.
## Problem
#8497 injects the `_noop` tool unconditionally whenever the message history contains tool calls. While this fixes the LiteLLM compatibility issue, it needlessly affects providers that don't need the workaround (Anthropic native, OpenRouter, Vertex, Bedrock, etc.).
## Solution
Add a `litellmProxy` provider option and restrict `_noop` injection to LiteLLM providers:
```ts
const isLiteLLMProxy =
  provider.options?.["litellmProxy"] === true ||
  input.model.providerID.toLowerCase().includes("litellm") ||
  input.model.api.id.toLowerCase().includes("litellm")

if (isLiteLLMProxy && Object.keys(tools).length === 0 && hasToolCalls(input.messages)) {
  tools["_noop"] = tool({ ... })
}
```
Detection methods:
- Auto-detect: provider ID or API ID contains "litellm"
- Explicit opt-in: provider has the `litellmProxy: true` option (for custom gateways)
Config example:
```json
{
  "provider": {
    "my-gateway": {
      "api": "openai",
      "options": {
        "litellmProxy": true
      }
    }
  }
}
```
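Note that for a provider configured like `my-gateway` above, neither the provider ID nor the API ID contains "litellm", so auto-detection alone would miss it; the explicit option covers exactly that case. A standalone sketch of the combined check, with the inputs flattened into plain strings and a flag for illustration (the real code reads them from the provider and model objects):

```typescript
// Illustrative predicate mirroring the detection logic described above.
// litellmProxyOption stands in for provider.options?.["litellmProxy"].
function detectsLiteLLMProxy(input: {
  litellmProxyOption?: boolean
  providerID: string
  apiID: string
}): boolean {
  return (
    input.litellmProxyOption === true ||
    input.providerID.toLowerCase().includes("litellm") ||
    input.apiID.toLowerCase().includes("litellm")
  )
}
```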
## Changes
| File | Change |
|---|---|
| `packages/opencode/src/session/llm.ts` | Add `isLiteLLMProxy` check with auto-detection + opt-in support |
| `packages/opencode/src/session/message-v2.ts` | Handle pending/running tool calls (from #8497) |
| `packages/opencode/test/session/llm.test.ts` | Tests for `hasToolCalls` helper (from #8497) |
| `packages/opencode/test/session/message-v2.test.ts` | Tests for pending/running tool conversion (from #8497) |
## Testing
- All tests pass (18 tests across 2 files)
- Manual testing against a custom LiteLLM gateway confirmed the fix works
## Related
- Builds on #8497
- Addresses feedback from #8248
- Fixes #8246
- Fixes #2915
Co-authored-by: Mark Henderson <[email protected]>