[FEATURE]: Use models.dev data for custom providers
Feature hasn't been suggested before.
- [x] I have verified this feature I'm about to request hasn't been suggested before.
Describe the enhancement you want to request
Currently, when using preconfigured providers (like Anthropic, OpenAI, etc.) with a whitelist, OpenCode automatically fetches model metadata (pricing, context limits, capabilities) from models.dev:
```json
{
  "provider": {
    "anthropic": {
      "whitelist": [
        "claude-sonnet-4-5",
        "claude-opus-4-5"
      ]
    }
  }
}
```
However, for custom providers, all model metadata must be manually specified:
```json
{
  "provider": {
    "customprovider": {
      "name": "AI Home",
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "apiKey": "{env:AIHOME_API_KEY}",
        "baseURL": "https://api.example.com"
      },
      "models": {
        "minimax/minimax-m2.1": {
          "limit": { "context": 1000000, "output": 32000 },
          "cost": { "input": 0.3, "output": 1.2 },
          "temperature": true,
          "reasoning": true,
          "tool_call": true
        }
      }
    }
  }
}
```
This is tedious, error-prone, and requires tracking pricing/limit changes manually.
Proposed Solution
Add a use_models_dev option that allows custom providers to reference model metadata from models.dev:
```json
{
  "provider": {
    "customprovider": {
      "name": "AI Home",
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "apiKey": "{env:API_KEY}",
        "baseURL": "https://api.example.com"
      },
      "models": {
        "minimax/minimax-m2.1": {
          "use_models_dev": "openrouter/minimax/minimax-m2.1"
        }
      }
    }
  }
}
```
This would automatically import (a rough sketch of the resulting shape follows this list):
- Context/input/output limits
- Pricing (input, output, cache read/write)
- Capabilities (tool calls, reasoning, temperature, attachments)
- Modalities (text, image, audio, video, PDF support)
- Variants and other metadata
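
Put roughly as a type, the imported entry could look like the sketch below. The field names here are illustrative assumptions and are not claimed to match the actual models.dev schema or OpenCode's internal types:

```ts
// Illustrative only: an assumed shape for the metadata pulled from models.dev.
interface ImportedModelMetadata {
  limit: { context: number; input?: number; output: number };
  cost: { input: number; output: number; cache_read?: number; cache_write?: number };
  temperature: boolean;   // whether the model accepts a temperature parameter
  reasoning: boolean;     // extended thinking / reasoning support
  tool_call: boolean;     // function/tool calling support
  attachment: boolean;    // file attachment support
  modalities?: { input: string[]; output: string[] }; // e.g. text, image, audio, video, pdf
}
```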
Benefits
- Less Configuration - No need to manually track pricing and limits
- Auto-Updates - When models.dev updates pricing or limits, OpenCode automatically picks up the latest values
- Accuracy - Ensures metadata matches the source provider
- Flexibility - Can still override specific fields when needed
User Override Support
Users should be able to override imported metadata:
```json
{
  "models": {
    "minimax/minimax-m2.1": {
      "use_models_dev": "openrouter/minimax/minimax-m2.1",
      "limit": {
        "context": 500000 // Override imported value
      }
    }
  }
}
```
Merge priority: user config > models.dev import > defaults
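
A minimal sketch of this precedence, assuming a shallow merge of top-level fields; the function name is hypothetical, not an existing OpenCode API:

```ts
// Hypothetical helper: resolve a model's effective metadata.
// Precedence: user config > models.dev import > built-in defaults.
type ModelMetadata = Record<string, unknown>;

function resolveModelMetadata(
  userConfig: ModelMetadata,       // fields written in the opencode config
  modelsDevImport: ModelMetadata,  // fields fetched via use_models_dev
  defaults: ModelMetadata          // OpenCode's fallback defaults
): ModelMetadata {
  // Later spreads win: user config overrides the import, which overrides defaults.
  return { ...defaults, ...modelsDevImport, ...userConfig };
}
```

In practice a deep merge would be needed so that overriding only `limit.context` (as in the example above) still keeps the imported `limit.output`.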
Error Handling
If the specified use_models_dev model ID is not found in models.dev (a sketch of this fallback follows the list):
- Log a warning with the invalid ID
- List available providers to help the user find the correct ID
- Fall back to any manually specified metadata in the config
- Don't crash or fail config loading
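
A rough sketch of that lookup-and-fallback behavior, assuming an in-memory catalog keyed by `provider/model` ID; the function and parameter names are hypothetical:

```ts
// Hypothetical lookup with the fallback behavior described above.
function lookupModelsDev(
  id: string,                                    // e.g. "openrouter/minimax/minimax-m2.1"
  catalog: Map<string, Record<string, unknown>>, // parsed models.dev data
  manualMetadata: Record<string, unknown>        // whatever the user wrote in config
): Record<string, unknown> {
  const imported = catalog.get(id);
  if (!imported) {
    const providers = [...new Set([...catalog.keys()].map((k) => k.split("/")[0]))];
    console.warn(
      `use_models_dev: "${id}" not found in models.dev; ` +
        `known providers: ${providers.join(", ")}. Falling back to manual metadata.`
    );
    return manualMetadata; // don't throw: config loading should still succeed
  }
  return { ...imported, ...manualMetadata };
}
```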
Finding Model IDs
Model IDs follow the format: provider/model-name
Examples:
- openrouter/minimax/minimax-m2.1
- openrouter/anthropic/claude-3.7-sonnet
- anthropic/claude-opus-4-5

Users can find valid IDs at https://models.dev/
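
For reference, a small script like the one below could list candidate IDs locally. It assumes models.dev exposes its catalog as JSON at https://models.dev/api.json, keyed by provider with a `models` map per provider; that endpoint and shape should be verified before relying on them:

```ts
// Assumption: models.dev publishes its catalog at this URL as JSON keyed by provider.
const CATALOG_URL = "https://models.dev/api.json";

async function listModelIds(filter: string): Promise<string[]> {
  const res = await fetch(CATALOG_URL);
  if (!res.ok) throw new Error(`models.dev request failed: ${res.status}`);
  const catalog = (await res.json()) as Record<string, { models?: Record<string, unknown> }>;
  const ids: string[] = [];
  for (const [provider, entry] of Object.entries(catalog)) {
    for (const model of Object.keys(entry.models ?? {})) {
      ids.push(`${provider}/${model}`);
    }
  }
  return ids.filter((id) => id.includes(filter));
}

// Usage: listModelIds("minimax").then((ids) => console.log(ids.join("\n")));
```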
Use Cases
- Proxy providers - Using OpenRouter, AI Gateway, or other proxies that support models already in models.dev
- Self-hosted deployments - Running models locally that match commercial offerings
- Testing - Quickly switching between providers without reconfiguring metadata
- Multi-provider setups - Accessing the same model through different endpoints
Example: Real-world scenario
A user wants to access Minimax M2.1 through their self-hosted AI proxy. Instead of manually looking up and copying all the metadata, they can simply reference the OpenRouter version:
```json
{
  "provider": {
    "my-ai-proxy": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "apiKey": "{env:MY_PROXY_KEY}",
        "baseURL": "https://my-proxy.internal/v1"
      },
      "models": {
        "minimax-m2.1": {
          "use_models_dev": "openrouter/minimax/minimax-m2.1"
        }
      }
    }
  }
}
```
All pricing, limits, and capabilities are automatically imported from models.dev.
Non-Goals
- This feature does NOT add new models to models.dev
- This feature does NOT validate that the provider actually supports the model
- This feature does NOT change how preconfigured providers work