feat: Add support for litellm as a provider
- Allow users to configure a litellm API base and key to use a litellm instance as a model provider.
- Implement model discovery from the /models and /model_group/info endpoints of the litellm instance (see the sketch after this list).
- Use the discovered model information, including pricing, to enable cost calculation for litellm models.
- Improve the cost display to show token usage even when the cost is zero.
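For reference, here is a minimal sketch of what that discovery can look like against a LiteLLM proxy. The response shapes and field names (`model_group`, `input_cost_per_token`, `output_cost_per_token`) are assumptions about the proxy's API, not a copy of the actual aider implementation:

```python
import os

import requests


def discover_litellm_models():
    """Sketch: list models and per-token pricing from a LiteLLM proxy."""
    api_base = os.environ["LITELLM_API_BASE"].rstrip("/")
    headers = {"Authorization": f"Bearer {os.environ['LITELLM_API_KEY']}"}

    # /models is OpenAI-compatible: {"data": [{"id": "<model-name>"}, ...]}
    resp = requests.get(f"{api_base}/models", headers=headers, timeout=10)
    resp.raise_for_status()
    model_ids = [m["id"] for m in resp.json()["data"]]

    # /model_group/info returns richer metadata, assumed here to include
    # per-token prices usable for cost calculation.
    resp = requests.get(f"{api_base}/model_group/info", headers=headers, timeout=10)
    resp.raise_for_status()
    pricing = {
        g["model_group"]: (
            g.get("input_cost_per_token") or 0.0,
            g.get("output_cost_per_token") or 0.0,
        )
        for g in resp.json()["data"]
    }
    return model_ids, pricing
```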
It's already possible to use the litellm models and proxy.
What this change adds is the ability to list the available models, instead of having to specify them up front via program arguments:
```
> /models litellm
Models which match "litellm":
- litellm/bedrock-claude-haiku-3.5
- litellm/azure-claude-haiku-4.5
...
```
It also adds a per-message and per-session cost summary, calculated from the pricing reported by the LiteLLM Proxy:
```
Tokens: 1.7k sent, 71 received. Cost: $0.00029 message, $0.0015 session.
```
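The cost line itself is just per-token arithmetic over the prices discovered above. A sketch, with hypothetical prices (the real numbers come from the proxy):

```python
def message_cost(sent_tokens: int, received_tokens: int,
                 input_cost_per_token: float,
                 output_cost_per_token: float) -> float:
    """Cost of one message; the session cost is the running sum of these."""
    return (sent_tokens * input_cost_per_token
            + received_tokens * output_cost_per_token)


# With hypothetical prices of $0.16 / $0.25 per million input/output tokens:
# message_cost(1700, 71, 0.16e-6, 0.25e-6) -> ~0.00029 dollars
```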
Everything is configured through environment variables, with no additional configuration needed:
```
LITELLM_API_BASE="https://litellm.proxy.url" LITELLM_API_KEY="sk-..." aider
```
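The same variables can also be placed in the `.env` file that aider loads on startup (assuming the new settings are read from the environment like aider's other provider keys):

```
LITELLM_API_BASE=https://litellm.proxy.url
LITELLM_API_KEY=sk-...
```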