
feat: Add support for litellm as a provider

Open drajnic opened this issue 2 months ago • 3 comments

  • Allow users to configure a litellm API base and key to use a litellm instance as a model provider.
  • Implement model discovery from the /models and /model_group/info endpoints of the litellm instance.
  • Use the discovered model information, including pricing, to enable cost calculation for litellm models (see the sketch after this list).
  • Improve the cost display to show token usage even when the cost is zero.
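
Below is a minimal sketch of the discovery flow these bullets describe, assuming a LiteLLM proxy reachable via the LITELLM_API_BASE and LITELLM_API_KEY variables shown later in the thread. The endpoint paths come from the feature description; the response field names are assumptions based on LiteLLM's OpenAI-compatible schema, not the PR's actual code.

```python
# Sketch only: field names are assumed from LiteLLM's OpenAI-compatible API.
import os
import requests

base = os.environ["LITELLM_API_BASE"].rstrip("/")
headers = {"Authorization": f"Bearer {os.environ['LITELLM_API_KEY']}"}

# /models lists the model ids the proxy exposes.
models = requests.get(f"{base}/models", headers=headers).json()
model_ids = [m["id"] for m in models["data"]]

# /model_group/info carries per-model metadata, including pricing,
# which is what enables the cost calculation mentioned above.
info = requests.get(f"{base}/model_group/info", headers=headers).json()
pricing = {
    g["model_group"]: (g.get("input_cost_per_token"), g.get("output_cost_per_token"))
    for g in info.get("data", [])
}
```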

drajnic · Sep 13 '25 05:09

CLA assistant check
All committers have signed the CLA.

CLAassistant · Sep 13 '25 05:09

It's already possible to use the litellm models and proxy

ant31 · Oct 10 '25 17:10

> It's already possible to use the litellm models and proxy

This one also allows listing the available models, as opposed to the existing approach of specifying them via program arguments:

> /models litellm

Models which match "litellm":
- litellm/bedrock-claude-haiku-3.5
- litellm/azure-claude-haiku-4.5
...

It also adds a per-message and per-session cost summary, calculated from the LiteLLM proxy's pricing:

Tokens: 1.7k sent, 71 received. Cost: $0.00029 message, $0.0015 session.
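
For illustration, a per-message figure like the one above follows directly from per-token prices discovered via /model_group/info. The prices below are hypothetical, chosen only to reproduce the displayed $0.00029:

```python
# Hypothetical per-token prices; real values come from /model_group/info.
input_cost_per_token = 0.0000001   # $ per sent token (assumed)
output_cost_per_token = 0.0000017  # $ per received token (assumed)

sent, received = 1700, 71
message_cost = sent * input_cost_per_token + received * output_cost_per_token
print(f"Cost: ${message_cost:.5f} message")  # -> Cost: $0.00029 message
```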

Configuration is done through environment variables, with no additional setup:

LITELLM_API_BASE="https://litellm.proxy.url" LITELLM_API_KEY="sk-..." aider

drajnic · Oct 22 '25 10:10