Anthropic without the OpenRouter middleman
OpenRouter is a nice solution, but since Claude is so popular, it'd be nice to have direct support within Aider so extra costs can be avoided.
I found https://github.com/jtsang4/claude-to-chatgpt, which acts as a client-side adapter that exposes Claude models behind an OpenAI-compatible API. It has no Claude 3 support quite yet, but perhaps Aider could be used to add that, or maybe claude-to-chatgpt could serve as inspiration for an adapter that lives in Aider's repository.
As a workaround, it appears LiteLLM can be set up locally, as this comment describes.
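For anyone landing here, the workaround looks roughly like this: run LiteLLM's OpenAI-compatible proxy locally and point Aider's OpenAI settings at it. This is a sketch, not the exact setup from the linked comment — the model name, port, and flags are assumptions, so check `litellm --help` for your installed version:

```shell
# Install LiteLLM with its proxy extras and your Anthropic key available
pip install 'litellm[proxy]'
export ANTHROPIC_API_KEY=<your-key>

# Start a local OpenAI-compatible server in front of Claude
# (model name is illustrative; the default port may differ by version)
litellm --model claude-3-opus-20240229 --port 4000

# In another shell, point Aider at the local proxy
export OPENAI_API_KEY=dummy   # the proxy forwards your real Anthropic key
aider --openai-api-base http://localhost:4000
```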
I guess I'll close this issue, since #172 covers the concern.
Happy to keep this open, as direct Claude support is high priority for me at the moment.
Aider now supports directly connecting to Anthropic and many other LLM providers.
https://aider.chat/docs/llms.html#anthropic
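Minimal usage per those docs — the model name below is one example; see the linked page for the current list of supported Claude models and shortcuts:

```shell
pip install aider-chat
export ANTHROPIC_API_KEY=<your-key>

# Connect Aider directly to Anthropic, no proxy needed
aider --model claude-3-opus-20240229
```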
I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.