Unexpected Model Warning for `gpt-4o-mini` Despite No Specification: Possible Default Configuration Issue
Aider version: 0.58.1
Python version: 3.12.3
Platform: Windows-10-10.0.19045-SP0
Python implementation: CPython
Virtual environment: No
OS: Windows 10 (64bit)
Git version: git version 2.45.2.windows.1
Aider v0.58.1
Main model: openrouter/openai/gpt-4o with architect edit format
Editor model: openrouter/anthropic/claude-3.5-sonnet with editor-diff edit format
Weak model: openrouter/openai/gpt-4o-mini
Git repo: .git with 7,655 files
Warning: For large repos, consider using --subtree-only and .aiderignore
See: https://aider.chat/docs/faq.html#can-i-use-aider-in-a-large-mono-repo
Repo-map: using 1024 tokens, auto refresh
Command and Output
PS C:\Qt\5.15.2\Src\qtbase\src> aider --model openrouter/openai/gpt-4o --editor-model openrouter/anthropic/claude-3.5-sonnet --map-tokens 1024 --architect
────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
Warning for openrouter/openai/gpt-4o-mini: Unknown context window size and costs, using sane defaults.
Did you mean one of these?
- openrouter/openai/gpt-4
- openrouter/openai/gpt-4o
- openrouter/openai/o1-mini
For more info, see: https://aider.chat/docs/llms/warnings.html
You can skip this check with --no-show-model-warnings
Proceed anyway? (Y)es/(N)o [Yes]: n
What exactly is going on here? I never specified gpt-4o-mini. My guess is that on startup aider sets some defaults, one of which is chosen conditionally when OpenRouter-related models are in use, and that default happens to have a typo?
More importantly, I can't seem to find a command to change the weak model. If I am missing something obvious, I apologize.
Workaround: appending --weak-model openrouter/anthropic/claude-3-haiku to the command seems to work.
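To avoid retyping the flag every session, the same override should also be settable in aider's YAML config file. A minimal sketch, assuming aider reads a `.aider.conf.yml` from the repo root or home directory where keys mirror the CLI flags (please verify the exact key names against the aider configuration docs):

```yaml
# .aider.conf.yml — hypothetical config mirroring the command-line flags used above
model: openrouter/openai/gpt-4o
editor-model: openrouter/anthropic/claude-3.5-sonnet
weak-model: openrouter/anthropic/claude-3-haiku
map-tokens: 1024
architect: true
```

With this in place, running plain `aider` from the repo should pick up the same models without the long command line.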
Thank you for filing this issue.
You can ignore this warning or use a different model.