FR: add argument to set litellm.drop_params=True
Issue
Hi,
litellm makes it easy to switch LLM providers and models, but it isn't perfect: in some situations there's a mismatch between the arguments used when querying the model and what the model actually expects.
For example, if you set a "logit_bias" argument on a non-OpenAI model, litellm will crash. But if you set litellm.drop_params = True, it won't crash and will just silently remove the extraneous parameters. It can also be used at query time.
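For illustration, here is a minimal sketch of both forms as described in the litellm docs; the model name and logit_bias value are just placeholders, and exact behavior may vary by litellm version.

import litellm

# Global: silently drop provider-unsupported params on every call.
litellm.drop_params = True

# Or per call: pass drop_params=True for a single completion request.
response = litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "hello"}],
    logit_bias={123: 5},  # not supported by this provider; dropped instead of raising
    drop_params=True,
)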
In my case, I'm using openrouter quite a lot, including to query Anthropic models. But AFAIK openrouter does not support the streaming argument, so aider will crash.
I manually added drop_params somewhere in the code as a workaround until there's a fix, but I think it would be nice if aider added an argument to trigger drop_params.
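As a rough sketch of what such an option could look like (the flag name and wiring here are hypothetical, not aider's actual code):

import argparse
import litellm

parser = argparse.ArgumentParser()
parser.add_argument(
    "--drop-params",  # hypothetical flag name for illustration only
    action="store_true",
    help="Tell litellm to silently drop params the provider does not support",
)
args = parser.parse_args()

if args.drop_params:
    litellm.drop_params = True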
Version and model info
No response
Thanks for trying aider and filing this issue.
I've added drop_params=True in the main branch. You can get it by installing the latest version from github:
python -m pip install --upgrade git+https://github.com/paul-gauthier/aider.git
If you have a chance to try it, let me know if it works better for you.
I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.