llm
Support default query parameters
Some OpenAI-compatible APIs require specific query parameters on every request. This PR adds support for a `query` configuration in `extra-openai-models.yaml`, allowing users to define and use these custom APIs without changes to the code.
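For illustration, a configured entry might look something like the sketch below; the `model_id`, `model_name`, and `api_base` fields follow the existing `extra-openai-models.yaml` format, and the `example_query: test` value is just a placeholder:

```yaml
- model_id: my-custom-model        # name used with `llm -m my-custom-model`
  model_name: my-custom-model      # model name sent to the API
  api_base: https://example.com/v1 # any OpenAI-compatible endpoint
  query:                           # new: appended to every request URL
    example_query: test            # placeholder key/value
```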
This looks good to me - do you know any examples of providers that need this? I'd rather use a concrete example as opposed to that `example_query: "test"` one you added.
@simonw Thanks for reviewing this. The need for this comes from an internal private LLM API proxy we use, which relies on `module` and `scene` query parameters for QoS control and usage analytics.
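With the proposed `query` option, a concrete entry for such a proxy could look roughly like this; the URL and the `module`/`scene` values are hypothetical and would be whatever the proxy expects:

```yaml
- model_id: proxy-gpt
  model_name: gpt-4o               # model name the proxy forwards upstream
  api_base: https://llm-proxy.internal.example.com/v1
  query:
    module: chat                   # hypothetical QoS routing value
    scene: analytics               # hypothetical usage-analytics tag
```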
Hi @simonw, and thanks for a great tool.
I'm helping with some house cleaning and the coding agent sent me here. I suggest we close this, as the use case is handled by adding a model plugin.
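For anyone landing here later: the plugin route means registering a custom model through llm's `register_models` plugin hook. A minimal sketch, assuming that documented hook (the class name, model ID, and parameter handling are illustrative, and the actual proxy call is elided):

```python
import llm


class ProxyModel(llm.Model):
    # ID users would pass as `llm -m internal-proxy`
    model_id = "internal-proxy"

    def execute(self, prompt, stream, response, conversation):
        # A real implementation would call the proxy's OpenAI-compatible
        # endpoint here, adding the required query parameters
        # (e.g. ?module=...&scene=...) to the request URL.
        yield f"(proxy response for: {prompt.prompt})"


@llm.hookimpl
def register_models(register):
    # Expose the custom model to llm alongside the built-in ones
    register(ProxyModel())
```

A plugin like this keeps provider-specific behavior out of the core config format, which is presumably why it's the preferred resolution here.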