Feature Request: Support OpenRouter LLMs with Dynamic Runtime Model Configuration
Please add support for using OpenRouter as an LLM provider. Ideally, the integration should allow dynamic registration and switching of any OpenRouter-hosted model at runtime, without requiring code changes or redeployment.
This would enable greater flexibility and extensibility, allowing developers to use a wide range of OpenRouter-compatible models depending on runtime needs.
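As a rough sketch of what this could look like: OpenRouter exposes an OpenAI-compatible endpoint at `https://openrouter.ai/api/v1`, so the official `openai` Python client can already talk to it. The `ModelRegistry` class below, its method names, the `OPENROUTER_API_KEY` environment variable, and the specific model IDs are all illustrative assumptions, not an existing API in this project:

```python
# Minimal sketch, assuming the provider layer wraps OpenRouter's
# OpenAI-compatible endpoint via the official `openai` client.
# ModelRegistry and the model IDs below are hypothetical examples.
import os

from openai import OpenAI


class ModelRegistry:
    """Registers OpenRouter model IDs and switches between them at runtime."""

    def __init__(self, api_key: str):
        self._client = OpenAI(
            base_url="https://openrouter.ai/api/v1",
            api_key=api_key,
        )
        self._models: set[str] = set()
        self._active: str | None = None

    def register(self, model_id: str) -> None:
        # Any OpenRouter model ID can be added on the fly: no redeploy needed.
        self._models.add(model_id)

    def switch(self, model_id: str) -> None:
        if model_id not in self._models:
            raise KeyError(f"model {model_id!r} is not registered")
        self._active = model_id

    def complete(self, prompt: str) -> str:
        if self._active is None:
            raise RuntimeError("no active model; call switch() first")
        response = self._client.chat.completions.create(
            model=self._active,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content


# Example usage: register two models, then switch between them at runtime.
registry = ModelRegistry(api_key=os.environ["OPENROUTER_API_KEY"])
registry.register("anthropic/claude-3.5-sonnet")
registry.register("meta-llama/llama-3.1-70b-instruct")
registry.switch("meta-llama/llama-3.1-70b-instruct")
print(registry.complete("Say hello in one word."))
```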
+1
+1 Any update? Should someone write an extensive prompt to pass to the open see bot so it can work on this? The idea of supporting an OpenAI-compatible provider is exciting.