
OpenRouter?

FelipeVeiga opened this issue 5 months ago • 1 comment

I'm trying to register a new LLM in the system by selecting "OpenAI" as the provider and changing the endpoint to point to OpenRouter.

However, I'm receiving an error stating that only OpenAI's standard models are allowed when using the "OpenAI" provider. This seems to be a limitation enforced by the platform, but in fact, OpenRouter supports a wide range of models via the OpenAI-compatible API.

How can I proceed to register a model like deepseek/deepseek-r1-distill-llama-70b using OpenRouter while still using the OpenAI protocol? I want to leverage OpenRouter for broader model support.
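For context, OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so a client only needs a different base URL and model name; the request body itself follows the OpenAI schema. A minimal sketch of such a request (the helper name and the placeholder API key are illustrative, not part of any Unstract API):

```python
import json
import urllib.request


def build_chat_request(api_base: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-schema chat-completions request for any compatible API."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{api_base}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


# Same schema as OpenAI; only the base URL and model name differ.
req = build_chat_request(
    "https://openrouter.ai/api/v1",
    "sk-or-PLACEHOLDER",  # illustrative key, not a real credential
    "deepseek/deepseek-r1-distill-llama-70b",
    "Hello",
)
```

This is exactly why routing "OpenAI" adapters to OpenRouter works at the protocol level; the blocker described above is a client-side model-name check, not an API incompatibility.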

adapter_id: "openai|..."
adapter_metadata: {model: "deepseek/deepseek-r1-distill-llama-70b", api_base: "https://openrouter.ai/api/v1",…}
adapter_name: "11"
api_base: "https://openrouter.ai/api/v1"

Failed: Error testing '11'. Error from LLM provider 'OpenAI'. Unknown model 'deepseek/deepseek-r1-distill-llama-70b'. Please provide a valid OpenAI model name in: o1, o1-2024-12-17, o1-pro, o1-pro-2025-03-19, o1-preview

Thanks in advance
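The error above suggests the adapter validates the model name against a fixed whitelist before ever contacting the configured `api_base`. A hypothetical sketch of that kind of check, to show why any non-OpenAI model name is rejected regardless of endpoint (this is illustrative, not Unstract's actual code):

```python
# Illustrative whitelist, taken from the model names listed in the error message.
ALLOWED_OPENAI_MODELS = {
    "o1",
    "o1-2024-12-17",
    "o1-pro",
    "o1-pro-2025-03-19",
    "o1-preview",
}


def validate_model(provider: str, model: str) -> bool:
    """Reject any model name not in the provider's whitelist.

    Note the check never looks at api_base, so pointing the endpoint
    at OpenRouter does not help: the name fails validation first.
    """
    if provider == "openai":
        return model in ALLOWED_OPENAI_MODELS
    return True


validate_model("openai", "deepseek/deepseek-r1-distill-llama-70b")  # rejected
validate_model("openai", "o1")  # accepted
```

Relaxing or bypassing this check for custom `api_base` values would be one way to support OpenAI-compatible gateways like OpenRouter.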

FelipeVeiga avatar Jul 16 '25 22:07 FelipeVeiga

I have the same kind of problem with a different setup: I serve LLM models and embedding models through OpenWebUI and created an OpenAI-compatible API key there. Usually clients can connect to it, but not in this case.

Even though the setup is different, the problem is the same as described in this ticket: the restricted list of models blocks us from connecting. I have tried the OpenAI and Anyscale providers with my local URL, without success.

stephanepoinsart avatar Sep 01 '25 12:09 stephanepoinsart