Can you please add an OpenRouter LLM option? For example, OpenRouter with DeepSeek-V3.
I managed to fix it myself, but maybe you should consider letting users set a custom endpoint in the .env
and get the results!
You can set OPENAI_ENDPOINT or DEEPSEEK_ENDPOINT in the .env file to https://openrouter.ai/api/v1. Then, in the UI, select the provider as openai (default) or deepseek, respectively.
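For reference, a minimal .env sketch of that setup (the API-key variable names here are assumptions based on common conventions, not confirmed from this project):

```
# Point the OpenAI-compatible provider at OpenRouter instead of api.openai.com
OPENAI_ENDPOINT=https://openrouter.ai/api/v1
# Or route the "deepseek" provider through OpenRouter instead
DEEPSEEK_ENDPOINT=https://openrouter.ai/api/v1

# Use your OpenRouter key in place of the provider's own key
# (variable names assumed; check the project's .env.example)
OPENAI_API_KEY=sk-or-...
DEEPSEEK_API_KEY=sk-or-...
```

Then pick the matching provider (openai or deepseek) in the UI.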
It works!