Add OpenRouter hosting option
Allow users to host a model using OpenRouter.
Supporting llama.cpp directly or via Local.Ai would be nice too.
Happy to help with this! OpenRouter is OpenAI-compatible: https://openrouter.ai/docs
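To illustrate what "OpenAI-compatible" means here: an existing OpenAI integration can target OpenRouter by changing only the base URL and the API key. A minimal sketch using the standard chat-completions shape — the model name, env var, and helper function are illustrative, not part of any existing integration:

```python
import json
import os
import urllib.request

# OpenRouter serves the same chat-completions request shape as the
# OpenAI API, so only the base URL and credentials differ.
BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(model, messages, api_key):
    """Build a chat-completions request for OpenRouter's OpenAI-compatible endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Model name below is illustrative; a real call needs a valid key.
req = build_chat_request(
    "openai/gpt-3.5-turbo",
    [{"role": "user", "content": "Hello"}],
    os.environ.get("OPENROUTER_API_KEY", "sk-placeholder"),
)
# resp = urllib.request.urlopen(req)  # left commented: needs network + key
```

The same swap works with official OpenAI client libraries that accept a configurable base URL.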
Added an integration using the OpenAI API!