Zero
[Question] Selfhosted AI
Hey there, just found this project, and it looks interesting, but having my email content sent to some 3rd party like OpenAi or Groq is something that won't happen. Is there any chance to get support for self-hosted solutions like Ollama with an appropriate model? A real docker(-compose) setup for testing/deploying also would be nice (not needing bun/Node.js on the host, building/running the app itself in a container too).
Somewhat related: #853 #580
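For the "real docker(-compose)" part of the ask, a minimal sketch of what that could look like, bundling an Ollama backend alongside the app. Everything here is an assumption: the service names, ports, env var names, and the existence of a Dockerfile that builds and runs the app (Bun and all) inside the image rather than on the host.

```yaml
# Hypothetical sketch only -- not the project's actual compose file.
services:
  app:
    build: .                  # assumed Dockerfile that installs Bun and builds the app in-image
    ports:
      - "3000:3000"
    environment:
      # assumed env var: point the app at Ollama's OpenAI-compatible endpoint
      - OPENAI_BASE_URL=http://ollama:11434/v1
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # persist pulled models across restarts

volumes:
  ollama-data:
```

With something like this, `docker compose up` would bring up both containers without needing Bun or Node.js installed on the host.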
This is a very valid and important point. Supporting self-hosted LLMs like Ollama would be a great step forward for users who prioritize data privacy. A modular setup that lets users configure their own AI backend (e.g., via ENV variables) offers flexibility, and a full Docker-Compose setup that encapsulates the entire app (including build and runtime) without requiring Bun or Node.js on the host would make deployment much smoother, especially for testing in isolated environments. For now, though, the team is focused on resolving bugs and introducing many other needed features, so maybe you can fork the repo and make the changes yourself in the meantime.
Appreciate your concern. We are already working on supporting self-hosted AI models for users who prefer to self-host the app themselves.
That is awesome!
hey, I'm just curious how this is going to work. Can we run Ollama? You can self-host Ollama and choose an open-source AI model, and we could use the Vercel AI SDK. I'm just excited to learn more about this.
Hello, just to chime in. Please don't waste time on Ollama-specific API support; just allow setting a custom OpenAI API endpoint. Every relevant inference server supports that API, including Ollama.
Maybe this already works via an environment variable, depending on the OpenAI API library you use. In that case a little documentation would be all that's needed.
@Papierkorb thank you! we'll get that merged in
Anxiously waiting... 👀
is there a draft? I can take a stab at this
no one is working on this at this moment, up for grabs
@MrgSub still up for grabs? If so, I'm going to start on this. It should be pretty easy to add custom-endpoint support with @ai-sdk/openai.
This issue is stale (3+ days) and will be closed.
Closing stale issue.