[feature]: Local-AI support
Is there an existing issue for this?
- [X] I have searched the existing issues
Summary
Support self-hosted LLM endpoints, such as Ollama, alongside OpenAI's ChatGPT, to allow a fully self-hosted experience.
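A minimal sketch of how this could work: Ollama exposes an OpenAI-compatible API, so the existing OpenAI client could simply be pointed at a configurable base URL. The environment variable names below (`LLM_API_BASE`, `LLM_API_KEY`, `LLM_MODEL`) are illustrative, not part of Plane's current configuration.

```python
# Illustrative sketch: reuse the OpenAI client against a self-hosted,
# OpenAI-compatible endpoint such as Ollama. Env var names are hypothetical.
import os

from openai import OpenAI

client = OpenAI(
    # Defaults to the hosted OpenAI API; a local Ollama instance exposes
    # a compatible API at http://localhost:11434/v1.
    base_url=os.getenv("LLM_API_BASE", "https://api.openai.com/v1"),
    # Ollama accepts any key and ignores its value.
    api_key=os.getenv("LLM_API_KEY", "ollama"),
)

response = client.chat.completions.create(
    model=os.getenv("LLM_MODEL", "gpt-4o-mini"),
    messages=[{"role": "user", "content": "Summarize this issue."}],
)
print(response.choices[0].message.content)
```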
Why should this be worked on?
Prompt privacy, avoiding OpenAI fees, and reducing dependence on cloud services.
Thanks for talking with me and opening this issue!