[FR] Support for using Ollama for AI
Description
Add the ability to use Ollama for AI features instead of only OpenAI.
Impact
People who want to use the AI features but prefer a self-hosted AI.
Additional Context
No response
I agree with you; it would be very convenient to be able to use our self-hosted Llama AI instead of OpenAI.
Yes, that would fit the AppFlowy vision of owning your data. They advertise data ownership and stand for privacy, yet they outsource all your data to OpenAI.
Would like to see this.
https://forum.appflowy.io/t/ollama-integration/997
https://appflowy.io/privacy
See the "Interactive Features" section on this page:
We and other users of our Website or Services collect the information you submit or make available through these interactive features, including your name, email address, username and any personal information included in the contents, inputs, file uploads, or feedback you provide to these interactive features, as well as in the results or outputs generated from these interactive features (“User Content”).
They are already using the data you write in AppFlowy to their advantage, and they are potentially training their models on it.
https://forum.appflowy.io/t/ollama-integration/997
#4021
Supported in the upcoming release (v0.8.7): https://appflowy.com/guide/appflowy-local-ai-ollama
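For anyone who wants to confirm that their local Ollama server is reachable before pointing AppFlowy at it, here is a minimal sketch. This is not the AppFlowy integration itself; it assumes Ollama's default local endpoint (http://localhost:11434) and that a model such as `llama3` has already been pulled with `ollama pull`.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: no custom host/port configured)
OLLAMA_URL = "http://localhost:11434"

def list_models():
    """Return the names of models already pulled into the local Ollama instance."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def generate(prompt, model="llama3"):
    """Send a single non-streaming prompt to the local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print("Local models:", list_models())
    print(generate("Say hello in one short sentence."))
```

If this prints your pulled models and a response, the Ollama server is up and the data never leaves your machine; follow the guide linked above to point AppFlowy's local AI settings at the same endpoint.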