Self-deploy with a non-Docker option
I'm not really used to using Docker.
Thank you very much for developing such a useful tool. For privacy reasons, I prefer using local, standalone apps. Additionally, would it be possible for this tool to support inference with a local Ollama model in the future?
Hello @vanch007 @chllei,
Thank you very much for sharing your feedback. We truly appreciate it! Here's an overview of our preliminary plan for incrementally enhancing self-hosting support:
- Local Machine Deployment via Docker Containers: This is already available. You can now deploy Refly on your local machine using Docker containers, providing you with a straightforward way to get started with self-hosting.
- Fully Customizable Model Providers (Utilizing Containers): We're working on giving you complete control over your inference model providers. This will still be container-based. Our goal is to include support for Ollama, and we anticipate this feature will be ready by the middle of February 2025.
- Fully-functional Native Application with One-click Install: We're also developing a local-first native application that will offer a one-click installation experience. This application will be fully functional without the need for a remote server, providing all the features of Refly in a more integrated and privacy-focused way. We expect to release this by the end of March 2025.
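To illustrate the container-based direction for custom model providers, a local Ollama server could be run alongside the existing stack with a Compose snippet like the one below. This is a hypothetical sketch, not Refly's actual docker-compose.yml; only the `ollama/ollama` image and its default port 11434 are documented facts, while the service and volume names are illustrative:

```yaml
# Hypothetical snippet: service and volume names are illustrative,
# not taken from Refly's actual docker-compose.yml.
services:
  ollama:
    image: ollama/ollama:latest   # official Ollama image
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama # persist downloaded models
volumes:
  ollama_data:
```

After `docker compose up -d`, models can be pulled inside the container with `docker compose exec ollama ollama pull <model>`.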
We'll keep you updated as we progress towards these milestones. If you have any further questions or suggestions in the meantime, please don't hesitate to let us know.
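Once custom providers land, wiring a client to a local Ollama instance might look like the sketch below. Ollama's default port 11434 and its OpenAI-compatible `/v1/chat/completions` route are documented behavior, but the model name and helper function here are assumptions for illustration; the request is built but not sent, since it needs a running server:

```python
import json
import urllib.request

# Ollama exposes an OpenAI-compatible API on localhost:11434 by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but don't send) an OpenAI-style chat request for a local Ollama server.

    Hypothetical helper for illustration; the model must already be
    pulled locally (e.g. via `ollama pull`).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", "Hello from a self-hosted setup!")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` would return the model's reply once an Ollama server is running locally.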
Which git branch is the Fully-functional Native Application with One-click Install being developed on? We'd love to contribute to it if possible.
Hello, I am new to open source and interested in working on this issue. I could start by improving the documentation for deployment without Docker or by exploring the UI settings. Any recommendations on where to start?
Thanks for your enthusiasm! You can contribute to our documentation at https://github.com/refly-ai/refly-docs.