[FR] Am I missing something with local computation subscription unlocks?
Description
Hey, I've been looking around for a little bit; a YouTuber recently suggested your project as an open-source alternative to Notion.
However, while it makes TONS of sense not to give users free bandwidth in the form of unlimited cloud AI inference, I'm wondering if I'm misunderstanding something.
If I understand correctly, this is an open source program that, before modification, requires you to pay a SUBSCRIPTION for the ability to run /LOCAL/ LLM inference? I'm really hoping that's incorrect. I haven't seen such skeevy anti-consumer feature withholding and subscription-model abuse even from large for-profit corpos, let alone from a project claiming to work for the sake of open knowledge and to build open platforms that make working and creating more accessible.
Honestly, it seems more likely that I'm either naive or completely wrong about this, because of its absurdity. If it actually is the case, how could this have even happened? Did NOBODY on the team say anything while this was being developed and implemented? I could be way off base here, but it feels obscene.
If it's actually locked down, for the love of all things good, allow people to connect a locally hosted API endpoint from Ollama or kobold.cpp or something.
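For what it's worth, wiring up a locally hosted endpoint is pretty simple on the client side. Here's a minimal sketch of what a request to Ollama's local HTTP API could look like (the default port 11434 and the `/api/generate` route are Ollama's documented defaults; the model name and prompt are placeholders, and `build_request` is just an illustrative helper, not anything from this codebase):

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,    # any locally pulled model, finetune, or quant
        "prompt": prompt,
        "stream": False,   # ask for one JSON object instead of a token stream
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending it requires a running Ollama server, e.g.:
# with urllib.request.urlopen(build_request("llama3", "Summarize this note.")) as resp:
#     print(json.loads(resp.read())["response"])
```

Because the model name is just a string in the payload, the app wouldn't need per-model updates; users could point it at whatever they have pulled locally.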
Impact
While this project is "open", it seems to lack the fundamental underlying ethos, unless, again, I'm mistaken about something. If this is really how the program works, it's the biggest example I've personally seen of why Libre, not just Gratis, matters, and I'd argue you're doing more harm to the open community by engaging in such actively hostile practices for profit while touting "we're open source!". Honestly, it makes my skin crawl a LOT worse than a corpo just being greedy and selling data.
Technically, I'm not even sure what kind of backend y'all are using because it's so locked down. Opening your pricing page when the user presses the toggle for local AI, without any warning that that's about to happen, is BONKERS. That said, with a local-endpoint option you wouldn't be stuck updating for every individual model, and users could use whatever finetunes, models, and quants they like. It would probably be technically superior to the local results you're getting now, and on top of that, it would lessen the developer load and not be so... evil.
I really, really hope this is actually considered. It's a cool project, and it seems like a lot of people have put in really good work. People deserve compensation, but as an educator who is poor and trying to teach free software, this is a baaaaaad look.
Additional Context
No response
@MableMunroe , you're free to modify our source code to connect AppFlowy data with other services. We don't require a subscription for you to do so.
We've made local AI free for everyone. It's supported in the upcoming release (v0.8.7): https://appflowy.com/guide/appflowy-local-ai-ollama
Just wanted to say that this was a great response: pivoting and quietly spending the resources to undo/change the system in response to a flag of disapproval from your community.
It's one thing to never do anything wrong in the first place, but it's MUCH more impressive to listen to an opposing viewpoint from your users and implement their feedback, especially without stipulation.
I'll definitely continue supporting and orbiting around this now at the very least, but I sincerely hope the effort proves worthwhile for you!