Matt
Excellent - ideally there would be a robust follow-up/critique step that could lead to a re-generation of the output... and ideally the output would land in a vector store.
I'm running Ollama locally and would really like to be able to use it as a backend. I can put the URL into the custom base URL field, it...
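For anyone trying the same setup, here's a minimal sketch of pointing an OpenAI-compatible client at a local Ollama instance - the port is Ollama's default, and the model name is just an example:

```python
# Minimal sketch: use Ollama's OpenAI-compatible endpoint as a backend.
# Assumes Ollama is running locally on its default port (11434) and that
# the model below has already been pulled (`ollama pull llama3`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # this is what the custom base URL field would take
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3",  # example model name; substitute whatever is installed
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```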
What about SearXNG for local installs: https://github.com/searxng/searxng
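As a quick sanity check, a self-hosted SearXNG instance can be queried programmatically. A rough sketch - note the JSON output format has to be enabled in the instance's settings.yml, and the host/port here are assumptions:

```python
# Minimal sketch: query a local SearXNG instance.
# Assumes SearXNG is running on localhost:8080 and that the JSON output
# format is enabled in settings.yml (search -> formats -> json).
import requests

resp = requests.get(
    "http://localhost:8080/search",
    params={"q": "local LLM vector store", "format": "json"},
    timeout=10,
)
resp.raise_for_status()

for result in resp.json().get("results", [])[:5]:
    print(result["title"], "-", result["url"])
```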
+1 This is a great idea... I'll take it one step further. I would like to be able to share the vector store between this app and something more universal...
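One way to get that kind of sharing today would be a standalone, persistent vector store that both applications point at. A rough sketch with Chroma - the path and collection name are arbitrary:

```python
# Minimal sketch: a persistent vector store that multiple apps can share
# by pointing at the same on-disk location (path and names are arbitrary).
import chromadb

client = chromadb.PersistentClient(path="/srv/shared-vectors")
collection = client.get_or_create_collection("shared_docs")

# App A writes generated output into the store...
collection.add(
    ids=["doc-1"],
    documents=["Generated summary text to be shared."],
    metadatas=[{"source": "app-a"}],
)

# ...and App B (or anything else) can query the same collection.
hits = collection.query(query_texts=["summary"], n_results=3)
print(hits["documents"])
```

For two separate processes hitting the store concurrently, a client/server deployment (e.g. `chromadb.HttpClient`) would be the safer choice than a shared on-disk path.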
I am using [Open WebUI](https://github.com/open-webui/open-webui) to connect to multiple instances of Ollama installed on a variety of local PCs (none of which are all that powerful). I'd...
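For what it's worth, Open WebUI fans requests out across a list of Ollama base URLs; the same idea can be sketched in a few lines of Python (the host names below are made up):

```python
# Minimal sketch: round-robin requests across several Ollama instances.
# Host names are hypothetical; each box just needs `ollama serve` running.
import itertools
import requests

HOSTS = itertools.cycle([
    "http://desktop-1:11434",
    "http://desktop-2:11434",
    "http://old-gaming-rig:11434",
])

def generate(prompt: str, model: str = "llama3") -> str:
    host = next(HOSTS)
    resp = requests.post(
        f"{host}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

print(generate("Why is the sky blue?"))
```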
I'm testing solutions now - I don't need anything in production in the short term, but it would be great to know if this is something that you'd consider and if...
Looking at [HiveMQ](https://www.hivemq.com/), [EMQX](https://www.emqx.io/), and [AWS IoT Core](https://aws.amazon.com/iot-core/) - all three support MQTT v5.
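Since all three speak the same protocol, client code stays portable between them. A minimal MQTT v5 sketch with paho-mqtt - this assumes paho-mqtt 2.x and a broker on localhost:

```python
# Minimal sketch: publish with MQTT v5 properties using paho-mqtt 2.x.
# Broker address is an assumption; any of the brokers above would work.
import paho.mqtt.client as mqtt
from paho.mqtt.packettypes import PacketTypes
from paho.mqtt.properties import Properties

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, protocol=mqtt.MQTTv5)
client.connect("localhost", 1883)

# MQTT v5 adds per-message properties such as expiry and user properties.
props = Properties(PacketTypes.PUBLISH)
props.MessageExpiryInterval = 60  # seconds
props.UserProperty = ("origin", "home-lab")

client.publish("sensors/temp", payload="21.5", qos=1, properties=props)
client.disconnect()
```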
+1 My home-lab has grown more or less organically over the last 10 years... and includes a lot of cast-off gaming hardware. It would be great if Ollama could incorporate...
https://www.quivr.com/ perhaps?
Same here - "132" using Docker on a Xeon.