Damien
Having this app on macOS via `brew` would be great. It would make life much easier.
I would expand this to support having different versions of the DB and then switching from the "old" version to the newest one while preserving old connections...
You might also be interested in these models, which you can run locally to generate SQL: https://ollama.com/library/sqlcoder https://ollama.com/library/codeqwen https://ollama.com/library/starcoder2
The [duckdb-nsql model](https://ollama.com/library/duckdb-nsql) might also be useful for this project, since WrenAI already supports DuckDB.
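To sketch how one of these Ollama-hosted models could be queried, here is a minimal example against Ollama's `/api/generate` endpoint. The model name `duckdb-nsql` matches the link above, but the prompt format (schema followed by a comment with the question) is an assumption, and the default port `11434` assumes a stock local Ollama install.

```python
import json
from urllib import request

# Default local Ollama endpoint (assumes `ollama pull duckdb-nsql` was run).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(schema: str, question: str) -> dict:
    # NL-to-SQL models are typically prompted with the schema plus the
    # question; this exact prompt shape is an assumption, not documented API.
    prompt = f"{schema}\n\n-- {question}\nSELECT"
    return {"model": "duckdb-nsql", "prompt": prompt, "stream": False}

payload = build_payload(
    "CREATE TABLE taxi (fare DOUBLE, tip DOUBLE);",
    "average tip as a fraction of fare",
)
print(json.dumps(payload))

# To actually send the request to a running Ollama server, uncomment:
# req = request.Request(OLLAMA_URL, data=json.dumps(payload).encode(),
#                       headers={"Content-Type": "application/json"})
# print(json.loads(request.urlopen(req).read())["response"])
```

The HTTP call is left commented out so the snippet runs without a server; with one running, the `response` field of the returned JSON holds the generated SQL.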
> We'll merge the `add-ollama` branch to the main branch after we make sure it won't break our ai pipelines currently. We will investigate some ways to solve the issue....
FYI, the two most popular local inference engines are [Ollama](https://ollama.com/) (partly compatible with the OpenAI API, but mostly uses its own API) and [LocalAI](https://localai.io/) (tends to be almost fully compatible with the OpenAI API)...
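The practical upshot of that compatibility is that one OpenAI-style request body can target either engine just by swapping the base URL. Below is a small sketch: the ports are the defaults for each server (`11434` for Ollama, `8080` for LocalAI), and the model name `llama3` is a placeholder for whatever model is actually loaded.

```python
import json

# OpenAI-compatible chat-completions routes exposed by each engine
# (default ports; adjust to your deployment).
ENDPOINTS = {
    "ollama": "http://localhost:11434/v1/chat/completions",
    "localai": "http://localhost:8080/v1/chat/completions",
}

def chat_request(model: str, user_msg: str) -> dict:
    # Standard OpenAI chat-completions payload shape, accepted by both.
    return {"model": model, "messages": [{"role": "user", "content": user_msg}]}

body = chat_request("llama3", "Write a SQL query counting rows in orders.")
for name, url in ENDPOINTS.items():
    print(name, url)
print(json.dumps(body))
```

Since Ollama is only partly compatible, engine-specific features (model pulling, keep-alive options, etc.) still require its native API.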
Also, I want to know whether the bootstrap init script is intended to be used with the UI only, the engine only, or both. I'm considering deploying this on my k8s cluster, and...
Figured this out: it's `PG_URL` in https://github.com/Canner/WrenAI/blob/main/deployment/kustomizations/examples/secret-wren_example.yaml#L22
I have the same issue. It seems like LocalAI requires a specific version of the protobuf library that is not installed by Brew. The problem could be with brew not...
Would it be of any help that [LM Studio has implemented MLX](https://github.com/lmstudio-ai/mlx-engine)?