Chris Van Pelt (CVP)
Try replacing the git weave URL in pyproject.toml with "weave" and it should work.
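For example, the dependency entry in `pyproject.toml` would change roughly like this (the git URL shown is illustrative, not necessarily the exact one in your file):

```toml
[project]
dependencies = [
    # before (illustrative): "weave @ git+https://github.com/wandb/weave.git",
    "weave",  # after: pull the published package from PyPI instead
]
```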
That could be possible in the future. Today it has to be a framework the LLM saw in its training data, like React or Svelte.
Sorry @rossman22590, I've been swamped with various things. There's a `fly.toml` file in the backend directory. You'll need to modify it to match the domain you end up using on Fly...
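A minimal sketch of the relevant bits, assuming Fly's v2 config format; `openui-backend` and the port are placeholders, use your own app name and whatever port the backend listens on:

```toml
# fly.toml in the backend directory
app = "openui-backend"  # served at openui-backend.fly.dev unless you attach a custom domain

[http_service]
  internal_port = 7878  # placeholder: the port the backend listens on
  force_https = true
```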
It's fixed in master. I'll cut a new release. Originally I called the project "Elemint", like a thing that "mints" elements. Alas, that name was lame 😝
Love this! Thanks @mmuyakwa. I'll look into making this configurable / extensible.
No need for an API key. Just set `OLLAMA_HOST` and choose a model from the settings pane.
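For example, assuming Ollama is on its default port and you start the backend with `python -m openui`:

```sh
# point OpenUI at a locally running Ollama server (default port 11434)
export OLLAMA_HOST=http://127.0.0.1:11434
python -m openui  # then pick an Ollama model in the settings pane
```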
@sokoow you can just set `OPENAI_API_KEY` to something like `xxx` if you don't want to use that API. If you're seeing Ollama models in the list, the application is...
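In other words, a dummy value is enough to get past the key check without ever calling OpenAI:

```sh
export OPENAI_API_KEY=xxx                    # dummy value, the OpenAI API is never called
export OLLAMA_HOST=http://127.0.0.1:11434    # serve models from Ollama instead
```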
Hey Paul, if you're running Ollama on localhost you'll likely need to set `OLLAMA_HOST=http://host.docker.internal:11434` because Docker is running from within a VM that has a different localhost (unless you're running...
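A sketch of the docker invocation; the image name and port are assumptions, substitute whatever you're actually running:

```sh
# host.docker.internal resolves to the Docker host, where Ollama is listening
docker run --rm -p 7878:7878 \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/wandb/openui  # image name is an assumption, use your own
```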
Nice! Glad that worked for you.
Gotcha, I'm not using Poetry currently (but would welcome a contribution). I created a [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv) named OpenUI that points to Python 3.11, I believe. I'll look into Poetry, or I've heard...
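If you want to recreate a similar environment, it's roughly this (the exact 3.11 patch version and the install path are assumptions, adjust to your checkout):

```sh
pyenv install 3.11.9            # any 3.11.x should do
pyenv virtualenv 3.11.9 openui  # create the virtualenv against that interpreter
pyenv activate openui
pip install -e ./backend        # assumption: install the backend from the repo root
```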