Angel Santiago
> Our installation issues have dropped massively since switching to pipx, and most people are installing cleanly now. You probably want to look at cleaning your overall environment to a...
I am running into this issue as well; however, I have an exposed API endpoint for my loaded models using a combination of LM Studio (:1234) and Ollama (:11434). I...
I was able to fix the issue. The format used to add Ollama support is as follows:

```json
"[email protected]": {
  "display:name": "ollama - llama3.2",
  "display:order": 1,
  "ai:*": true,
  "ai:baseurl": "http://localhost:11434/v1",
  ...
```
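Since the thread also mentions an LM Studio endpoint on :1234, an analogous entry might look like the following. This is only a sketch by analogy with the Ollama entry in the post above; the key, display name, and model identifier are assumptions, not something confirmed by the tool's documentation:

```json
"[email protected]": {
  "display:name": "lm studio - local model",
  "display:order": 2,
  "ai:*": true,
  "ai:baseurl": "http://localhost:1234/v1",
  ...
```

Both LM Studio and Ollama expose OpenAI-compatible APIs under a `/v1` path, which is presumably why the `ai:baseurl` values point there.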