subvert
Request to add support for locally hosted Ollama
It would be very helpful if we could use locally hosted models with Ollama.
Same here. Configuring local LLMs would unlock a lot more users, I think. It would be awesome if this could be configured via an environment variable.
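To illustrate the request, here is a minimal sketch of how such an env-variable config might look. The variable names (`OLLAMA_BASE_URL`, `OLLAMA_MODEL`) and the default model are assumptions, not part of subvert; the default URL is Ollama's standard OpenAI-compatible endpoint (`http://localhost:11434/v1`).

```python
import os

def ollama_config(env=os.environ):
    """Resolve Ollama settings from environment variables.

    OLLAMA_BASE_URL / OLLAMA_MODEL are hypothetical names for this
    sketch; the fallback URL is Ollama's OpenAI-compatible endpoint.
    """
    return {
        "base_url": env.get("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
        "model": env.get("OLLAMA_MODEL", "llama3"),
    }
```

Because Ollama exposes an OpenAI-compatible API, a base-URL override like this would let existing OpenAI-client code talk to a local model with no other changes.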