Diego Calderon
> in your cmd, try going to this folder `D:\AIhuihua\buquan\stable-diffusion-webui\repositories\stable-diffusion-stability-ai`
>
> Next, checkout the `main` branch: `git checkout main`
>
> Update the repo: `git pull`
>
> Check the log...
Would it make sense to add Ollama model configuration within the Build -> Models tab? The only method I know of is the one supported by LiteLLM, e.g. [https://gist.github.com/shimomurakei/5692335e4fdfb12450afa2580002db77](https://gist.github.com/shimomurakei/5692335e4fdfb12450afa2580002db77), but UI...
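For context, the LiteLLM route might look roughly like this (a minimal sketch based on LiteLLM's documented Ollama usage rather than the gist itself; it assumes `litellm` is installed and an Ollama server is running locally with `llama3` pulled):

```python
from litellm import completion

# The "ollama/" model prefix tells LiteLLM to route the request
# to a local Ollama server instead of a hosted provider.
response = completion(
    model="ollama/llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    api_base="http://localhost:11434",  # Ollama's native endpoint
)
print(response.choices[0].message.content)
```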
> Hi @romilan24,
>
> Thanks for the note. Ollama does provide an OpenAI-compatible API: https://ollama.com/blog/openai-compatibility. This way, if you put in your model name, base url and...
For Ollama, use `llm = ChatOpenAI(api_key="ollama", model="llama3", base_url="http://localhost:11434/v1", temperature=0.4)`; the `api_key` value is ignored by Ollama but must be non-empty, and `temperature` must fall within the 0 to 2 range the OpenAI API accepts.
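Expanded into a self-contained snippet (a minimal sketch, assuming `langchain-openai` is installed, Ollama is running on its default port, and the model has been fetched with `ollama pull llama3`):

```python
from langchain_openai import ChatOpenAI

# Point LangChain's OpenAI chat client at the local Ollama server.
llm = ChatOpenAI(
    api_key="ollama",  # ignored by Ollama, but the client requires a value
    model="llama3",
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    temperature=0.4,
)

response = llm.invoke("Say hello in one sentence.")
print(response.content)
```

The `/v1` suffix on `base_url` matters: that is where Ollama exposes its OpenAI-compatible endpoint, while its native API lives at the bare port.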