Abdelali Hassouna


Doesn't work! I receive the following error, yet in the debug console I can see the add_llm response returning 200. I have the same issue with other providers such as Anthropic...

@KevinHuSh I just fixed it on Windows; I'll leave the guide below:

1) Set a user environment variable for Ollama. Example:
   Variable name: OLLAMA_HOST
   Variable value: 0.0.0.0:11434
   Guide link: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server...
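
A quick way to confirm the variable took effect is to probe the Ollama API. This is only a minimal sketch, not part of the original guide: it assumes the `OLLAMA_HOST` value from the example above and uses Ollama's documented `/api/tags` endpoint that lists local models.

```python
# verify_ollama.py - minimal sketch to check that the Ollama server is
# reachable on the host/port configured via OLLAMA_HOST.
import os
import urllib.request

# Fall back to the value from the guide above if OLLAMA_HOST is not set.
host = os.environ.get("OLLAMA_HOST", "0.0.0.0:11434")

# 0.0.0.0 is a bind address, not a destination; probe via localhost instead
# (or replace with the machine's LAN IP when checking from another host/container).
probe_host = host.replace("0.0.0.0", "127.0.0.1")
url = f"http://{probe_host}/api/tags"  # Ollama endpoint listing local models

try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        print(f"Ollama reachable at {url} (HTTP {resp.status})")
except OSError as exc:
    print(f"Could not reach Ollama at {url}: {exc}")
```

If this prints HTTP 200 locally but the other application still cannot connect, the remaining problem is usually network-level (firewall, Docker networking, or using `localhost` from inside a container instead of the host's address).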