Bushra

4 comments by Bushra

Have you tried setting the local LLM as the default from the user interface?

I managed to get it working with the official Ollama image (`ollama/ollama`) rather than `litellm/ollama`. Also (if you still haven't), try adding

```
extra_hosts:
  - "host.docker.internal:host-gateway"
```

on the ollama service to allow...
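For context, here is a minimal sketch of what that compose entry might look like. Only the `ollama/ollama` image and the `extra_hosts` entry come from the comment above; the service name and port mapping are assumptions.

```yaml
# Hypothetical docker-compose.yml excerpt (service name and ports are assumptions).
services:
  ollama:
    image: ollama/ollama              # official image, not litellm/ollama
    ports:
      - "11434:11434"                 # Ollama's default API port
    extra_hosts:
      # Maps host.docker.internal to the host gateway (Docker Engine 20.10+),
      # so the container can reach services running on the host.
      - "host.docker.internal:host-gateway"
```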

Yeah, there is: https://docs.danswer.dev/gen_ai_configs/ollama

Specify the API base in the UI; it should be the same as `GEN_AI_API_ENDPOINT`, so try http://host.docker.internal:11434
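As a rough sketch, the matching environment setting in the compose file could look like the snippet below. Only `GEN_AI_API_ENDPOINT` and the URL are from this thread; the service name is an assumption.

```yaml
# Hypothetical excerpt: keep this value in sync with the API base entered in the UI.
services:
  api_server:                         # assumed name of the Danswer backend service
    environment:
      - GEN_AI_API_ENDPOINT=http://host.docker.internal:11434
```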