
AnythingLLM container not able to communicate with local Ollama/LocalAI models running on localhost on Ubuntu

bokey007 opened this issue 1 year ago • 4 comments

It's the same issue for me.

```
Primary server in HTTP mode listening on port 3001
TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11730:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async ollamaAIModels (/app/server/utils/helpers/customModels.js:84:18)
    at async getCustomModels (/app/server/utils/helpers/customModels.js:20:14)
    at async /app/server/endpoints/system.js:769:35 {
  cause: Error: getaddrinfo ENOTFOUND host.docker.internal
      at GetAddrInfoReqWrap.onlookup [as oncomplete] (node:dns:107:26) {
    errno: -3008,
    code: 'ENOTFOUND',
    syscall: 'getaddrinfo',
    hostname: 'host.docker.internal'
  }
}
```

@timothycarambat I am facing this issue with both Ollama and LocalAI.

I think this problem is specific to Linux machines.

Originally posted by @bokey007 in https://github.com/Mintplex-Labs/anything-llm/issues/495#issuecomment-1890304226
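
For context: on Linux, Docker Engine does not add host.docker.internal automatically the way Docker Desktop does, but on Docker Engine 20.10+ you can map it yourself with `--add-host`. A minimal sketch (the image name below is only a placeholder; adjust the port and image to your setup):

```bash
# Minimal sketch: explicitly map host.docker.internal on a Linux host
# (requires Docker Engine 20.10+; image name is a placeholder)
docker run -d \
  --add-host=host.docker.internal:host-gateway \
  -p 3001:3001 \
  mintplexlabs/anythingllm
```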

bokey007 avatar Jan 13 '24 12:01 bokey007

(screenshot attached: Screenshot 2024-01-13 235745)

Did you try this? It solved the issue for me.

sumitsodhi88 avatar Jan 13 '24 18:01 sumitsodhi88

Hi @sumitsodhi88, appreciate the response. Yes, I did try that, but it did not work.

What operating system were you working on?

bokey007 avatar Jan 13 '24 18:01 bokey007

> Hi @sumitsodhi88, appreciate the response. Yes, I did try that, but it did not work.
>
> What operating system were you working on?

I am using Windows 11. I am a complete newbie, so all I did was format my PC, install WSL, run one command for NVIDIA support, and install Ollama in Docker only. Can't tell what I did right 🤣

Maybe if you install Ollama in Docker it might work.
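
If you do run both in Docker, one approach (a sketch only, with example image and container names; not verified against this exact setup) is to put the two containers on the same user-defined network so AnythingLLM can reach Ollama by its container name:

```bash
# Sketch: run Ollama and AnythingLLM on a shared Docker network
# (container names are examples; adjust volumes and ports to your setup)
docker network create llm-net

docker run -d --name ollama --network llm-net \
  -v ollama:/root/.ollama -p 11434:11434 ollama/ollama

docker run -d --name anythingllm --network llm-net \
  -p 3001:3001 mintplexlabs/anythingllm

# Inside AnythingLLM, point the Ollama base URL at http://ollama:11434
```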

sumitsodhi88 avatar Jan 13 '24 18:01 sumitsodhi88

This issue is certainly on your side and has to do with networking. host.docker.internal is a special name that, when used within a Docker container, allows it to access the host system's localhost; for that to work, both applications need to be running on the same machine. Alternatively, you can put the host machine's local IP as the address and it should still resolve. ~I do not think host.docker.internal works on Ubuntu machines, as it's part of the Docker engine.~
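
To illustrate the host-IP alternative: if Ollama runs natively on the Linux host, it binds to 127.0.0.1 by default, so the container cannot reach it via the LAN IP unless Ollama listens on all interfaces. A rough sketch (OLLAMA_HOST is Ollama's standard environment variable for the bind address; the URL in step 3 is an example):

```bash
# Sketch, assuming Ollama is installed natively on the Linux host

# 1) Find the host's LAN IP (interface names vary by machine)
ip addr show | grep "inet "

# 2) Make Ollama listen on all interfaces instead of only 127.0.0.1
OLLAMA_HOST=0.0.0.0 ollama serve

# 3) In AnythingLLM, set the Ollama base URL to http://<host-LAN-IP>:11434
```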

timothycarambat avatar Jan 13 '24 19:01 timothycarambat

If it helps with anything, the host.docker.internal address works for me on my Linux machine, in my case Fedora 39 and LM Studio.
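
For anyone else debugging this, a quick way to check whether the name actually resolves from inside the running container (a sketch; the container name is an example, and curl/getent are not guaranteed to be present in every image):

```bash
# Check DNS resolution and connectivity from inside the AnythingLLM container
docker exec -it anythingllm getent hosts host.docker.internal
docker exec -it anythingllm curl -s http://host.docker.internal:11434
# Ollama's root endpoint replies "Ollama is running" when it is reachable
```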

Botoni avatar Jan 18 '24 01:01 Botoni