
Ollama fails on macOS

Open · slum44 opened this issue 1 year ago · 10 comments

**Describe the bug**
I am following this tutorial: https://blog.getwren.ai/how-to-use-meta-llama-3-to-query-mysql-database-using-ollama-on-your-machine-2c087b204e41 and I get the following error when trying to ask a question (screenshot attached).

**To Reproduce**
Steps to reproduce the behavior:

  1. Follow the blog steps to install. Note that it is missing the part about Docker Desktop (not everyone has this by default). My data model loaded fine.
  2. Click on Ask
  3. See error

**Expected behavior**
I expect Wren to answer my question.

**Screenshots**
Ollama is running successfully (screenshot attached).
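For reference, a quick way to confirm Ollama is reachable from the host (assuming the default port 11434):

```sh
# Ollama's root endpoint replies with a short status string
curl http://localhost:11434
# expected output: Ollama is running
```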

**Container Logs**
You can execute the following command to get the logs of the containers and provide them here:

docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log

**Desktop (please complete the following information):**

  • OS: macOS
  • Browser: Chrome

**Wren AI Information**

  • Version: 0.7.1
  • LLM_PROVIDER= ollama_llm
  • GENERATION_MODEL= llama3:70b

**Additional context**

Logs and my env.ai are attached. I think the issue is similar to https://github.com/Canner/WrenAI/issues/494

I noticed that the docker-compose file doesn't have any references to Ollama even though I chose Ollama as a custom provider during setup. I've also included the docker compose file.
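For reference, a quick way to check (assuming the generated file is the attached docker-compose.llm.yaml):

```sh
# Look for any Ollama-related settings in the generated compose file
grep -i ollama docker-compose.llm.yaml
```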

Attachments: wrenai-wren-ui.log, wrenai-wren-engine.log, wrenai-wren-ai-service.log, wrenai-ibis-server.log, docker-compose.llm.yaml.zip, env.ai.zip

Note that I also got a button / prompt to redeploy, but that also failed.

slum44 · Jul 12 '24 16:07

@slum44 hi, thanks for reaching out! I've checked your .env.ai, and the likely cause of the error is the value of OLLAMA_URL. It should be the default value http://host.docker.internal:11434, not http://localhost:11434. Since Wren AI runs inside Docker, it can't reach Ollama directly via localhost; host.docker.internal is the hostname Docker provides for accessing localhost on the host machine.
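Concretely, it's a one-line change in .env.ai (key name per your attached env.ai):

```sh
# .env.ai — point Wren AI at Ollama via Docker's host alias
# before: OLLAMA_URL=http://localhost:11434
OLLAMA_URL=http://host.docker.internal:11434
```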

cyyeh · Jul 12 '24 17:07

@cyyeh thanks for responding so quickly. I've changed my .env.ai file as you advised and restarted the docker container, but the error is still the same.

(screenshots attached)

slum44 · Jul 12 '24 17:07

@slum44 could you try restarting the launcher again instead of just restarting the failed container?
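Roughly like this (a sketch; the wrenai compose project name is inferred from the container names above, and the launcher script name is from your setup):

```sh
# Tear down the whole stack, then rerun the launcher so it regenerates the config
docker compose -p wrenai down
./wren-launcher-darwin.sh
```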

cyyeh · Jul 12 '24 17:07

@cyyeh absolutely, I'm happy to try. I reran wren-launcher-darwin.sh and the terminal output looks fine (screenshot below), but unfortunately I still get the same error in the browser.

(screenshot attached)

slum44 · Jul 12 '24 17:07

@slum44 could you go to the modeling page and try to redeploy the model? (There should be a deploy button at the top right of the modeling page.)

Could you also provide the ai-service logs again? Thanks!

cyyeh · Jul 12 '24 17:07

@cyyeh done. This time there's a new error (screenshot below).

If it helps, Wren sees the database fine; I can see all the tables on the left-hand side of the modeling page.

(screenshot attached)

slum44 · Jul 12 '24 17:07

Also, it seems something is wrong with port 5555: there appears to already be a process running on it.

Or would you mind joining our Discord server (https://discord.gg/5DvshJqG8Z) so we can take a look together?

cyyeh · Jul 12 '24 18:07

Oh, I think you need to stop the process running on port 5555 first, and then restart the launcher.
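On macOS you can find and stop it like this:

```sh
# Find the process bound to port 5555, note its PID, then stop it
lsof -i :5555
kill <PID>   # replace <PID> with the process ID from the lsof output
```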

cyyeh · Jul 12 '24 18:07

@slum44 I think we can add functionality to automatically pull the Ollama models users chose if they aren't pulled yet! The user experience would be much better!
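In the meantime, pulling the model manually before starting Wren AI should avoid this (model name taken from your config above):

```sh
# Fetch the configured generation model so Ollama has it locally
ollama pull llama3:70b
ollama list   # confirm llama3:70b shows up
```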

cyyeh · Jul 12 '24 21:07

@slum44 could I close this issue?

cyyeh · Jul 15 '24 08:07