
macOS Ollama: failed to create asking task + timeout

Open · emrecengdev opened this issue on Jul 31, 2024 · 2 comments

Describe the bug
I am following this tutorial: https://blog.getwren.ai/how-to-use-meta-llama-3-to-query-mysql-database-using-ollama-on-your-machine-2c087b204e41 I get this error when I enter a prompt and ask a question:

[Screenshot: CleanShot 2024-07-31 at 17 16 07 (error shown when asking a question)]

To Reproduce
Steps to reproduce the behavior:

  1. Follow the blog's installation steps. Note that the blog is missing the part about installing Docker Desktop (not everyone has it by default). My data model loaded fine.
  2. Click on Ask.
  3. See the error.

Expected behavior
I expect Wren AI to answer my question.

Screenshots
Ollama is running successfully:
[Screenshot: CleanShot 2024-07-31 at 17 19 42]

Desktop (please complete the following information):

  • OS: macOS Sonoma
  • Browser: Arc (Chromium-based)

Wren AI Information

  • Version: 0.7.1
  • LLM_PROVIDER= ollama_llm
  • GENERATION_MODEL= llama3

Additional context
Unlike for other users, I noticed that after the UI service logged its startup info, the other parts (the AI service, etc.) never came up. After waiting a long time, I got a timeout error.

[Screenshot: CleanShot 2024-07-31 at 17 13 55 (service startup output before the timeout)]

Logs and my .env.ai are attached. I think the issue is similar to #511.

Attached logs: wrenai-ibis-server.log, wrenai-wren-ai-service.log, wrenai-wren-engine.log, wrenai-wren-ui.log

Here is the .env.ai file: env.ai.txt

Here is the .env.dev file: env.dev.txt

emrecengdev · Jul 31 '24 14:07

@emrecengdev thanks for reaching out! Sorry, I couldn't download the env.ai.txt file; it returned a 404. Also, from the wren ai service log, it seems you were using the OpenAILLM provider instead of the Ollama provider. Please check the value of LLM_PROVIDER in the .env.ai file. Thank you

cyyeh · Jul 31 '24 14:07
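
For reference, the Ollama-related lines in .env.ai on the 0.7.x line look roughly like the sketch below, based only on the values quoted in this issue. Other keys (the Ollama endpoint, embedder settings, and so on) vary by release and should be copied from the example env file that ships with your version rather than from this sketch.

```
# Sketch only: values taken from this issue; verify against the example env
# file shipped with your WrenAI release before use.
# If the AI service log mentions OpenAILLM, this line is likely missing or
# still set to the OpenAI provider.
LLM_PROVIDER=ollama_llm
# Must match a model already pulled in Ollama (e.g. `ollama pull llama3`).
GENERATION_MODEL=llama3
```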

@emrecengdev you're welcome to try the latest release: https://github.com/Canner/WrenAI/releases/tag/0.13.2. Please follow the user guide linked below to use a custom LLM such as Ollama. We now support LiteLLM, which means you can connect to more LLM providers. Enjoy :)

https://docs.getwren.ai/oss/installation/custom_llm#running-wren-ai-with-your-custom-llm-embedder-or-document-store

cyyeh · Dec 23 '24 08:12
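
For anyone landing here on a newer release, the llm entry for Ollama in the LiteLLM-based configuration looks roughly like the sketch below. It is illustrative only: the exact keys, file layout, and model naming should be taken from the custom-LLM guide linked above, and the api_base shown assumes Ollama runs on the host machine while WrenAI runs in Docker Desktop.

```yaml
# Illustrative sketch only; check every key against the custom-LLM guide linked above.
type: llm
provider: litellm_llm
models:
  - model: ollama_chat/llama3                    # LiteLLM-style id for an Ollama model
    api_base: http://host.docker.internal:11434  # Ollama endpoint as seen from inside Docker
    timeout: 120
```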