macOS Ollama: failed to create asking task + timeout
Describe the bug
I am following this tutorial: https://blog.getwren.ai/how-to-use-meta-llama-3-to-query-mysql-database-using-ollama-on-your-machine-2c087b204e41 I get this error when I enter a prompt and ask a question:
To Reproduce Steps to reproduce the behavior:
- Follow the blog steps to install. Note that it is missing the part about Docker Desktop (not everyone has this installed by default). My data model loaded fine.
- Click on Ask
- See the error
Expected behavior
I expect Wren to answer my question.
Screenshots
Ollama is running successfully
Desktop (please complete the following information):
- OS: macOS Sonoma
- Browser: Arc (Chromium-based)
Wren AI Information
- Version: 0.7.1
- LLM_PROVIDER= ollama_llm
- GENERATION_MODEL= llama3
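For context, the relevant lines of my .env.ai for this setup look roughly like this (only LLM_PROVIDER and GENERATION_MODEL are the values reported above; the commented lines are assumptions based on the blog post's defaults, check your own file):

```
LLM_PROVIDER=ollama_llm
GENERATION_MODEL=llama3
# Assumed defaults from the tutorial; verify in your own .env.ai:
# EMBEDDER_PROVIDER=ollama_embedder
# OLLAMA_URL=http://host.docker.internal:11434
```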
Additional context
Unlike other users, I noticed that after the UI service printed its startup info, the other services (AI service, etc.) never came up. After waiting for a long time, I got a timeout error.
Logs and my .env.ai are attached. I think the issue is similar to #511.
wrenai-ibis-server.log wrenai-wren-ai-service.log wrenai-wren-engine.log wrenai-wren-ui.log
Here is the .env.ai file: env.ai.txt
Here is the .env.dev file: env.dev.txt
@emrecengdev thanks for reaching out! Sorry, I couldn't download the env.ai.txt file; it returned a 404. Also, from the wren-ai-service log, it seems you were using the OpenAILLM provider instead of the Ollama provider. Please check the value of LLM_PROVIDER in your .env.ai file. Thank you
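As a quick sanity check before restarting the services, you can grep the file to see which provider it actually sets. This is a sketch that simulates the check against a sample file; in your setup, run the grep directly against the real .env.ai in the WrenAI launch directory (the path is an assumption, adjust as needed):

```shell
# Simulate with a sample file; replace /tmp/env.ai.sample with your real .env.ai
cat > /tmp/env.ai.sample <<'EOF'
LLM_PROVIDER=ollama_llm
GENERATION_MODEL=llama3
EOF

# Should print the ollama provider line, not an OpenAI one
grep '^LLM_PROVIDER' /tmp/env.ai.sample
```

If the output shows an OpenAI provider instead of `ollama_llm`, the AI service will try to reach OpenAI and can hang until it times out, which matches the symptom above.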
@emrecengdev you're welcome to try the latest release: https://github.com/Canner/WrenAI/releases/tag/0.13.2. Please follow the user guide here to use a custom LLM such as Ollama. We now support LiteLLM, which means you can connect to more LLM providers. Enjoy :)
https://docs.getwren.ai/oss/installation/custom_llm#running-wren-ai-with-your-custom-llm-embedder-or-document-store
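For anyone landing here later, an Ollama entry in the newer LiteLLM-based configuration might look roughly like this. The key names, model prefix, and endpoint are assumptions sketched from memory; treat the custom-LLM guide linked above as the authoritative schema:

```yaml
# Sketch only; see the custom_llm guide for the exact schema.
type: llm
provider: litellm_llm
models:
  - model: ollama_chat/llama3        # LiteLLM-style provider/model name (assumed)
    api_base: http://host.docker.internal:11434  # Ollama endpoint reachable from Docker (assumed)
    timeout: 600
```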