Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError....)
I am using Ollama with the llama2:latest model locally. When I attempt to issue instructions through the OpenDevin UI, I see the following error:
AGENT ERROR:
HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f2d27d0c310>: Failed to establish a new connection: [Errno 111] Connection refused'))
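For reference, a quick way to check whether anything is actually listening on that port (a minimal sketch, assuming the default Ollama address http://localhost:11434; the root path simply returns "Ollama is running" when the server is up):

import requests

# Probe the Ollama server at the host/port from the error above.
# A ConnectionError here means nothing is listening on the port,
# i.e. the Ollama server itself is not running.
try:
    resp = requests.get("http://localhost:11434", timeout=5)
    print(resp.status_code, resp.text)  # expected: 200 "Ollama is running"
except requests.exceptions.ConnectionError as exc:
    print("Ollama is not reachable:", exc)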
Here is my config.toml content:
LLM_API_KEY="na"
WORKSPACE_DIR="./workspace"
LLM_BASE_URL="http://localhost:11434"
LLM_MODEL="ollama/llama2"
LLM_EMBEDDING_MODEL="llama2"
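To rule out a problem with the config itself, the failing request can also be replayed directly against Ollama, outside OpenDevin (a sketch, assuming the /api/embeddings endpoint accepts a model name and prompt as below, and that llama2 has already been pulled):

import requests

# Reproduce the failing embeddings call directly. If this also fails with
# "Connection refused", the issue is Ollama not running, not config.toml.
payload = {"model": "llama2", "prompt": "hello world"}
resp = requests.post("http://localhost:11434/api/embeddings", json=payload, timeout=60)
resp.raise_for_status()
print(len(resp.json()["embedding"]), "dimensions returned")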
Thoughts?
Same for me; any update?
Hey all, try this guide: https://github.com/OpenDevin/OpenDevin/blob/main/docs/documentation/LOCAL_LLM_GUIDE.md
If you continue to run into issues, feel free to open a new bug.
Link doesn't work
@hpdhillon Please see here, the documentation has moved to our documentation site: https://opendevin.github.io/OpenDevin/modules/usage/llms/localLLMs
@m2web I had the same issue and then realised that Ollama was not running on my local machine. I had forgotten to start it, so OpenDevin could not get a response from the API endpoint.