AutoGPT
In a new installation, the Ollama integration doesn't work correctly
⚠️ Search for existing issues first ⚠️
- [X] I have searched the existing issues, and there is no existing issue for my problem
Which Operating System are you using?
Windows Subsystem for Linux (WSL)
Which version of AutoGPT are you using?
Master (branch)
What LLM Provider do you use?
Other (detail in issue)
Which area covers your issue best?
Agents
What commit or version are you using?
06b403f2b05090546ac6edc96e96b2f74602fd9c
Describe your issue.
After doing a full install of AutoGPT (advanced setup), I'm trying to follow the documentation to set up the Ollama integration. I follow the documentation exactly, but when I select the appropriate model from the block's dropdown menu, I get an error saying that no credentials have been entered.
See the attached image for confirmation that I'm selecting the correct models from the dropdown.
Upload Activity Log Content
No response
Upload Error Log Content
No response
Hi, can I work on this issue? Please assign it to me :) @ntindle
Hi @deepchanddc22,
Thanks for your interest, that would be great! Looking forward to your contribution.
Hey @deepchanddc22, just checking in: were you able to get to this issue, or did you run into any problems?
I need to update the docs for Ollama. I found a setup issue: we say to use 0.0.0.0 and then the port, but we can't do that because everything runs in Docker, so it has to use the external IP of the system running Ollama to connect. I made a little guide here.
It should not be asking for credentials, though; if it is, let me know.
1. Stop Ollama if it is currently running.
2. Open Command Prompt (CMD) and run:
   `set OLLAMA_HOST=0.0.0.0:11434`
3. Start the Ollama server with:
   `ollama serve`
   This will make Ollama accessible on the local network.
4. Open a new CMD window and find your local IP using:
   `ipconfig`
   Look for your IPv4 address (e.g., 192.168.0.39).
5. Set the Ollama host to your local IP with port 11434:
   `192.168.0.39:11434`
6. Now, try running your graph or model again (a quick connectivity check is sketched below).
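Before re-running anything, one sanity check (my addition, not part of the original guide) is to hit Ollama's `/api/tags` endpoint at the address from step 5; getting back a JSON list of models confirms the host and port are reachable:

```shell
# Replace 192.168.0.39 with the IPv4 address you found via ipconfig.
# If the host/port are correct, this returns a JSON list of locally pulled models.
curl http://192.168.0.39:11434/api/tags
```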
Note: Not all models work with Ollama. Here are the models that do:
- llama3.2
- llama3
- llama3.1:405b
- dolphin-mistral:latest
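A model also has to be pulled locally before Ollama can serve it; for example, fetching llama3.2 with the standard Ollama CLI looks like this:

```shell
# Download the model so Ollama can serve it.
ollama pull llama3.2

# Optional: list installed models to confirm.
ollama list
```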
Bentlybro's fix works brilliantly; the only adjustment I needed was to enter
`export OLLAMA_HOST=0.0.0.0:11434`
instead of `set`.
I'm happy with this solution, thanks!
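For anyone else landing here: `set VAR=value` only works in Windows CMD, which is why the WSL user above needed `export` instead. A quick per-shell reference (the PowerShell form is added here for completeness, not taken from the thread):

```shell
# Windows CMD
set OLLAMA_HOST=0.0.0.0:11434

# PowerShell (assumed equivalent, not from the thread)
$env:OLLAMA_HOST = "0.0.0.0:11434"

# Linux / macOS / WSL (bash, zsh)
export OLLAMA_HOST=0.0.0.0:11434
```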