[Bug]: Issue when running with Portainer
Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://github.com/OpenDevin/OpenDevin/blob/main/docs/guides/Troubleshooting.md
- [X] I have checked the existing issues.
Describe the bug
When building a Portainer stack with Ollama and Open WebUI, those two services work as intended, but OpenDevin throws an error:
No connection adapters were found for http://localhost:11434"/api/generate
It seems related to OpenDevin sandbox creation: after the stack is created, the OpenDevin sandbox does not receive an IP address and port assignment.
The problem exists in versions main and 0.3.1.
Current Version
main, 0.3.1
Installation and Configuration
Here is my stack definition:
[Stackdefinition.txt](https://github.com/OpenDevin/OpenDevin/files/15051662/Stackdefinition.txt)
Model and Agent
No response
Reproduction Steps
No response
Logs, Errors, Screenshots, and Additional Context
@githubuserAL2024 I see this in your message
>>> No connection adapters were found for '"http://localhost:11434"/api/generate' <<<
There are some weird quotes inside the URL there--that looks suspicious to me
Hi rbren, forget about the outer quotes, I put them there while trying to format the text. The clean message was: No connection adapters were found for http://localhost:11434"/api/generate
I tried to find /api/generate in OpenDevin, OpenDevin_sandbox, and Ollama, without success. By the way, I updated my initial comment.
🤔 there's still a double-quote in your "clean message"
You are right, the quotes in the error message are weird. I tried changing the LLM_BASE_URL definition as follows:
| `LLM_BASE_URL` definition | Error message |
| --- | --- |
| `LLM_BASE_URL="http://localhost:11434"` | `'"http://localhost:11434"/api/generate'` |
| `LLM_BASE_URL='http://localhost:11434'` | `" 'http://localhost:11434'/api/generate"` |
| `LLM_BASE_URL= http://localhost:11434` | `'NoneType' object has no attribute 'request'` |
| `LLM_BASE_URL= http://localhost:11434/` | `'"http://localhost:11434/"/api/generate'` |
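For what it's worth, the first, second, and fourth rows are consistent with a docker-compose quoting gotcha: in the list form of `environment:`, everything after `=` is part of the YAML scalar, so surrounding quotes end up inside the variable's value. Below is a minimal sketch of the two syntaxes; the `demo-*` services are hypothetical and exist only to illustrate the difference:

```yaml
# Hypothetical two-service sketch showing how quoting differs between the two
# `environment:` syntaxes; not part of the actual stack definition.
services:
  demo-list-form:
    image: alpine:latest
    environment:
      # List form: the quotes are part of the scalar, so inside the container
      # LLM_BASE_URL is the 24-character string "http://localhost:11434"
      # (quotes included), and appending /api/generate yields the broken URL.
      - LLM_BASE_URL="http://localhost:11434"
  demo-mapping-form:
    image: alpine:latest
    environment:
      # Mapping form: YAML parses the quotes away, so inside the container
      # LLM_BASE_URL is the bare URL http://localhost:11434.
      LLM_BASE_URL: "http://localhost:11434"
```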
Below is my initial stack definition file:
```yaml
version: '3.8'
services:
  ollama:
    volumes:
      - /home/alex-2022/ollama_stack/models:/root/.ollama
    image: ollama/ollama:latest
    container_name: ollama
    restart: always
    ports:
      - 11434:11434
    environment:
      - NVIDIA_VISIBLE_DEVICES=GPU-27dc6e4f-5c02-ebc1-d7fa-0b56ae3c10f0
    #devices:
    #  - /dev/nvidiactl:/dev/nvidiactl
    #  - /dev/nvidia-uvm:/dev/nvidia-uvm
    #  - /dev/nvidia-uvm-tools:/dev/nvidia-uvm-tools
    #networks:
    #  ollama:
    #    ipv4_address: 192.168.50.114
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    restart: always
    ports:
      - "3501:8080"
    volumes:
      - /home/alex-2022/ollama_stack/web_ui:/app/backend/data
    extra_hosts:
      - "host.docker.internal:host-gateway"

  opendevin:
    image: ghcr.io/opendevin/opendevin:main
    container_name: opndev_test
    restart: always
    ports:
      - "3000:3000"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - LLM_API_KEY="ollama"
      - LLM_BASE_URL="http://localhost:11434/"
      - LLM_MODEL="ollama/dolphincoder:15B"
      - LLM_EMBEDDING_MODEL="dolphincoder:15B"
      - MAX_ITERATIONS=5
      - WORKSPACE_DIR=$(pwd)/home/alex-2022/ollama_stack/OpenDevin/workspaces
      - SANDBOX_TYPE=exec
    volumes:
      - /home/alex-2022/ollama_stack/OpenDevin/workspaces:/opt/workspace_base
      - /var/run/docker.sock:/var/run/docker.sock
```
Screenshot of Portainer container details:
I will do further testing when I come back from work.
Looks like maybe you just want to remove the quotes in your stack definition YAML.
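For reference, here is a sketch of the `opendevin` environment block with the quotes removed. It is untested and kept otherwise as close as possible to the stack definition above; the comments flag the assumptions:

```yaml
    environment:
      - LLM_API_KEY=ollama
      # Unquoted value. Note that from inside the container, localhost is the
      # opendevin container itself, so host.docker.internal (mapped via
      # extra_hosts above) may be needed to reach Ollama (an assumption that
      # depends on the host network setup).
      - LLM_BASE_URL=http://localhost:11434
      - LLM_MODEL=ollama/dolphincoder:15B
      - LLM_EMBEDDING_MODEL=dolphincoder:15B
      - MAX_ITERATIONS=5
      # $(pwd) is shell command substitution, which docker-compose does not
      # expand; dropping it here is an assumption about the intended path.
      - WORKSPACE_DIR=/home/alex-2022/ollama_stack/OpenDevin/workspaces
      - SANDBOX_TYPE=exec
```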
This looks like an issue with the way env vars were set. Going to close this one.