
Local llama3, but log shows gpt-3.5-turbo

Open zlw123 opened this issue 1 year ago • 4 comments

Using llama3 locally; OpenDevin in Docker is started with:

docker run \
    --add-host host.docker.internal=host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=D:/opendevin/workspace \
    -v D:/opendevin/workspace:/opt/workspace_base \
    -v D:/opendevin/workspace/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main
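Aside from the model name, one likely problem in the command above: the second `-v` line mounts a path under the Windows workspace (`D:/opendevin/workspace/docker.sock`) onto `/var/run/docker.sock`, so the container sees a placeholder file rather than the host's Docker daemon socket. A hedged sketch of a corrected invocation, assuming Docker Desktop exposes the daemon socket at `/var/run/docker.sock` (its default):

```shell
# Sketch only -- not verified against the OpenDevin docs of that date.
# The docker.sock mount should point at the host daemon socket, not at
# a file inside the Windows workspace directory.
docker run \
    --add-host host.docker.internal:host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=D:/opendevin/workspace \
    -v D:/opendevin/workspace:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main
```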

error:

root@555c6b64db3f:/app/logs# cat opendevin_2024-04-27.log
12:11:49 - opendevin:ERROR: auth.py:31 - Invalid token
12:11:49 - opendevin:INFO: listen.py:75 - Invalid or missing credentials, generating new session ID: d59448f4-6979-4cbe-9c3e-a34f3c4bea4f
12:11:50 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM gpt-3.5-turbo
12:11:50 - opendevin:INFO: llm.py:52 - Initializing LLM with model: gpt-3.5-turbo
12:11:50 - opendevin:ERROR: ssh_box.py:69 - Please check Docker is running using docker ps.
12:11:50 - opendevin:ERROR: agent.py:155 - Error creating controller: Error while fetching server API version: ('Connection aborted.', ConnectionRefusedError(111, 'Connection refused'))
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 793, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 496, in _make_request
    conn.request(
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 400, in request
    self.endheaders()
  File "/usr/local/lib/python3.12/http/client.py", line 1331, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.12/http/client.py", line 1091, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.12/http/client.py", line 1035, in send
    self.connect()
  File "/app/.venv/lib/python3.12/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
ConnectionRefusedError: [Errno 111] Connection refused
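The `ConnectionRefusedError: [Errno 111]` at the bottom of the traceback is the Docker SDK failing to reach the daemon socket from inside the container. A minimal sketch of that failure mode: connecting to a Unix socket file that exists but has no daemon listening behind it (as can happen with a wrong `-v` mount) is refused with errno 111, whereas a missing file raises `FileNotFoundError` instead. `probe_unix_socket` is an illustrative helper, not OpenDevin code.

```python
import os
import socket
import tempfile

def probe_unix_socket(path: str) -> str:
    """Classify what a client sees when connecting to a Unix socket path."""
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(path)
        return "ok"
    except ConnectionRefusedError:
        # The file exists, but nothing is listening -- errno 111,
        # exactly what the Docker SDK reports in the log above.
        return "connection refused"
    except FileNotFoundError:
        return "missing"
    finally:
        s.close()

# Simulate a stale socket file with no daemon behind it.
tmp = tempfile.mkdtemp()
dead = os.path.join(tmp, "docker.sock")
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(dead)
srv.close()  # the file remains on disk, but nobody is listening

print(probe_unix_socket(dead))  # → connection refused
```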

zlw123 avatar Apr 27 '24 12:04 zlw123

@zlw123 You need to set the model in the settings modal in the UI. There's a gear wheel in the bottom right.

rbren avatar Apr 27 '24 13:04 rbren

@zlw123 You will need to use ollama/llama3 as the model name.

  1. The gear is in the bottom right. (screenshot)

  2. The model field in the settings modal. You might need to type ollama/llama3 manually because it does not appear in the list; it also does not currently work with plain llama3, you need the full ollama/llama3. (screenshot)
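The provider prefix matters because the model string doubles as routing information: the part before the slash selects the backend. A toy sketch of that convention (`split_model` is a hypothetical helper for illustration, not LiteLLM's or OpenDevin's actual code, and the fallback provider shown is an assumption):

```python
def split_model(model: str) -> tuple[str, str]:
    """Split a 'provider/model' string; bare names fall through to a default."""
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    # A bare name like "llama3" carries no provider prefix, which is why
    # it does not get routed to the local Ollama server.
    return "openai", model

print(split_model("ollama/llama3"))  # → ('ollama', 'llama3')
print(split_model("llama3"))         # → ('openai', 'llama3')
```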

isavita avatar Apr 27 '24 15:04 isavita

Can you please confirm that you have used llama3 this way, @isavita?

enyst avatar Apr 27 '24 15:04 enyst

@enyst This is from my terminal log:

INFO:     connection open
Starting loop_recv for sid: 1120a483-1c24-427d-b1fc-a06942e70053
INFO:     192.168.65.1:64810 - "GET /api/refresh-files HTTP/1.1" 200 OK
15:20:23 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM ollama/llama3
15:20:23 - opendevin:INFO: llm.py:52 - Initializing LLM with model: ollama/llama3
15:20:23 - opendevin:INFO: ssh_box.py:353 - Container stopped
15:20:23 - opendevin:WARNING: ssh_box.py:365 - Using port forwarding for Mac OS. Server started by OpenDevin will not be accessible from the host machine at the moment. See https://github.com/OpenDevin/OpenDevin/issues/897 for more information.
15:20:23 - opendevin:INFO: ssh_box.py:373 - Mounting workspace directory: /Users/isavita/code/workspace
15:20:23 - opendevin:INFO: ssh_box.py:396 - Container started
15:20:24 - opendevin:INFO: ssh_box.py:413 - waiting for container to start: 1, container status: running
15:20:25 - opendevin:INFO: ssh_box.py:178 - Connecting to [email protected] via ssh. If you encounter any issues, you can try `ssh -v -p 53053 [email protected]` with the password 'e9510b7f-aa92-490a-92e6-494f9461658d' and report the issue on GitHub.
INFO:     192.168.65.1:64810 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     192.168.65.1:64814 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     192.168.65.1:64814 - "GET /api/agents HTTP/1.1" 200 OK


==============
STEP 0

15:21:48 - PLAN
write me working snake game
15:22:02 - ACTION
FileReadAction(path='/workspace/snake_game.js', start=0, end=-1, thoughts='', action='read')
15:22:02 - OBSERVATION
File not found: /workspace/snake_game.js



==============
STEP 1

15:22:02 - PLAN
write me working snake game
15:22:09 - ACTION
AgentThinkAction(thought='I should probably install a package manager like npm to manage my dependencies for the snake game. I can then use it to install any necessary packages.', action='think')


==============
STEP 2

15:22:09 - PLAN
write me working snake game
15:22:17 - ACTION
AgentThinkAction(thought="I should probably start by installing Node.js, since it's a popular choice for building JavaScript applications. And then I can use npm to install the necessary packages for the snake game.", action='think')


==============
STEP 3

15:22:17 - PLAN
write me working snake game
15:22:24 - ACTION
AgentThinkAction(thought='I should probably start by setting up Node.js on my system. I can use `apt` to install it.', action='think')

....

isavita avatar Apr 27 '24 15:04 isavita

Looks like this was solved!

rbren avatar May 02 '24 02:05 rbren