zlw123

Results: 1 issue from zlw123

Using a local llama3 model via Ollama; OpenDevin in Docker is started with:

docker run \
    --add-host host.docker.internal=host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=D:/opendevin/workspace \
    -v D:/opendevin/workspace:/opt/workspace_base \
    -v D:/opendevin/workspace/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ...
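Not part of the original report, just a hedged sanity check: before debugging OpenDevin itself, it can help to confirm that the host-gateway mapping actually lets a container reach the Ollama server at http://host.docker.internal:11434. The throwaway curlimages/curl image below is an assumption; any image with curl works.

# Verify that a container using the same --add-host mapping can reach Ollama
# on the host. A JSON list of installed models (e.g. llama3) means the
# networking side is fine; a connection error points at the host mapping or
# at Ollama not listening on 11434.
docker run --rm --add-host host.docker.internal=host-gateway \
    curlimages/curl http://host.docker.internal:11434/api/tags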

question