pegostar

4 comments by pegostar

Changing -e LLM_BASE_URL="http://localhost:11434" to -e LLM_BASE_URL="http://host.docker.internal:11434" did not help. It keeps giving:
09:40:18 - PLAN
write a bash script that prints hello
09:40:20 - ACTION
AgentThinkAction(thought="Let's review the previous actions...

@rbren I use Windows, and I run this command: docker run -e LLM_API_KEY="" -e LLM_MODEL="ollama/gemma:2b" -e LLM_EMBEDDING_MODEL="gemma:2b" -e LLM_BASE_URL="http://localhost:11434" -e WORKSPACE_DIR="C:\Projects\IA\Workspace" -e SANDBOX_TYPE="exec" -e WORKSPACE_MOUNT_PATH="C:\Projects\IA\Workspace" -v "C:\Projects\IA\Workspace":/opt/workspace_base -v /var/run/docker.sock:/var/run/docker.sock -p...
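For readability, here is a sketch of that same invocation reformatted with the host.docker.internal base URL that was tried later. This is not a verified fix, just the command laid out one flag per line; the port mapping is truncated in the original, so <host_port>:<container_port> below is a placeholder, and the POSIX backslash continuations would need to be replaced with ^ in Windows cmd or ` in PowerShell.

```shell
# Sketch only: same flags as the original command, with the base URL
# pointed at host.docker.internal so the container can reach Ollama
# running on the Windows host. <host_port>:<container_port> is a
# placeholder for the -p value truncated in the original comment.
docker run \
  -e LLM_API_KEY="" \
  -e LLM_MODEL="ollama/gemma:2b" \
  -e LLM_EMBEDDING_MODEL="gemma:2b" \
  -e LLM_BASE_URL="http://host.docker.internal:11434" \
  -e WORKSPACE_DIR="C:\Projects\IA\Workspace" \
  -e SANDBOX_TYPE="exec" \
  -e WORKSPACE_MOUNT_PATH="C:\Projects\IA\Workspace" \
  -v "C:\Projects\IA\Workspace":/opt/workspace_base \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p <host_port>:<container_port> ...
```

Inside a container, localhost resolves to the container itself, which is why host.docker.internal (Docker Desktop's alias for the host machine) is the usual choice on Windows.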

@gtsop-d If you prefer, you can open a new bug. I chose gemma2 because it was the fastest among the models. If I use another model the problem does not...

@enyst If I call the Ollama API directly, it is responsive. See the beginning of the messages.
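A quick way to check this kind of responsiveness, assuming Ollama is on its default port 11434, is to hit its model-listing endpoint from both sides of the container boundary (a diagnostic sketch, not something from the original thread):

```shell
# From the Windows host: should return a JSON list of installed models
# if Ollama is up and listening on the default port.
curl http://localhost:11434/api/tags

# From inside the OpenDevin container, localhost is the container itself,
# so the same check must go through Docker Desktop's host alias instead.
curl http://host.docker.internal:11434/api/tags
```

If the first call succeeds but the second fails, the problem is container-to-host networking rather than Ollama.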