
Docker: Backend dies with error code 132

mackworthy opened this issue · 9 comments

Version: ghcr.io/josh-xt/agent-llm-backend:v1.0.7

Reproduction:

  • docker compose up --build
  • Navigate to web interface
  • Type a task into the "Provide agent with objective" field and submit
  • Backend container unexpectedly exits with code 132
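For context on the number itself: by POSIX convention, Docker reports 128 + N when a container's main process is killed by signal N, so exit code 132 means signal 4, i.e. SIGILL (illegal instruction). A minimal check of that arithmetic:

```python
import signal

exit_code = 132              # container exit code reported by docker compose
sig = exit_code - 128        # codes above 128 mean "killed by signal (code - 128)"
print(signal.Signals(sig).name)  # SIGILL
```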

Logs:

agent-llm-frontend-1  | yarn run v1.22.19
agent-llm-frontend-1  | $ next start
agent-llm-frontend-1  | ready - started server on 0.0.0.0:3000, url: http://localhost:3000
agent-llm-backend-1   | INFO:     Started server process [1]
agent-llm-backend-1   | INFO:     Waiting for application startup.
agent-llm-backend-1   | INFO:     Application startup complete.
agent-llm-backend-1   | INFO:     Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
agent-llm-backend-1   | playsound is relying on another python subprocess. Please use `pip install pygobject` if you want playsound to run more efficiently.
agent-llm-backend-1   | INFO:     172.26.0.1:54466 - "GET /api/agent HTTP/1.1" 200 OK
agent-llm-backend-1   | INFO:     172.26.0.1:54476 - "GET /api/agent/Agent-LLM HTTP/1.1" 200 OK
agent-llm-backend-1   | INFO:     172.26.0.1:54490 - "GET /api/agent/Agent-LLM/command HTTP/1.1" 200 OK
Downloading (…)e9125/.gitattributes: 100%|██████████| 1.18k/1.18k [00:00<00:00, 6.21MB/s]
Downloading (…)_Pooling/config.json: 100%|██████████| 190/190 [00:00<00:00, 442kB/s]
Downloading (…)7e55de9125/README.md: 100%|██████████| 10.6k/10.6k [00:00<00:00, 16.3MB/s]
Downloading (…)55de9125/config.json: 100%|██████████| 612/612 [00:00<00:00, 1.21MB/s]
Downloading (…)ce_transformers.json: 100%|██████████| 116/116 [00:00<00:00, 239kB/s]
Downloading (…)125/data_config.json: 100%|██████████| 39.3k/39.3k [00:00<00:00, 8.81MB/s]
Downloading pytorch_model.bin: 100%|██████████| 90.9M/90.9M [00:01<00:00, 60.9MB/s]
Downloading (…)nce_bert_config.json: 100%|██████████| 53.0/53.0 [00:00<00:00, 81.2kB/s]
Downloading (…)cial_tokens_map.json: 100%|██████████| 112/112 [00:00<00:00, 226kB/s]
Downloading (…)e9125/tokenizer.json: 100%|██████████| 466k/466k [00:00<00:00, 9.72MB/s]
Downloading (…)okenizer_config.json: 100%|██████████| 350/350 [00:00<00:00, 673kB/s]
Downloading (…)9125/train_script.py: 100%|██████████| 13.2k/13.2k [00:00<00:00, 17.8MB/s]
Downloading (…)7e55de9125/vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 3.01MB/s]
Downloading (…)5de9125/modules.json: 100%|██████████| 349/349 [00:00<00:00, 1.54MB/s]
agent-llm-backend-1   | Using embedded DuckDB with persistence: data will be stored in: agents/Agent-LLM/memories
agent-llm-backend-1 exited with code 132
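For what it's worth, a SIGILL right after the DuckDB persistence line is often a sign that a bundled native binary was compiled for CPU features the host does not have (commonly AVX/AVX2, especially in older CPUs or VMs that hide those flags from the guest). A small Linux-only sketch to inspect the flags the CPU advertises; the `cpu_flags` helper here is illustrative, not part of Agent-LLM:

```python
import os

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of CPU feature flags from a Linux cpuinfo file."""
    if not os.path.exists(path):
        return set()  # non-Linux host: nothing to report
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("avx:", "avx" in flags, "| avx2:", "avx2" in flags)
```

If neither flag is present on the machine running the backend container, that would be consistent with the crash happening exactly when the DuckDB-backed store is loaded.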

.env:

AGENT_NAME=Agent-LLM
WORKING_DIRECTORY=WORKSPACE
OBJECTIVE=Write an engaging tweet about AI.
INITIAL_TASK=Develop an initial task list.
AI_PROVIDER=oobabooga
AI_MODEL=vicuna
AI_TEMPERATURE=0.5
MAX_TOKENS=2040
AI_PROVIDER_URI=http://192.168.1.23:7860
COMMANDS_ENABLED=True
NO_MEMORY=False
USE_LONG_TERM_MEMORY_ONLY=False
USE_BRIAN_TTS=True
USE_MAC_OS_TTS=False

mackworthy · Apr 25 '23 09:04