
[BUG]: When deploying AnythingLLM using docker-compose.yml and accessing it directly on port 3001 without an Nginx proxy, the error message 'Could not respond to message. An error occurred while streaming response. network error' appears

Open · showcup opened this issue 10 months ago

How are you running AnythingLLM?

Docker (remote machine)

What happened?

When deploying AnythingLLM using docker-compose.yml and accessing it directly on port 3001 without an Nginx proxy, the error message 'Could not respond to message. An error occurred while streaming response. network error' appears.

docker-compose.yml

version: '3.8'
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    container_name: anythingllm
    ports:
      - "3001:3001"
    cap_add:
      - SYS_ADMIN
    user: "${UID}:${GID}"
    environment:
      # Adjust for your environment
      - STORAGE_DIR=/app/server/storage
    env_file:
      - .env
    volumes:
      - ./data:/app/server/storage
      - ./.env:/app/server/.env
    restart: always
    networks:
      - anything-llm
    extra_hosts:
      - "host.docker.internal:host-gateway"
networks:
  anything-llm:
    driver: bridge

.env

LLM_PROVIDER='ollama'
EMBEDDING_MODEL_PREF='nomic-embed-text:latest'
OLLAMA_BASE_PATH='http://192.168.31.2:11434'
OLLAMA_MODEL_PREF='deepseek-r1:7b'
OLLAMA_MODEL_TOKEN_LIMIT='4096'
EMBEDDING_ENGINE='ollama'
EMBEDDING_BASE_PATH='http://192.168.31.2:11434'
EMBEDDING_MODEL_MAX_CHUNK_LENGTH='8192'
JWT_SECRET='make this a large list of random numbers and letters 20+'
STORAGE_DIR='/app/server/storage'
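For context, the Ollama endpoint configured above can be checked for reachability from the Docker host with something like:

# Run on the Docker host: lists the models served by the Ollama instance the .env points at
curl http://192.168.31.2:11434/api/tags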

Error screenshot

Image

Are there known steps to reproduce?

No response

showcup commented on Feb 21 '25 08:02

All of this looks right at first glance. Can you pull the container logs during the error? This should give an indication of what is going wrong and throwing that error.
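For reference, a minimal sketch of pulling those logs, assuming the container name from the compose file above:

# Follow the logs live while reproducing the error in the chat UI
docker logs -f anythingllm
# Or grab only recent output after the error has occurred
docker logs --since 10m anythingllm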

timothycarambat commented on Feb 22 '25 18:02

All of this looks right at first glance. Can you pull the container logs during the error? This should give an indication of what is going wrong and throwing that error.

Step 1: docker logs -f anythingllm, as shown below

Image

Step 2: When I type something into the chat box, the log stream stops and the Docker container logs a few errors

Image
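A rough way to see how the container exited (run right after the crash, since restart: always will bring it back up):

# Exit code 132 corresponds to SIGILL (illegal instruction), the typical symptom of a
# binary built for an instruction set the CPU lacks (e.g. AVX2)
docker inspect anythingllm --format '{{.State.ExitCode}} {{.State.Error}}'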

showcup commented on Feb 26 '25 00:02

Almost certainly the pinned issue - https://github.com/Mintplex-Labs/anything-llm/issues/1331

TL;DR: the CPU is too old and does not support the AVX2 instruction set. Can you confirm the type of CPU in the machine? This has to do with LanceDB being the vector DB.
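One quick way to confirm AVX2 support on the host (a general Linux check, not specific to AnythingLLM):

# Prints "avx2" if the CPU advertises the AVX2 instruction set, nothing otherwise
grep -o avx2 /proc/cpuinfo | head -1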

timothycarambat commented on Feb 26 '25 01:02

Almost certainly the pinned issue - #1331

TL;DR: the CPU is too old and does not support the AVX2 instruction set. Can you confirm the type of CPU in the machine? This has to do with LanceDB being the vector DB.

The CPU model is an Intel i5-10400T (QSRL).

showcup commented on Feb 26 '25 02:02

If you use another vector database solution, does this error go away? Otherwise, this is something else entirely causing the container to exit.
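For reference, the vector database is selected in the .env; a rough sketch assuming a Qdrant instance (variable names as in AnythingLLM's .env.example; verify against your version):

# Hypothetical Qdrant endpoint; adjust the host/port and API key to your setup
VECTOR_DB='qdrant'
QDRANT_ENDPOINT='http://192.168.31.2:6333'
QDRANT_API_KEY=''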

timothycarambat commented on Feb 26 '25 20:02

If you use another vector database solution, does this error go away? Otherwise, this is something else entirely causing the container to exit.

OK, I will try using another vector database, thank you very much.

showcup commented on Feb 28 '25 01:02

If you use another vector database solution, does this error go away? Otherwise, this is something else entirely causing the container to exit.

OK, I will try using another vector database, thank you very much.

@showcup any update on this? Did using another vector database work for you?

pvmilk commented on Sep 11 '25 20:09