
Issue with Thread Persistence and Adding Threads in Wren AI on Ubuntu 20.04 with Docker Compose

Open UMANET123 opened this issue 9 months ago • 6 comments

  1. Linux Distribution and Version: We are using Ubuntu 20.04 for our setup.

  2. Wren AI Version and Deployment Method: We have installed and configured the Wren AI application on an Ubuntu server using Docker Compose. The application is up and running successfully. Below are the versions in use:

     WREN_PRODUCT_VERSION: 0.15.3
     WREN_ENGINE_VERSION: 0.13.1
     WREN_AI_SERVICE_VERSION: 0.15.7
     IBIS_SERVER_VERSION: 0.13.1
     WREN_UI_VERSION: 0.20.1
     WREN_BOOTSTRAP_VERSION: 0.1.5

  3. Issue Description: After setting up Wren AI on an Ubuntu VM and running it with Docker containers, I encountered two issues while testing the functionality through the Wren AI UI:

     Issue 1: I’m unable to add new threads; I receive a "Failed to add thread" error.
     Issue 2: After restarting the containers, all previously stored threads disappear from the UI.

  4. Request: Is there a way to preserve these threads and keep them active even after restarts or redeployments? I would like to ensure that threads persist across container restarts.


UMANET123 avatar Mar 21 '25 09:03 UMANET123

Could you share your logs with us with the following command:

docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log

wwwy3y3 avatar Mar 21 '25 13:03 wwwy3y3

Application data, including threads, should be persisted in SQLite by default.

As you can see from our docker-compose, we have created a volume here: https://github.com/Canner/WrenAI/blob/c0bfe2b72296055d0806814f4e7d8cbefa064e71/docker/docker-compose.yaml#L3-L4

and mounted to wren-ui-service here: https://github.com/Canner/WrenAI/blob/c0bfe2b72296055d0806814f4e7d8cbefa064e71/docker/docker-compose.yaml#L113-L114
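Paraphrasing what those two linked snippets declare (a sketch; see the linked docker-compose.yaml for the authoritative lines):

```yaml
volumes:
  data:                  # named Docker volume; survives container restarts

services:
  wren-ui:
    volumes:
      - data:/app/data   # the SQLite file (db.sqlite3) lives under /app/data
```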

This is the volume Wren AI uses, as shown in Docker Desktop.

Could you confirm that the Docker volume is indeed created and mounted to the wren-ui service? Perhaps you could share your docker-compose file with us as well. I suspect something else is causing the volume to be deleted every time you restart the services.
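For reference, one way to confirm the volume exists and where it lives on the host (the volume name is an assumption: Compose usually prefixes the top-level volume `data` with the project name, e.g. `wrenai_data` — check the output of `docker volume ls`):

```shell
# List all Docker volumes, then inspect the project's data volume.
# "wrenai_data" is an assumed name; substitute what `docker volume ls` shows.
docker volume ls
docker volume inspect wrenai_data   # the "Mountpoint" field is the host path
```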

wwwy3y3 avatar Mar 21 '25 13:03 wwwy3y3

Thank you for the update. We modified the wren-ui volumes in the Docker Compose configuration to data:/app/data and restarted the container. During the restart we noticed that one migration file was missing, so we added it (20250102074256_create_sql_pair_table.js) to the wren-ui code, based on one of our testing environments. We made this change after identifying where the threads are stored.

We’ve found that threads are stored in the SQLite3 database in the "thread" table, thread responses in the "thread_response" table, and default questions in the "project" table. We are now working out how to back up and restore this data across Docker restarts.
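A quick way to peek at those tables from the host might look like this (a sketch only: it assumes the sqlite3 CLI is available inside the wren-ui container, and uses the container name wrenai-wren-ui-1 from the log commands above):

```shell
# List the tables in the wren-ui database and count stored threads.
# Path matches SQLITE_FILE=/app/data/db.sqlite3 from the compose file.
docker exec -it wrenai-wren-ui-1 sh -c '
  sqlite3 /app/data/db.sqlite3 ".tables"
  sqlite3 /app/data/db.sqlite3 "SELECT COUNT(*) FROM thread;"
'
```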

Could you kindly share how we can take backups and restore the threads and thread_responses?

Note: Is there a limit on the number of threads we can store? We’ve created more than 24 threads, and after that we are unable to create new ones. We are using an OpenAI key. Could you please suggest how we can resolve this issue?

UMANET123 avatar Mar 24 '25 08:03 UMANET123

@UMANET123

Is there a limit on the number of threads we can store

No, there's no limit. If you encounter any issues creating threads, please share the logs with us. I would suggest using our stable release and not adding migration files yourself.

At this point, I'll need logs to investigate this issue further.

About backup

1. Stick to SQLite

If you'd like to stick with SQLite, you can simply back up the SQLite data file.
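For example, a minimal backup/restore of the SQLite file in the shared data volume might look like this (a sketch: the volume name `wrenai_data` is an assumption, and stopping wren-ui first avoids copying a half-written database):

```shell
# Back up: stop the UI, copy the DB file out of the volume, restart.
docker compose stop wren-ui
docker run --rm -v wrenai_data:/data -v "$PWD":/backup alpine \
  cp /data/db.sqlite3 /backup/db.sqlite3.bak
docker compose start wren-ui

# Restore: copy the backup back into the volume, then restart.
docker compose stop wren-ui
docker run --rm -v wrenai_data:/data -v "$PWD":/backup alpine \
  cp /backup/db.sqlite3.bak /data/db.sqlite3
docker compose start wren-ui
```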

2. Switch to a managed database service

You could also switch the database to PostgreSQL, which can be a managed service on AWS/GCP. A managed cloud database makes backup and recovery a lot easier. Check out our doc: https://github.com/Canner/WrenAI/blob/main/wren-ui/README.md to learn how to switch databases by changing env variables.
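A sketch of what the wren-ui environment might look like after switching (DB_TYPE appears in the compose file below; the PG_URL variable name and its value are assumptions — verify both against the wren-ui README linked above):

```yaml
services:
  wren-ui:
    environment:
      DB_TYPE: pg
      # Assumed variable name and placeholder credentials -- check the README.
      PG_URL: postgres://wren:secret@db.example.com:5432/wrenai
```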

wwwy3y3 avatar Mar 25 '25 07:03 wwwy3y3

@wwwy3y3, for now we have only created 15 more threads. We are using the SQLite3 database and it is working fine now. If we hit any errors, we will share the logs.

UMANET123 avatar Mar 25 '25 09:03 UMANET123

@wwwy3y3, this is our currently configured docker-compose.yml file. We are building wren-ui locally as part of the Docker deployment. Kindly review it and suggest any improvements to the docker-compose.yml file.

```yaml
version: "3"

volumes:
  data:

networks:
  wren:
    driver: bridge

services:
  bootstrap:
    image: ghcr.io/canner/wren-bootstrap:${WREN_BOOTSTRAP_VERSION}
    restart: on-failure
    platform: ${PLATFORM}
    environment:
      DATA_PATH: /app/data
    volumes:
      - data:/app/data
    command: /bin/sh /app/init.sh

  wren-engine:
    image: ghcr.io/canner/wren-engine:${WREN_ENGINE_VERSION}
    restart: on-failure
    platform: ${PLATFORM}
    expose:
      - ${WREN_ENGINE_PORT}
      - ${WREN_ENGINE_SQL_PORT}
    volumes:
      - data:/usr/src/app/etc
      - ${PROJECT_DIR}/data:/usr/src/app/data
    networks:
      - wren
    depends_on:
      - bootstrap

  ibis-server:
    image: ghcr.io/canner/wren-engine-ibis:${IBIS_SERVER_VERSION}
    restart: on-failure
    platform: ${PLATFORM}
    expose:
      - ${IBIS_SERVER_PORT}
    environment:
      WREN_ENGINE_ENDPOINT: http://wren-engine:${WREN_ENGINE_PORT}
    networks:
      - wren

  wren-ai-service:
    image: ghcr.io/canner/wren-ai-service:${WREN_AI_SERVICE_VERSION}
    restart: on-failure
    platform: ${PLATFORM}
    expose:
      - ${WREN_AI_SERVICE_PORT}
    ports:
      - ${AI_SERVICE_FORWARD_PORT}:${WREN_AI_SERVICE_PORT}
    environment:
      # sometimes the console won't show print messages;
      # PYTHONUNBUFFERED: 1 fixes this
      PYTHONUNBUFFERED: 1
      CONFIG_PATH: /app/data/config.yaml
      QDRANT_URL: "http://qdrant:6333"
    env_file:
      - ${PROJECT_DIR}/.env
    volumes:
      - ${PROJECT_DIR}/config.yaml:/app/data/config.yaml
    networks:
      - wren
    depends_on:
      - qdrant

  qdrant:
    image: qdrant/qdrant:v1.11.0
    restart: on-failure
    ports:
      - "6333:6333"
      - "6334:6334"
    volumes:
      - data:/qdrant/storage
    networks:
      - wren

  wren-ui:
    build:
      context: /root/karya/kdapt/wren-ui/  # build from the local folder
      dockerfile: Dockerfile               # existing Dockerfile in wren-ui
    # image: ghcr.io/canner/wren-ui:${WREN_UI_VERSION}
    restart: on-failure
    platform: ${PLATFORM}
    environment:
      DB_TYPE: sqlite
      # /app is the working directory in the container
      SQLITE_FILE: /app/data/db.sqlite3
      WREN_ENGINE_ENDPOINT: http://wren-engine:${WREN_ENGINE_PORT}
      WREN_AI_ENDPOINT: http://wren-ai-service:${WREN_AI_SERVICE_PORT}
      IBIS_SERVER_ENDPOINT: http://ibis-server:${IBIS_SERVER_PORT}
      # this is for telemetry to know the model; ai-service might be able
      # to provide an endpoint to get this information
      GENERATION_MODEL: ${GENERATION_MODEL}
      # telemetry
      WREN_ENGINE_PORT: ${WREN_ENGINE_PORT}
      WREN_AI_SERVICE_VERSION: ${WREN_AI_SERVICE_VERSION}
      WREN_UI_VERSION: ${WREN_UI_VERSION}
      WREN_ENGINE_VERSION: ${WREN_ENGINE_VERSION}
      USER_UUID: ${USER_UUID}
      POSTHOG_API_KEY: ${POSTHOG_API_KEY}
      POSTHOG_HOST: ${POSTHOG_HOST}
      TELEMETRY_ENABLED: ${TELEMETRY_ENABLED}
      # client side
      NEXT_PUBLIC_USER_UUID: ${USER_UUID}
      NEXT_PUBLIC_POSTHOG_API_KEY: ${POSTHOG_API_KEY}
      NEXT_PUBLIC_POSTHOG_HOST: ${POSTHOG_HOST}
      NEXT_PUBLIC_TELEMETRY_ENABLED: ${TELEMETRY_ENABLED}
      EXPERIMENTAL_ENGINE_RUST_VERSION: ${EXPERIMENTAL_ENGINE_RUST_VERSION}
      # configs
      WREN_PRODUCT_VERSION: ${WREN_PRODUCT_VERSION}
    ports:
      # HOST_PORT is the port you want to expose to the host machine
      - ${HOST_PORT}:3000
    volumes:
      - data:/app/data
    networks:
      - wren
    depends_on:
      - wren-ai-service
      - wren-engine
```
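One quick way to sanity-check a compose file like this before restarting anything (assumes Docker Compose v2; run from the directory containing docker-compose.yml):

```shell
# Validate the file and print the fully resolved configuration;
# wren-ui should list "data:/app/data" under volumes.
docker compose config
docker compose config --volumes   # lists the named volumes Compose manages
```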

UMANET123 avatar Mar 26 '25 02:03 UMANET123