cognita
dependency failed to start: container cognita-postgres is unhealthy
Dear Team,
I have been trying to deploy the app locally on my Windows system with Docker Desktop, and the terminal output is shown below. Could you please suggest how I can resolve this issue?
I'm trying to run the build service with: docker-compose --env-file compose.env up
cognita-postgres | fixing permissions on existing directory /var/lib/postgresql/data ... ok
cognita-postgres | creating subdirectories ... ok
cognita-postgres | selecting dynamic shared memory implementation ... posix
cognita-postgres | selecting default max_connections ... 20
cognita-postgres | selecting default shared_buffers ... 400kB
cognita-postgres | selecting default time zone ... Etc/UTC
cognita-postgres | creating configuration files ... ok
cognita-postgres | running bootstrap script ...
cognita-postgres | 2025-01-16 10:59:37.027 UTC [83] FATAL: data directory "/var/lib/postgresql/data" has invalid permissions
cognita-postgres | 2025-01-16 10:59:37.027 UTC [83] DETAIL: Permissions should be u=rwx (0700) or u=rwx,g=rx (0750).
cognita-postgres | child process exited with exit code 1
cognita-postgres | initdb: removing contents of data directory "/var/lib/postgresql/data"
cognita-postgres exited with code 1
Gracefully stopping... (press Ctrl+C again to force)
dependency failed to start: container cognita-postgres is unhealthy
PS C:\Users\AmitTiwari\PycharmProjects\cognita>
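The FATAL line above usually means the Postgres data directory is a Windows bind mount: NTFS permissions show up inside the Linux container as group/world-readable, which initdb rejects. One common workaround is to keep the data in a Docker named volume instead of a host path. A minimal sketch, assuming the service is called cognita-postgres and using a hypothetical volume name pgdata (adjust to the project's actual docker-compose.yaml):

```yaml
# docker-compose.yaml (fragment) -- service/volume names are illustrative
services:
  cognita-postgres:
    image: postgres
    volumes:
      # A named volume lives inside the Docker Desktop VM, so the
      # 0700 permissions Postgres requires are preserved, unlike a
      # Windows bind mount such as ./pgdata:/var/lib/postgresql/data
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```

If the compose file already uses a bind mount, switching it to a named volume (and removing the old volume contents) typically clears this exact error on Windows.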
Here is the configuration in models_config.yaml that I have been trying to deploy with:
model_providers:
  ############################ Local ############################################
  # Uncomment this provider if you want to use local models providers          #
  # using ollama and infinity model server                                     #
  ###############################################################################
  # - provider_name: local-ollama
  #   api_format: openai
  #   base_url: http://ollama-server:11434/v1/
  #   api_key_env_var: ""
  #   llm_model_ids:
  #     - "qwen2:1.5b"
  #   embedding_model_ids: []
  #   reranking_model_ids: []
  #   default_headers: {}
  # - provider_name: local-infinity
  #   api_format: openai
  #   base_url: http://infinity-server:7997/
  #   api_key_env_var: INFINITY_API_KEY
  #   llm_model_ids: []
  #   embedding_model_ids:
  #     - "mixedbread-ai/mxbai-embed-large-v1"
  #   reranking_model_ids:
  #     - "mixedbread-ai/mxbai-rerank-xsmall-v1"
  #   default_headers: {}
  # - provider_name: faster-whisper
  #   api_format: openai
  #   base_url: http://faster-whisper:8000
  #   api_key_env_var: ""
  #   llm_model_ids: []
  #   embedding_model_ids: []
  #   reranking_model_ids: []
  #   audio_model_ids:
  #     - "Systran/faster-distil-whisper-large-v3"
  #   default_headers: {}
  ############################ OpenAI ###########################################
  # Uncomment this provider if you want to use OpenAI as a models provider     #
  # Remember to set `OPENAI_API_KEY` in container environment                  #
  ###############################################################################
  - provider_name: openai
    api_format: openai
    api_key_env_var: OPENAI_API_KEY
    llm_model_ids:
      - "gpt-3.5-turbo"
      - "gpt-4o"
    embedding_model_ids:
      - "text-embedding-3-small"
      - "text-embedding-ada-002"
    reranking_model_ids: []
    default_headers: {}
  ############################ TrueFoundry ######################################
  # Uncomment this provider if you want to use TrueFoundry as a models provider#
  # Remember to set `TFY_API_KEY` in container environment                     #
  ###############################################################################
  # - provider_name: truefoundry
  #   api_format: openai
  #   base_url: https://llm-gateway.truefoundry.com/api/inference/openai
  #   api_key_env_var: TFY_API_KEY
  #   llm_model_ids:
  #     - "openai-main/gpt-4o-mini"
  #     - "openai-main/gpt-4-turbo"
  #     - "openai-main/gpt-3-5-turbo"
  #   embedding_model_ids:
  #     - "openai-main/text-embedding-3-small"
  #     - "openai-main/text-embedding-ada-002"
  #   reranking_model_ids: []
  #   default_headers: {}
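Since the only enabled provider reads its key from OPENAI_API_KEY, one quick sanity check before digging into Docker is to confirm that variable actually reaches the container environment. A minimal stdlib-only sketch (the require_env helper is mine for illustration, not part of cognita):

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise a clear error."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(
            f"{name} is not set; export it (or add it to compose.env) "
            "before starting the backend."
        )
    return value


if __name__ == "__main__":
    # For the provider enabled above, only OPENAI_API_KEY is required.
    key = require_env("OPENAI_API_KEY")
    print(f"OPENAI_API_KEY is set ({len(key)} chars)")  # never log full keys
```

Running this inside the backend container (e.g. via docker exec) distinguishes a missing env var from an actual startup bug.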
Hello @Amitt1412 We use a standard Postgres image. A few options that you can try:
- Run postgres in isolation using:

  docker run --name my-postgres \
    -e POSTGRES_USER=myuser \
    -e POSTGRES_PASSWORD=mypassword \
    -e POSTGRES_DB=mydatabase \
    -p 5432:5432 \
    -d postgres

- Try removing volumes in docker-compose.yaml and then spin up the docker env.
Please let me know how it goes and I will try to help accordingly.
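The volume-reset suggestion above can be sketched as the following commands, run from the project root (an assumption on my part; note that down -v deletes the stack's volumes, so any existing Postgres data is wiped):

```shell
# stop the stack and remove its volumes (discards the broken data directory)
docker-compose --env-file compose.env down -v

# recreate everything from a clean state
docker-compose --env-file compose.env up --build
```

This forces Postgres to re-run initdb on a fresh volume instead of re-checking the directory that failed the permissions test.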
Hello @Amitt1412, please let me know if you are still facing issues. Also, please join our Slack channel and reach out to us if you want to get on a call and debug together.
@mnvsk97 Hello, I'm using a Mac M3 Pro and also have some problems: the backend does not start up. The log looks like this:
How do I resolve it? I've already wasted a lot of time on this.