
Backend database - celery_result_backend_db [Documentation]

Open · murrayidau opened this issue 6 years ago • 6 comments

I had trouble getting task results to write to the backend database (PostgreSQL). The problem was fixed using information from https://github.com/celery/django-celery-results/issues/19

Specifically, a full reference to the database was required: CELERY_RESULT_BACKEND_DB = 'postgresql+psycopg2://......'

...in addition to the settings described in the documentation: CELERY_RESULT_BACKEND = 'django-db'
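Putting the two together, the working combination looks roughly like this (a sketch only; the user, password and database name below are placeholders, not real values):

# settings.py
CELERY_RESULT_BACKEND = 'django-db'
# Full SQLAlchemy-style DSN to the database (placeholder credentials)
CELERY_RESULT_BACKEND_DB = 'postgresql+psycopg2://myuser:mypassword@localhost/mydbname'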

Documentation should be updated to detail this setting and when it is required.

murrayidau · Jul 14 '19 04:07

would you mind sending a PR?

auvipy · Aug 14 '19 12:08

Can you provide the complete string for CELERY_RESULT_BACKEND_DB? Do I have to include just the database name, or the database name AND table? Do I need to add anything to celery.py? Results are not being saved to the database and I have tried everything in issue 19.

Franchesca-O · Nov 13 '19 21:11

Yes, you need to provide a full reference to the database, including the name and credentials (in the example below they are retrieved from environment variables). Nothing additional is needed in celery.py.

CELERY_RESULT_BACKEND_DB = ''.join(['postgresql+psycopg2://', os.getenv("DATABASE_USER"), ":", os.getenv("DATABASE_PASSWORD"), "@localhost/", os.getenv("DATABASE_NAME")])
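The same DSN can be built more readably with an f-string (just a sketch, using the same os import and environment variables as above):

CELERY_RESULT_BACKEND_DB = (
    f'postgresql+psycopg2://{os.getenv("DATABASE_USER")}:{os.getenv("DATABASE_PASSWORD")}'
    f'@localhost/{os.getenv("DATABASE_NAME")}'
)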

For completeness, the other Celery settings:

# CELERY SETTINGS
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'django-cache'
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE

murrayidau · Nov 16 '19 23:11

It is really hard to get results written to the backend; django_celery_results doesn't log any error when the database isn't set up properly.

I am struggling to understand what is going wrong with these settings:

CELERY_RESULT_BACKEND = 'django-db'
CELERY_RESULT_BACKEND_DB = 'db+postgres://localhost:5432/mydbname'
CELERY_CACHE_BACKEND = 'django-cache'
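One way to check whether anything is reaching the database at all (a sketch; it only assumes django_celery_results is in INSTALLED_APPS and its migrations have been run) is to query the TaskResult model from a Django shell:

# python manage.py shell
from django_celery_results.models import TaskResult

# If this stays at 0 after tasks have finished, results are not being written
print(TaskResult.objects.count())
print(TaskResult.objects.values('task_id', 'status')[:5])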

superandrew · Jan 23 '20 23:01

@auvipy I'm struggling to figure out where things are breaking down. Is this a documentation issue? I'm strictly configuring

CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'default'

And nothing else. I'm relying on CELERY_CACHE_BACKEND to pick up my CACHES setting:

CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": env("DJANGO_REDIS_URL"),
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "PICKLE_VERSION": 5,  # version 5 added in python 3.8 https://docs.python.org/3/library/pickle.html#data-stream-format
            "COMPRESSOR": "django_redis.compressors.zstd.ZStdCompressor",
            "PARSER_CLASS": "redis.connection.HiredisParser",
            "SERIALIZER": "django_redis.serializers.msgpack.MSGPackSerializer",
            "SOCKET_CONNECT_TIMEOUT": 1,  # second
            "SOCKET_TIMEOUT": 1,  # second
            "CONNECTION_POOL_KWARGS": {
                "max_connections": 100,
                "retry_on_timeout": True
            }
        },
    }
}

DJANGO_REDIS_URL is set to redis://localhost:6379/0, and yet my worker still starts up looking for amqp://guest:**@localhost:5672//. I can't figure out why Celery isn't using the Redis database configured in CACHES ¯\_(ツ)_/¯
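My current understanding (happy to be corrected): CELERY_CACHE_BACKEND and CACHES only affect the result/cache side, while the broker is configured separately, so without an explicit CELERY_BROKER_URL the worker falls back to Celery's default of amqp://guest@localhost:5672//. A sketch of what I expect I'd need on top of the above (assuming the same Redis instance is acceptable as a broker; the database index is arbitrary):

CELERY_BROKER_URL = env("DJANGO_REDIS_URL")  # or a dedicated broker URL, e.g. redis://localhost:6379/1
CELERY_RESULT_BACKEND = 'django-db'
CELERY_CACHE_BACKEND = 'default'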

wgordon17 · Dec 04 '21 16:12

This information is still missing from the docs, which don't mention that CELERY_RESULT_BACKEND_DB needs to be set if results are to be saved to the database.

BenoitGeslain · Jun 15 '22 08:06