django-celery-results
Expiration of results - tasks not cleaned
Hello,
- django = "^4.1.7"
- celery = "^5.2.7"
- redis = "^4.5.1"
- django-celery-results = "^2.4.0"
- django-celery-beat = "^2.4.0"
Sorry, there are already a few posts about this, but I can't make it work. I made a test project with the following settings (it's trivial, but I just want to try out Celery):
settings.py:
```python
CELERY_BROKER_URL = "redis://localhost:6379"
CELERY_RESULT_BACKEND = 'django-db'
CELERY_RESULT_EXTENDED = True
CELERY_RESULT_EXPIRES = 10
```
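For reference, the Celery app is wired to Django settings in the usual way. A minimal `celery.py` sketch (assuming the project module is named `django_celery`, as in the worker command below; without the `CELERY` namespace, the `CELERY_`-prefixed settings above would not be read at all):

```python
# django_celery/celery.py (sketch; module name assumed from the worker command)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_celery.settings')

app = Celery('django_celery')
# The namespace means every Celery setting in settings.py must be prefixed
# with CELERY_, e.g. CELERY_RESULT_EXPIRES maps to result_expires.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
```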
tasks.py:
```python
from celery import shared_task
from django.contrib.auth.models import User


@shared_task()
def list_users():
    return list(User.objects.values())
```
Then I trigger it from a form submission with:
```python
list_users.delay()
```
Three terminals running:
- ./manage.py runserver
- celery -A django_celery worker -l info
- celery -A django_celery beat -l INFO
First, I'm a bit confused about which setting name to use: some posts say `CELERY_RESULT_EXPIRES`, others say `CELERY_TASK_RESULT_EXPIRES`. Either way, neither works for me.
Task results are created fine, but they are never cleaned up (at least not within the 10 seconds I configured).
Anything I missed?
JFYI:
- By default, task results expire after one day. To change that, override `CELERY_TASK_RESULT_EXPIRES` (the value is in seconds).
- Expired results are not deleted the moment they expire. The cleanup task that actually removes expired rows from the task result table (`celery.backend_cleanup`) runs once a day, at 4am UTC by default, and only if beat is running.
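To make the timing concrete: with a 10-second expiry, a result only becomes *eligible* for deletion after 10 seconds; it then sits in the table until the next cleanup run. A rough illustration with hypothetical timestamps (plain Python, not Celery API):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical timestamps, purely illustrative.
result_expires = 10  # seconds, matching CELERY_RESULT_EXPIRES in the question
done_at = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
eligible_at = done_at + timedelta(seconds=result_expires)   # 12:00:10 UTC
next_cleanup = datetime(2024, 1, 2, 4, 0, 0, tzinfo=timezone.utc)  # 4am UTC
print(next_cleanup - eligible_at)  # how long the expired row lingers
# → 15:59:50
```

So with default scheduling, a result that expired just after noon still lingers for almost sixteen hours before the cleanup deletes it.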
You can add these settings:
```python
# your settings
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
CELERY_TASK_RESULT_EXPIRES = 10
```
And to verify that the cleanup job is actually running, you can set the schedule below (note that `crontab` comes from `celery.schedules`):
```python
from celery.schedules import crontab

celery_app.conf.beat_schedule = {
    'backend_cleanup': {
        'task': 'celery.backend_cleanup',
        'schedule': crontab(minute='15', hour='11'),  # this time is in UTC
        'options': {'expires': 10},
    },
}
```
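If you just want to see the cleanup fire quickly while testing, rather than waiting for a fixed UTC time, a sketch like this schedules it every minute (`crontab()` with no arguments means "every minute"; `celery_app` is the app instance as in the snippet above):

```python
from celery.schedules import crontab

# Illustrative only: run the cleanup every minute while testing,
# so expired task results are deleted within about a minute.
celery_app.conf.beat_schedule = {
    'backend_cleanup': {
        'task': 'celery.backend_cleanup',
        'schedule': crontab(),  # every minute
    },
}
```

Remember to restart beat after changing the schedule so it picks up the new entry.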