django-celery

Celery workers shut down and do not restart after reaching CELERYD_MAX_TASKS_PER_CHILD

Open johnhess opened this issue 8 years ago • 9 comments

We've been having an issue where celery workers shut down and do not restart after reaching CELERYD_MAX_TASKS_PER_CHILD.

In attempting to diagnose the issue, I've explored a few configurations (a sketch of the failing setup's settings follows the list):

  • Ubuntu, django db backend+postgres: fails
  • Ubuntu, rabbitmq: works fine
  • OS X, django db backend+sqlite: works fine
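
For reference, a minimal sketch of what the failing setup's broker configuration might look like, assuming the django transport from kombu; the exact project settings were not posted:

# settings.py (sketch; the exact settings were not posted in the issue)
import djcelery
djcelery.setup_loader()

INSTALLED_APPS += ('djcelery', 'kombu.transport.django')

# The Django database (PostgreSQL in the failing case) acts as the message transport.
BROKER_URL = 'django://'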

When the workers die, the final log messages (running at the DEBUG log level) are:

[2016-06-15 12:00:26,105: INFO/MainProcess] Task study.models.parse_study[3c1850ab-090f-4332-adb9-4c8e9ab56293] succeeded in 1.2661089329994866s: None
[2016-06-15 12:00:29,938: DEBUG/MainProcess] Canceling task consumer...
[2016-06-15 12:00:30,857: DEBUG/MainProcess] Canceling task consumer...
[2016-06-15 12:00:30,858: DEBUG/MainProcess] Closing consumer channel...

Failing configuration:

 -------------- celery@app1 v3.1.23 (Cipater)
---- **** ----- 
--- * ***  * -- Linux-3.2.0-59-generic-x86_64-with-Ubuntu-12.04-precise
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         project:0x7f285fc06f28
- ** ---------- .> transport:   django://localhost//  # NOTE: Django 1.8, PostgreSQL DB
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

johnhess avatar Jun 15 '16 17:06 johnhess

Are you using Django as the transport? What about increasing the max number?

auvipy avatar Jun 15 '16 19:06 auvipy

I am using django as the broker/message transport when it fails, yes. Increasing the max number increases the number of tasks that succeed correspondingly.

johnhess avatar Jun 15 '16 19:06 johnhess
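
For context, the setting under discussion is the Celery 3.x-style name; a minimal sketch, with an illustrative value only:

# settings.py (sketch; the value shown is illustrative)
# Each worker child process is replaced after handling this many tasks.
# Raising it only postpones the stall described above; it does not prevent it.
CELERYD_MAX_TASKS_PER_CHILD = 100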

Using Django as a broker is not a good idea.

auvipy avatar Jun 15 '16 20:06 auvipy
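
A hedged sketch of moving off the django transport onto a dedicated broker; the URLs below are placeholders, not values from this thread:

# settings.py (sketch; broker URLs are illustrative placeholders)
BROKER_URL = 'amqp://guest:guest@localhost:5672//'   # RabbitMQ
# BROKER_URL = 'redis://localhost:6379/0'            # or Redis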

I'm having this same issue with v3.1.23 and a redis broker. My configuration is as follows:

v3.1.23 (Cipater)
Linux-2.6.32-504.8.1.el6.x86_64-x86_64-with-redhat-6.6-Santiago
app:         default:0x2f2dd50 (djcelery.loaders.DjangoLoader)
concurrency: 9 (prefork)

ruebrenda avatar Sep 21 '16 01:09 ruebrenda

The same issue with Celery 3.1.25 + prefork + Redis broker + djcelery + Django DB result backend, on Ubuntu 16.04.

All child processes became zombies (defunct) after finishing their tasks.

lingxiaoyang avatar Nov 25 '16 19:11 lingxiaoyang

The same issue with Celery 4.1.0; the broker is Redis.

BaiJiangJie avatar Dec 26 '19 11:12 BaiJiangJie

This package is not recommended with Celery 4.x or later.

auvipy avatar Dec 26 '19 13:12 auvipy

@auvipy what version do you recommend?

stillwaterman avatar Mar 25 '20 07:03 stillwaterman

check the latest docs!

auvipy avatar Mar 25 '20 13:03 auvipy
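
For readers arriving now: recent Celery releases integrate with Django directly, without django-celery. A minimal sketch along the lines of the current "First steps with Django" guide ('proj' is a placeholder project name):

# proj/celery.py (sketch; 'proj' is a placeholder)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# Read CELERY_-prefixed settings from Django settings,
# e.g. CELERY_WORKER_MAX_TASKS_PER_CHILD.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()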