django-celery
Celery with RabbitMQ and Django: queue not being processed unless we double the workers
Celery 3.0.23, django-celery 3.0.11, Django 1.6.5, Python 2.7.6, RabbitMQ 3.6.6
I have an issue using Celery 3.0.23 with Django and RabbitMQ. I have 2 servers feeding the RabbitMQ queue and 1 Celery server with 4 workers. This setup previously worked fine with Redis. Now that we've switched to RabbitMQ, with 1 feeding server the workers sat idle, CPU was low, and the queue wasn't being worked. Doubling the workers to 8 seemed to solve it (funny thing: increasing from 4 to 6 didn't change the problem at all, and CPU stayed low). Today I connected the second server and hit the same problem with 8 workers. I kept increasing to 10, 12... and only at 16 did the queue start being processed. Is this a known issue in Celery 3.0.23? Do we need to configure things differently?
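For context, here is roughly how the worker count is being set; a sketch assuming the standard django-celery 3.x setup, with an illustrative broker URL rather than our actual one:

```python
# settings.py -- Celery 3.x / django-celery worker configuration (sketch)
import djcelery
djcelery.setup_loader()

# Illustrative broker URL, not the real host/credentials.
BROKER_URL = "amqp://guest:guest@localhost:5672//"

# Number of prefork worker processes on this Celery node.
# 16 is the value that finally got the queue moving, per the report above.
CELERYD_CONCURRENCY = 16
```

The same thing can be done from the command line with `python manage.py celery worker --concurrency=16 --loglevel=info`.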
Sorry for the delay. It turns out that, for some reason, the number of Celery workers needs to match the total number of workers across all apps connected to RabbitMQ. Any ideas?
Could you clarify your second comment? What do you mean?
We too have encountered issues where Celery workers seem to just stall out: no jobs being processed even though the queue grows very large. We eventually theorized that ETA jobs were blocking the workers, and found that when we set the prefetch much larger (from 4 to 128) the workers were able to resume working. Sorry I can't give a better report, but it was very black-box to us; trial and error led to the queues just working again, so we stopped investigating.
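For anyone hitting the same symptom: in Celery 3.x with django-celery, the prefetch change described above maps to a single Django setting. A minimal sketch (128 is the trial-and-error value from this comment, not a tuned recommendation):

```python
# settings.py -- prefetch tuning for Celery 3.x / django-celery (sketch)
#
# The worker prefetches up to (CELERYD_PREFETCH_MULTIPLIER * concurrency)
# messages from the broker at a time. With many ETA/countdown tasks, a low
# multiplier can leave that window full of future-scheduled messages while
# ready work piles up in RabbitMQ -- consistent with the stall described above.
CELERYD_PREFETCH_MULTIPLIER = 128  # raised from the Celery default of 4
```

Note the tradeoff: a large multiplier lets a single worker hoard many messages, which hurts fairness across workers but can keep ETA-heavy queues from stalling.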