celery-haystack
Celery 4 support
The task logic has mostly moved into a special handler class, and the task itself became a minimal @shared_task instance.
Added proper exception handling.
Hello. Any idea when this might be merged in?
Still no merge?
I'm testing this branch right now and it appears to work with Celery 4.x. There is one thing to be aware of, though: if you are upgrading from a previous version of celery-haystack and you were setting a custom CELERY_HAYSTACK_DEFAULT_TASK, that task must now be a function instead of a class, as mentioned above by @ddemid.
You can still use your custom default task class, but you must now configure it as the handler, using the CELERY_HAYSTACK_HANDLER setting. In other words, in your settings just change DEFAULT_TASK to HANDLER and you should be mostly fine.
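To make the migration above concrete, here is a minimal sketch of the settings change. The dotted paths (myapp.tasks.MyIndexTask, myapp.handlers.MyIndexHandler) are made-up examples; only the setting names come from this thread.

```python
# settings.py -- sketch of the settings rename described above.
# The dotted paths are hypothetical; substitute your own module paths.

# Before (older celery-haystack): a custom task class
# CELERY_HAYSTACK_DEFAULT_TASK = 'myapp.tasks.MyIndexTask'

# After (this branch): the same kind of class is configured as the handler
CELERY_HAYSTACK_HANDLER = 'myapp.handlers.MyIndexHandler'
```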
I would really like this pull request to be officially merged. I'm relying on other packages which require Celery 4.x, so using these in combination is very difficult. The change of settings variable names does make this a backwards-incompatible change, though. If that hadn't changed, it would seem like an obvious merge.
Any progress on this?
Looks like this repo is dead :/
Any word on this?
I use this library, though I haven't checked all the features. It's a fork of a fork of this library: https://github.com/dwintergruen/celery-haystack/tree/celery4
@johnyoonh ooh, thanks for suggesting that fork!
I must admit that I'm personally fairly new to celery. That said, my team has been using celery successfully in our Django project, so I know at a basic level we have it set up correctly.
Unfortunately, I've tried the main fork and now the fork you've suggested and am still having the same issue in each of them.
Specifically, celery-haystack tasks are being sent to the worker but then they just sit there.
For example, if I create a new item for a haystack indexed model, the worker logs show:
studio-indexing-worker_1 | [2019-04-16 02:54:54,574: INFO/MainProcess] Received task: celery_haystack.tasks.haystack_signal_handler[870898d5-85b1-4bbf-ae84-206f4371250d] ETA:[2019-04-16 02:54:59.572244-07:53]
studio-indexing-worker_1 | [2019-04-16 02:54:54,574: DEBUG/MainProcess] basic.qos: prefetch_count->98
But then nothing happens!
When I do celery -A contentcuration inspect scheduled, I get loads and loads of accumulated tasks that look like this:
{u'eta': u'2019-04-16T02:54:59.574667-07:53',
u'priority': 6,
u'request': {u'acknowledged': False,
u'args': u"('update', 'contentcuration.contentnode.cee283f55ffc4322b4d21ad81b4567f3')",
u'delivery_info': {u'exchange': u'',
u'priority': 0,
u'redelivered': None,
u'routing_key': u'indexing'},
u'hostname': u'indexing-worker@1eb5d32b3752',
u'id': u'953cf225-9143-44b9-bf03-4cd61f8f7e0b',
u'kwargs': u'{}',
u'name': u'celery_haystack.tasks.haystack_signal_handler',
u'time_start': None,
u'type': u'celery_haystack.tasks.haystack_signal_handler',
u'worker_pid': None}
}
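As a sanity check (a sketch added here, not part of the original report), one can parse the eta from the inspect output and compare it to the current time. Note the unusual -07:53 UTC offset in these ETAs:

```python
# Sketch: compare a stuck task's eta (copied from the inspect output
# above) with the current time. The odd '-07:53' UTC offset is worth
# noticing -- a normal US Pacific offset would be -07:00.
from datetime import datetime, timezone

eta = datetime.fromisoformat("2019-04-16T02:54:59.574667-07:53")
now = datetime.now(timezone.utc)
print("seconds until ETA:", (eta - now).total_seconds())
```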
The worker is started like this: cd contentcuration/ && celery -A contentcuration worker -Q indexing -l debug -n indexing-worker@%h
I've tried using the default celery invocation, as well as adding the extra options --without-gossip --without-mingle --without-heartbeat -Ofair.
Any thoughts? Am I missing something really basic about celery? I'm struggling to get a grip on how I can even further diagnose this.
@micahscopes Please don't hijack the issue; your question is more appropriate for Stack Overflow. I'd check whether your broker is set up properly. I can't tell whether you are using RabbitMQ or Redis, so I can't comment. Also, remove the queue option and start with the default queue.
@johnyoonh apologies, I didn't intend to hijack the issue... it's unclear to me if the issue I'm facing is potentially related to something that changed with celery 4. As you suggested, I can try stackoverflow.
edit: I realize that this is a PR. My bad! I'll open a separate issue.
Hey again. I wanted to share back that my issue with Celery 4 support had to do with a bug in Celery 4.1.x where scheduled task times weren't correctly calculated for a given countdown and timezone. The bug was fixed in Celery 4.2.
If this PR ever gets merged, perhaps it'd be a good idea to either not support Celery 4.1.x or only support Celery >= 4.2.x.
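A sketch of how that version constraint could be enforced. The helper below is illustrative only; a setup.py would typically just declare install_requires=['celery>=4.2']:

```python
# Sketch: encode "only support Celery >= 4.2" as suggested above.

def version_tuple(version):
    """Parse leading numeric components, e.g. '4.1.1' -> (4, 1, 1)."""
    parts = []
    for piece in version.split("."):
        if not piece.isdigit():
            break
        parts.append(int(piece))
    return tuple(parts)

def celery_version_supported(celery_version):
    """True if this version avoids the 4.1.x countdown/timezone ETA bug."""
    return version_tuple(celery_version) >= (4, 2)

print(celery_version_supported("4.1.1"))  # False: affected by the bug
print(celery_version_supported("4.2.0"))  # True: bug fixed in 4.2
```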
@acdha can you take a look at this PR so we can get this moving forward?
I don't use this project myself so I'd like to get some feedback from anyone actually running it. Has anyone used it much?