django-q2
Prevent concurrent executions of same task
Hi, I've set up a django-q2 cluster and I have a task that runs every 10 minutes. Usually it takes only 2-3 minutes to complete, but in some circumstances it can take more than 10 minutes. Is there any way to prevent the task from starting if the previous instance is still running? Here is my configuration:
Q_CLUSTER = {
    'name': 'Mondrian',
    'workers': 8,
    'recycle': 500,
    #'max_attempts': 10,
    'timeout': 600,  # in seconds (MUST be < retry)
    'retry': 3600,  # in seconds (3600 secs = 1 hour)
    'compress': True,
    'save_limit': 500,  # number of successful tasks saved to Django (failures are always saved)
    'queue_limit': 16,
    'cpu_affinity': 1,
    'label': 'Task Scheduler',
    'catch_up': False,
    'scheduler': True,
    'redis': 'redis://127.0.0.1:6379',
    #'time_zone': 'Europe/Rome',
}
Thanks!
Could you set the workers to 1? If the cluster has other jobs to process as well, then I think your best bet would be to set up a second cluster with a single worker that only processes this task. That way, each run would always finish before a new one starts: https://django-q2.readthedocs.io/en/master/cluster.html#multiple-queues
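With that second-cluster approach, the schedule can be pinned to the dedicated cluster via the `cluster` argument on `schedule()`. A rough sketch based on the linked docs; the cluster name `"single"` and the task path `"tasks.my_job"` are placeholders, and the matching one-worker cluster itself is configured and started as described in the multiple-queues documentation:

```python
# Illustrative sketch: pin the recurring job to a dedicated one-worker
# cluster so two runs can never overlap. "single" and "tasks.my_job"
# are placeholder names.
from django_q.tasks import schedule
from django_q.models import Schedule

schedule(
    "tasks.my_job",
    name="my-job-every-10-min",
    schedule_type=Schedule.MINUTES,
    minutes=10,
    cluster="single",  # only the one-worker "single" cluster picks this up
)
```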
I have a similar situation. Would it be possible to implement a lock mechanism for a certain task, released when the task finishes? In my case a task fires a long-running, multi-core Slurm job (2-3 hours, 24 CPUs). There is something similar described here: http://loose-bits.com/2010/10/distributed-task-locking-in-celery.html
One suggestion could be what @coleifer does with Huey: https://huey.readthedocs.io/en/latest/guide.html#locking-tasks