Jobs without concurrency limits generate current_concurrency hash keys in redis
These jobs should not generate keys at all, since there is no limit to enforce. In addition, the count associated with each key is incremented forever, so an overflow could eventually occur.
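As a rough illustration of why the unbounded increment matters, here is a sketch using redis-py and the key/field names that appear later in this report (not necessarily the exact commands spinach issues internally):

import redis

r = redis.Redis()

# Each increment grows the hash field without bound; Redis hash integers are
# 64-bit signed, so a counter that is only ever incremented can in principle
# overflow eventually.
for _ in range(3):
    r.hincrby('spinach/_current_concurrency', 'nap', 1)

print(r.hget('spinach/_current_concurrency', 'nap'))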
Maybe try this.
Modify the RedisBroker class in spinach to handle jobs without concurrency limits. Specifically, you can add a check in the _acquire_lock method to see if the job has a concurrency limit, and if not, skip generating the current_concurrency hash key and incrementing its count.
from spinach.brokers.redis import RedisBroker


class CustomRedisBroker(RedisBroker):
    def _acquire_lock(self, job):
        # Only run the normal path (which creates/updates the
        # current_concurrency hash entry) when the job actually has a limit.
        if job.concurrency_limit:
            super()._acquire_lock(job)
        else:
            # Keep the counter at 0 rather than incrementing it indefinitely.
            self._increment_concurrency_count(job, 0)
In this implementation, we create a new class CustomRedisBroker that inherits from RedisBroker. We override the _acquire_lock method to add the check for concurrency limits and skip generating the current_concurrency hash key if the job doesn't have a limit. Instead, we call the _increment_concurrency_count method with a count of 0 to avoid incrementing the count indefinitely.
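If you went that route, the only change on the application side would be constructing the Engine with the subclass instead of the stock broker. A minimal sketch, assuming the CustomRedisBroker defined above:

import time
import spinach

# Hand the Engine the subclass instead of spinach.RedisBroker();
# task registration, batching and workers are unchanged.
sp = spinach.Engine(CustomRedisBroker())

@sp.task(name='nap', max_retries=1)
def nap(job_id):
    time.sleep(5)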
This only appears to affect idempotent jobs. Non-idempotent jobs do not appear to set a value.
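For comparison, a hedged sketch of a non-retryable (hence non-idempotent) variant of the task; nap_once is a made-up name, and it assumes that omitting max_retries leaves the task non-retryable:

import time
import spinach

sp = spinach.Engine(spinach.RedisBroker())

# No max_retries, so the task is not retryable; per the observation above,
# such jobs do not appear to set a current_concurrency value.
@sp.task(name='nap_once')
def nap_once(job_id):
    time.sleep(5)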
Steps to reproduce
Start a Redis, and run this:
import time

import spinach

sp = spinach.Engine(spinach.RedisBroker())


@sp.task(name='nap', max_retries=1)
def nap(job_id):
    print(f"{job_id:3} Zzzz...")
    time.sleep(5)


batch = spinach.Batch()
for i in range(32):
    batch.schedule('nap', i)
sp.schedule_batch(batch)

sp.start_workers(32, stop_when_queue_empty=True)
On the Redis instance, run KEYS * and/or HGETALL spinach/_current_concurrency
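The same check can also be done from Python with redis-py, for example (a quick sketch; the key name is the one shown above):

import redis

r = redis.Redis()
print(r.keys('spinach/*'))
print(r.hgetall('spinach/_current_concurrency'))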
Expected result
No _current_concurrency key exists, and/or its contents are nil.
Actual behavior
The nap job has an entry in the hash, despite having no maximum_concurrency defined.
Fixed in 0.0.23