sidekiq-unique-jobs
Run jobs in sequence (run 1 job at a time)
Is there a way to run 1 job at a time, no matter what the job's arguments are?
Here is the use case.
I'd like to create multiple invoices via external API. They must be created in sequence to avoid duplicated invoice numbers.
The worker class creates 1 invoice based on the arguments passed to the perform method.
Current problem: If many workers run concurrently, there is a risk that concurrent requests to the external API will create duplicated invoice numbers. To avoid that, I'd like to run the jobs in sequence.
Possible solution: To run 1 job at a time (in sequence), I tried something like:
```ruby
class MyWorker
  include Sidekiq::Worker

  sidekiq_options lock: :while_executing,
                  lock_args_method: ->(args) { ['the-same-key-for-all-jobs'] },
                  lock_timeout: nil,
                  log_duplicate_payload: true,
                  on_conflict: :reschedule

  def perform(amount)
    # create an invoice via external API
  end
end
```
But I'm getting this error: https://github.com/mhenrixon/sidekiq-unique-jobs/issues/686
Expected behavior:
- Each job can have different arguments passed to the perform method.
- Each job should be executed.
- No job should be skipped.
- The jobs should run in sequence. It's fine if they are rescheduled and put back on the Redis queue, or run a bit later.
Do you think it's possible to use sidekiq-unique-jobs for this use case?
Or is it better to spin off a separate sidekiq process running 1 thread, and this way consume a queue in sequence? The worker class could assign jobs to a sequential queue name, and the separate sidekiq process could consume that queue. The downside is that we'd need to spin off a separate sidekiq process, which is an extra cost.
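Roughly what I mean, as a sketch (the `sequential` queue name here is just an example, not something sidekiq-unique-jobs provides):

```ruby
class MyWorker
  include Sidekiq::Worker

  # Route all invoice jobs to one dedicated queue.
  sidekiq_options queue: :sequential

  def perform(amount)
    # create an invoice via external API
  end
end
```

The separate process would then consume only that queue with a single thread, so jobs come off it one at a time:

```
bundle exec sidekiq --queue sequential --concurrency 1
```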
Thanks
Should be fixed by #682
Sorry @ArturT, it seems I didn't read your request properly. Enforcing sequential processing isn't possible yet, but I have some ideas on this.
It seems that, in the case of a crash, it would be beneficial to have a sequential queue where a conflicting job could be placed, to avoid it being processed at the same time as a duplicate.
The alternative would be for me to write my own sidekiq fetcher that ensures no duplicates are picked from the queue, which would give better control over the locking mechanism.
I don't have anything more than some rambling thoughts.
I used to have this idea: to achieve sequential jobs I could use a semaphore (https://github.com/dv/redis-semaphore), so that if 2 jobs are running at the same time, only one holds the mutex and can process work (make the request to the external invoicing API). The 2nd job needs to wait for the 1st one to complete (so that the 1st job releases the mutex). But this has a downside: if the invoicing API times out, a lot of sidekiq jobs could become busy waiting for the 1st job stuck in the timeout. We could easily saturate sidekiq threads.
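What I mean is roughly this (a sketch using redis-semaphore's block form; the `:invoice_api` semaphore name is made up):

```ruby
require "redis"
require "redis-semaphore"

class MyWorker
  include Sidekiq::Worker

  def perform(amount)
    # One shared semaphore for all invoice jobs, so only one job holds it at a time.
    semaphore = Redis::Semaphore.new(:invoice_api, redis: Redis.new)

    # Blocks until the semaphore is free. Concurrent jobs pile up here,
    # occupying sidekiq threads while they wait -- the downside described above.
    semaphore.lock do
      # create an invoice via external API
    end
  end
end
```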
Maybe a better approach would be to let the 1st job start processing, and if a 2nd job is about to start, reschedule it for 10 seconds later or so. The downside would be polluting the sidekiq stats, which would show a lot of rescheduled/retried jobs. I'm not sure if there is a better way.
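A sketch of that reschedule idea using a plain Redis lock (the key name and the 10-second delay are arbitrary, and the check-then-delete release below is not atomic, so treat it as an illustration only):

```ruby
require "redis"

class MyWorker
  include Sidekiq::Worker

  LOCK_KEY = "invoice_api_lock" # example key shared by all invoice jobs
  LOCK_TTL = 60                 # seconds; expires if a holder crashes

  def perform(amount)
    redis = Redis.new

    # SET NX fails if another job already holds the lock.
    unless redis.set(LOCK_KEY, jid, nx: true, ex: LOCK_TTL)
      # Busy: push the job ~10 seconds into the future instead of blocking a thread.
      return self.class.perform_in(10, amount)
    end

    begin
      # create an invoice via external API
    ensure
      # Release only if we still own the lock.
      redis.del(LOCK_KEY) if redis.get(LOCK_KEY) == jid
    end
  end
end
```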
As for now, I managed to report the problem to the invoicing API provider, and they are going to fix the issue on their side to prevent duplicated invoice numbers when we process 2 sidekiq jobs at the same time. But I guess such a sequential queue feature would still be useful. I recall wondering a few times in the past how to do sequential jobs, but I never figured it out. A sequential queue would be a cool feature if more users find it useful.
Cheers!
@ArturT We use Sidekiq::Throttled to enforce a concurrency limit. Not sure if that is sufficient for your use case, as it will not enforce the sequence of jobs, but it will ensure only one job happens at a time and will not lock up workers.
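Roughly like this (a sketch based on the gem's documented API; details may differ between versions):

```ruby
require "sidekiq/throttled"

class MyWorker
  include Sidekiq::Worker
  include Sidekiq::Throttled::Worker

  # At most one job of this class running at any moment, across all processes.
  # Note: older gem versions also require Sidekiq::Throttled.setup! in an initializer.
  sidekiq_throttle(concurrency: { limit: 1 })

  def perform(amount)
    # create an invoice via external API
  end
end
```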
Thanks @holstvoogd. It looks like Sidekiq::Throttled could enforce one job at a time.
@mhenrixon We also need unique jobs to run sequentially and are already using this gem for other use cases. I just wanted to check in and see if you had given any more thought to how to achieve sequential unique jobs? We tried Sidekiq::Throttled, but it doesn't seem to maintain the correct order.
@KevinKelly25, the suggestion that @ArturT originally tried is the closest we are going to get without writing a separate sidekiq queue.
The problem is that sidekiq is built for parallelism and not at all for sequential, order-specific processing.
Sidekiq isn't a queue system; it is a tool to spread order-independent work across multiple threads/processes.
It isn't trivial to solve this in a sane way using redis.
For my day job I am currently working on a websocket integration where I am only allowed one thread for multiple tokens.
I have been doing some thinking but still no bright and disruptive ideas.
All suggestions welcome.
Coincidentally, we also ran into ordering issues with Throttled this week. It works by popping a job from the queue and requeuing it if it should be throttled, which can lead to out-of-order execution too.
But fundamentally, sidekiq cannot guarantee ordering anyway. Even with one thread it can't, because retries add jitter to the delay.
We are going to enforce order in the 'application layer', let's say by checking timestamps or a sequence ID or something.
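For example, something along these lines (Record, apply!, and last_sequence_id are made-up names just to show the idea):

```ruby
class ApplyEventWorker
  include Sidekiq::Worker

  def perform(record_id, sequence_id, payload)
    record = Record.find(record_id)

    # Take a row lock so concurrent jobs for the same record serialize here.
    record.with_lock do
      # Ignore anything older than what has already been applied.
      next if sequence_id <= record.last_sequence_id

      record.apply!(payload)
      record.update!(last_sequence_id: sequence_id)
    end
  end
end
```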