sidekiq-superworker
Support for dynamic number of parallel jobs?
What if I have a workflow that looks like this?
- Fetch the number of objects
- For each object, enqueue one processing job
- After all objects have been processed, run one final job
Is this something that's supported? A practical example is processing all of a user's GitHub repos: you don't know how many there are in advance, and you want individual jobs to be able to fail independently.
Defining a superworker up front doesn't seem like the right pattern, since the number of child jobs isn't known until runtime.
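For concreteness, here's a rough sketch of the fan-out half of this with plain Sidekiq workers. The class names and `fetch_repo_ids` are made up for illustration; the part I can't express is the final step.

```ruby
require 'sidekiq'

# Hypothetical workers sketching the fan-out half of the workflow;
# class names and fetch_repo_ids are made up for illustration.
class ProcessRepoWorker
  include Sidekiq::Worker

  def perform(repo_id)
    # Process a single repo; a failure here retries independently
    # without affecting the other repos.
  end
end

class SyncUserReposWorker
  include Sidekiq::Worker

  def perform(user_id)
    repo_ids = fetch_repo_ids(user_id) # count not known until runtime
    repo_ids.each { |id| ProcessRepoWorker.perform_async(id) }
    # There is no obvious hook here to enqueue the final job only
    # after every ProcessRepoWorker has actually finished.
  end
end
```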
@joshjordan I'm trying to do this right now. Do batches not handle this for you?
Perhaps it does! Is it possible to create a job that runs only after the entire batch has completed? That's what I couldn't figure out how to make happen when I tried.
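If batches do support that, a minimal sketch of what I'm after might look like this, assuming Sidekiq Pro's documented batch callbacks are available; `RepoBatchCallback`, `FinalizeWorker`, and `fetch_repo_ids` are hypothetical names:

```ruby
require 'sidekiq'
require 'sidekiq-pro' # batches are a Sidekiq Pro feature

# Hypothetical callback class; Sidekiq Pro calls on_success once every
# job added to the batch has completed successfully.
class RepoBatchCallback
  def on_success(status, options)
    FinalizeWorker.perform_async(options['user_id'])
  end
end

class SyncUserReposWorker
  include Sidekiq::Worker

  def perform(user_id)
    batch = Sidekiq::Batch.new
    batch.description = "Process repos for user #{user_id}"
    batch.on(:success, RepoBatchCallback, 'user_id' => user_id)
    batch.jobs do
      fetch_repo_ids(user_id).each { |id| ProcessRepoWorker.perform_async(id) }
    end
  end
end
```

If that behaves the way the docs describe, the `:success` callback would cover the "run one final job after everything is processed" step.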