Solve queueSize exceeded when using job arrays
Closes #5920
- Modifying canSubmit to avoid exceeding the executor.queueSize parameter.
- Printing a warning when the array size exceeds the queueSize. This can make the task array unsubmittable (not sure if we should abort the execution in this situation).
- Adding unit tests.
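For context, the capacity check described above could look roughly like this. This is only a sketch based on the PR description: the method name canSubmit and the queueSize field come from this PR, but the surrounding fields (submitted, TaskArrayRun) and exact logic are assumptions, not the merged code.

```groovy
// Sketch only -- field and class names are assumptions, not the actual source
protected boolean canSubmit(TaskHandler handler) {
    // a job array occupies as many queue slots as it has child tasks
    final slots = handler.task instanceof TaskArrayRun
            ? handler.task.getArraySize()
            : 1
    if( slots > queueSize ) {
        // the array can never fit -- warn (or, per the review, fail) here
        log.warn "Task array size ($slots) exceeds executor.queueSize ($queueSize) -- the task array cannot be submitted"
    }
    // only submit when the whole array fits in the remaining queue capacity
    return submitted.size() + slots <= queueSize
}
```

The key point is that an array is charged for all of its child jobs at once, rather than counting as a single submission.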
Tested with this pipeline:

```groovy
process test {
    array 10

    input:
    val x

    """
    echo $x
    """
}

workflow {
    Channel.of(1..20) | test
}
```
With a config using the awsbatch executor and different executor.queueSize values:
- queueSize 15: only one job array submitted at a time in AWS Batch.
- queueSize 5: a warning is printed, but the run continues without the task array jobs being submitted.
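The exact config is not included in the PR, but the test setup would look roughly like this (the queue name is hypothetical):

```groovy
// nextflow.config -- sketch of the test configuration (assumed, not copied from the PR)
process.executor = 'awsbatch'
process.queue = 'my-batch-queue'   // hypothetical AWS Batch queue name

executor {
    queueSize = 15   // also tested with 5 to trigger the warning path
}
```

With 20 input values and array 10, the pipeline produces two job arrays of 10 tasks each, so queueSize 15 admits one array at a time while queueSize 5 can never admit an array.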
Deploy Preview for nextflow-docs-staging ready!
| Name | Link |
|---|---|
| Latest commit | 0a557a531bbf40622ab89d9b37283a9b17c83e61 |
| Latest deploy log | https://app.netlify.com/projects/nextflow-docs-staging/deploys/6846d81fd7c9ae00082f239c |
| Deploy Preview | https://deploy-preview-6047--nextflow-docs-staging.netlify.app |
- change the warning to an error, since the run will hang anyway
- make process maxForks aware of job arrays
Because processor.forksCount is related to the maxForks directive, whereas here the aim is to control the maximum number of concurrent task runs.
So "forks" here just refers to the number of concurrent tasks, and I see no reason to call it something else in the task handler.
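If maxForks were made aware of job arrays, as suggested above, the check could count an array as arraySize forks. This is a sketch of the proposed behavior, not merged code; the forksCount name comes from the discussion, while canFork and the surrounding logic are assumptions.

```groovy
// Sketch only -- counting a job array against maxForks as multiple forks
protected boolean canFork(TaskRun task) {
    // an array of N tasks consumes N fork slots, a plain task consumes one
    final forks = task instanceof TaskArrayRun ? task.getArraySize() : 1
    // maxForks == 0 conventionally means unlimited
    return maxForks == 0 || forksCount.get() + forks <= maxForks
}
```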