Fix: Quote shell parameters
https://github.com/nf-core/differentialabundance/pull/444
I ran the differentialabundance pipeline with a study name that contained spaces, and it crashed because the script that calls make_app_from_files.R does not quote its parameters. I believe the patch included in this PR is easy to follow: the parameters were expanded at the `$`, so the second word of the study name was interpreted as a separate argument.
With this change applied, the pipeline ran successfully.
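As a minimal sketch of the failure mode in plain shell terms (the variable name here is illustrative, not the pipeline's actual code):

```bash
study_name="My Study"

# Unquoted expansion: the shell splits on the space, so downstream code
# sees two arguments ("My" and "Study").
set -- $study_name
echo "$#"   # prints 2

# Quoted expansion: the study name is passed as a single argument.
set -- "$study_name"
echo "$#"   # prints 1
```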
If you are interested in this contribution but need me to add tests, I can try to find time to learn how to write them and work through the rest of the checklist. It may take me a while, though, and since the patch is really minor I would appreciate your consideration.
You may also want to consider the security ramifications of being able to pass a study_name like `; arbitrary_command` and inject code that would be executed by the pipeline. I am not familiar enough with Nextflow to judge whether that should be considered a vulnerability risk or not, but if you think it is an issue, then this PR would address it.
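A hypothetical illustration of the shape of that injection (the values are made up; the point is only how the generated command line is parsed when the value is interpolated into the script text before the shell sees it):

```bash
study_name='; arbitrary_command'

# Unquoted interpolation: the generated command line contains a bare ';',
# so the shell would treat everything after it as a second command.
echo "Rscript make_app_from_files.R --study $study_name"
# -> Rscript make_app_from_files.R --study ; arbitrary_command

# Quoted interpolation: the value stays inside one argument
# (a value containing '"' could still break out, so this is not a full defence).
echo "Rscript make_app_from_files.R --study \"$study_name\""
# -> Rscript make_app_from_files.R --study "; arbitrary_command"
```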
In any case, thanks for your time and for all the work you do.
PR checklist
Closes #XXX
- [x] This comment contains a description of changes (with reason).
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the module conventions in the contribution docs
- [ ] If necessary, include test data in your PR.
- [ ] Remove all TODO statements.
- [ ] Emit the `versions.yml` file.
- [ ] Follow the naming conventions.
- [ ] Follow the parameters requirements.
- [ ] Follow the input/output options guidelines.
- [ ] Add a resource `label`
- [ ] Use BioConda and BioContainers if possible to fulfil software requirements.
- Ensure that the test works with either Docker / Singularity. Conda CI tests can be quite flaky:
  - For modules:
    - [ ] `nf-core modules test <MODULE> --profile docker`
    - [ ] `nf-core modules test <MODULE> --profile singularity`
    - [ ] `nf-core modules test <MODULE> --profile conda`
  - For subworkflows:
    - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile docker`
    - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile singularity`
    - [ ] `nf-core subworkflows test <SUBWORKFLOW> --profile conda`
Maybe we should add a test that covers this situation, just to avoid running into the same thing in the future.
Sorry, I have been out of the office for the last few weeks.
Instead of having to write tests for these kinds of small issues, I believe it would be better to have the CI run shellcheck on the shell scripts embedded in the Nextflow files, so that such issues are reported automatically. Extracting all the shell scripts from the .nf files would be a challenge for me, but Nextflow already does this, so it should be feasible to implement.
I would have to set up the development environment and learn to write tests. I can do that, but it may take me a long while (another month?) to find the time. Is the test case necessary?
Instead of a test case for this script, I have provided support for running shellcheck on all the script blocks in the repository. shellcheck would have caught this issue:
- https://github.com/nf-core/modules/pull/8433
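For context, a very rough sketch of the general idea, not what the linked PR actually implements: it assumes script bodies are delimited by lines containing only `"""`, which is an approximation of Nextflow's real syntax.

```bash
#!/usr/bin/env bash
# Extract triple-quoted script bodies from each .nf file and hand them to shellcheck.
set -euo pipefail

find . -name '*.nf' -print0 | while IFS= read -r -d '' nf; do
  block=$(mktemp)
  # Toggle on delimiter lines, print everything in between.
  awk '/^[[:space:]]*"""[[:space:]]*$/ { inblock = !inblock; next } inblock' "$nf" > "$block"
  if [ -s "$block" ]; then
    shellcheck --shell=bash "$block" || echo "shellcheck findings in $nf"
  fi
  rm -f "$block"
done
```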