joss-reviews
[REVIEW]: MIRP: A Python package for standardised radiomics
Submitting author: @alexzwanenburg (Alex Zwanenburg)
Repository: https://github.com/oncoray/mirp
Branch with paper.md (empty if default branch): paper
Version: 2.2.4
Editor: @emdupre
Reviewers: @surajpaib, @Matthew-Jennings, @drcandacemakedamoore, @theanega
Archive: 10.5281/zenodo.12493595
Status
Status badge code:
HTML: <a href="https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577"><img src="https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577/status.svg)](https://joss.theoj.org/papers/165c85b1ecad891550a21b12c8b2e577)
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@surajpaib & @Matthew-Jennings & @drcandacemakedamoore & @theanega, your review will be checklist-based. Each of you will have a separate checklist that you should update while carrying out your review. First of all, you need to run this command in a separate comment to create the checklist:
@editorialbot generate my checklist
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @emdupre know.
✨ Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest ✨
Checklists
Checklist for @Matthew-Jennings
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1038/s41571-022-00707-0 is OK
- 10.1038/nrclinonc.2016.162 is OK
- 10.1148/radiol.2020191145 is OK
- 10.1038/s41598-017-13448-3 is OK
- 10.1038/s41598-018-36938-4 is OK
- 10.1038/s41598-022-13967-8 is OK
- 10.1148/radiol.211604 is OK
- 10.1148/radiol.231319 is OK
- 10.1038/nrclinonc.2017.141 is OK
- 10.1038/s41467-023-44591-3 is OK
MISSING DOIs
- None
INVALID DOIs
- None
Software report:
```
github.com/AlDanial/cloc v 1.88  T=1.70 s (105.6 files/s, 31142.2 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                         104           5692           4648          19852
HTML                            25           1334             75           7703
Markdown                        12           1193              0           5535
SVG                              2              1              1           2996
JavaScript                      12            131            221            880
CSS                              4            190             35            779
reStructuredText                 7            169            159            351
XML                              4              0            336            256
TeX                              1             19              0            236
R                                1             25              8             77
YAML                            3              6              4             57
TOML                            1              5              0             47
DOS Batch                       2              8              1             28
make                            1              4              7              9
Bourne Shell                    1              0              0              1
-------------------------------------------------------------------------------
SUM:                           180           8777           5495          38807
-------------------------------------------------------------------------------
```
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1025
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
👋 Hi @surajpaib, @Matthew-Jennings, @drcandacemakedamoore, @theanega, and thank you again for agreeing to review this submission for MIRP!
The review will take place in this issue, and you can generate your individual reviewer checklists by asking editorialbot directly with @editorialbot generate my checklist.
In working through the checklist, you're likely to have specific feedback on MIRP. Whenever possible, please open relevant issues on the software repository (and cross-link them with this issue) rather than discussing them here. This helps to make sure that feedback is translated into actionable items to improve the software!
If you aren't sure how to get started, please see the Reviewing for JOSS guide -- and, of course, feel free to ping me with any questions!
Review checklist for @surajpaib
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
Review checklist for @drcandacemakedamoore
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [ ] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [ ] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [ ] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [ ] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [ ] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
Review checklist for @Matthew-Jennings
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [ ] Functionality: Have the functional claims of the software been confirmed?
- [ ] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [ ] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [ ] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
Review checklist for @theanega
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/oncoray/mirp?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@alexzwanenburg) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data or research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@editorialbot generate pdf
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@alexzwanenburg Can you please clarify whether Sebastian Starke and Steffan Lock are the same person? I do not see any code contributions from a Steffan Lock. This may be fine if he helped make the package, but please clarify. Or is the Steffan Leger mentioned in the thank-you at the bottom the same person as Steffan Lock?
@drcandacemakedamoore Steffen Löck is my professor and advised on the paper and the package. Stefan Leger contributed to early in-house versions of MIRP, prior to moving to GitHub. Sebastian Starke also made minor contributions to an earlier version of MIRP.
@alexzwanenburg preferably, module names should be all lower case (ideally a single word, or words joined with underscores if there is no other choice). I notice you have modules that are camelCase. Different file systems have different case conventions, so it could be possible to import these modules in two different ways, causing problems on file systems where case conventions differ. I will get to more substantial issues soon, but this popped out at me before I even started the real review. And while I'm on superficial issues: some badges would not hurt (it's nice to have the PyPI version badge, and also Anaconda if you released it there; I can't tell at first glance).
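To make the case-sensitivity hazard concrete, here is a minimal sketch (the module names `imageProcess`/`imageprocess` are hypothetical, chosen only for illustration, and the `sys.modules` entries are created by hand to simulate what happens when a case-insensitive file system lets both spellings of an import resolve to the same file):

```python
import sys
import types

# Python keys sys.modules by the exact spelling of the import name. If a
# case-insensitive file system (the default on Windows and macOS) allows both
# "import imageProcess" and "import imageprocess" to find the same .py file,
# the interpreter ends up with two independent module objects with separate
# state. On a case-sensitive file system (typical on Linux), one of the two
# spellings simply raises ModuleNotFoundError instead.
mod_a = types.ModuleType("imageProcess")
mod_b = types.ModuleType("imageprocess")
mod_a.counter = 1
mod_b.counter = 99

sys.modules["imageProcess"] = mod_a
sys.modules["imageprocess"] = mod_b

# The two spellings are distinct entries with independent state:
print(sys.modules["imageProcess"] is sys.modules["imageprocess"])  # False
```

Using all-lowercase names (with underscores where needed), as PEP 8 recommends, removes the ambiguity entirely.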
@alexzwanenburg on a less superficial issue, I note there is no developer documentation. Many people may want to tinker with what you have done, and hopefully even contribute to the package. I am looking for documentation that tells people how to run the tests, so they can check new code before submitting it. I am also looking for this because it is not clear whether you have any automated testing that runs in CI (did I miss it?), so instead of figuring out how to run it from there, I would need instructions to run your tests properly. Update: I see from my Windows machine it is python -m pytest, but I have no idea what it will be on a Mac.
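For what it's worth, `python -m pytest` should behave the same on Windows, macOS, and Linux, because it runs pytest through whichever interpreter `python` resolves to rather than a platform-specific script on PATH. A minimal sketch of that idea (the helper names are hypothetical, not part of MIRP):

```python
import subprocess
import sys

def build_test_command(extra_args=()):
    # sys.executable is the absolute path of the interpreter currently
    # running, so the resulting command is portable across Windows, macOS,
    # and Linux shells -- no guessing where the "pytest" script lives.
    return [sys.executable, "-m", "pytest", *extra_args]

def run_tests(extra_args=()):
    # Returns pytest's exit code (0 means all tests passed).
    return subprocess.run(build_test_command(extra_args)).returncode
```

Documenting `python -m pytest` as the canonical invocation in a CONTRIBUTING guide would answer the cross-platform question directly.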
Thank you, @drcandacemakedamoore !
If you could please open subsequent review comments as issues on the MIRP repository, this will help to make sure there is sufficient space for follow-up discussion and that action items are trackable across reviewers. I know that @alexzwanenburg has started to respond in-thread, but we'll generally ask to keep only high-level discussions in the general review thread and re-direct all other comments to the project issue tracker.
If you have any other questions, of course, please don't hesitate to ask.
👋 Hi everyone, happy Monday!
I just wanted to check in on the status of this review and make sure that there aren't any current blockers in working through the reviewer checklists.
I did notice that @drcandacemakedamoore has opened https://github.com/oncoray/mirp/issues/66, https://github.com/oncoray/mirp/issues/67, and https://github.com/oncoray/mirp/issues/68 -- thank you! I'm cross-linking them here so they're easier for me (and the other reviewers) to track.
Hello! Thank you very much for the invitation to review this paper. I will finalize the review later this week (only the "functionality" section is missing). For now, I've left my comments on issue #69.
@emdupre Feel free to let me know if my comments are appropriate; this is my first time reviewing for JOSS, so I'm learning. Thanks!
👋 Hi everyone! Thank you for your comments on MIRP to date!
I just wanted to note that we have now passed the four week review window. If you could please work on finalizing your initial reviews as soon as possible, I would appreciate it.
Once you have finalized your initial reviews, you can let me know by responding directly in this thread. Of course, if you have any questions or blockers, please don't hesitate to let me know !
👋 @surajpaib @Matthew-Jennings @drcandacemakedamoore @theanega
👋 Hello again,
I just wanted to follow up on the previous message, as I know it created some concern and confusion:
- I should have noted that the "initial review" refers to each reviewer working through their checklist and opening associated issues. I did not mean to imply that those issues should also have been resolved!
- The "four week review window" was in reference to our original request to finish reviews within four weeks if possible, or six weeks at the latest. We have just passed five weeks on this review; however, if you require additional time beyond the six-week window, please let me know. I will try to accommodate specific timelines where possible.
Apologies for being unclear on these points in my previous message. If there's anything else I can clarify, please let me know. And thank you again for your work in reviewing MIRP !
Thanks, @emdupre! No worries.
@alexzwanenburg can you confirm that the most up-to-date branch we should be looking at is dev2.2.1 for everything?
@drcandacemakedamoore I can confirm that the most up-to-date branch is dev2.2.1. I have been working on this branch to address your comments and suggestions. This does not include the paper itself, which lives in the paper branch.
Hi @emdupre! I will cross off the remainder of the checklist by the end of this week. I hope that timeline works.
π Hi everyone, thanks for the updates !
I just wanted to summarize status:
@theanega: I see that you've completed your checklist, but you've also created issue https://github.com/oncoray/mirp/issues/69, which is still open. Could you please confirm the status of your review?
@Matthew-Jennings: I see that you've not yet completed your checklist and that you've created the following open issues: https://github.com/oncoray/mirp/issues/72, https://github.com/oncoray/mirp/issues/73, and https://github.com/oncoray/mirp/issues/75. I assume you've currently finished the initial review and are now waiting on the resolution of these issues, but if you're instead still working on the initial review, please let me know.
@drcandacemakedamoore and @surajpaib: I know that you were not expecting to finish the initial review until this week. Please let us know when you are able to do so!
I'll also cross-link all of the other associated issues to date, in case this helps in finalizing initial reviews:
- https://github.com/oncoray/mirp/issues/54
- https://github.com/oncoray/mirp/issues/55
- https://github.com/oncoray/mirp/issues/62
- https://github.com/oncoray/mirp/issues/63
- https://github.com/oncoray/mirp/issues/66
- https://github.com/oncoray/mirp/issues/67
- https://github.com/oncoray/mirp/issues/68
- https://github.com/oncoray/mirp/issues/76
@emdupre: Yep, that's correct!
I released version 2.2.1. This also includes updates to the documentation and a new tutorial: https://oncoray.github.io/mirp/tutorial_compute_radiomics_features_mr.html
I am looking forward to your feedback on these updates.
Hi @emdupre, I've opened an issue on the MIRP repo with more of my observations, and with that I'm done with my initial review.
Many thanks for your patience!