joss-reviews
[REVIEW]: SSN2: The next generation of spatial stream network modeling in R
Submitting author: @michaeldumelle (Michael Dumelle)
Repository: https://github.com/USEPA/SSN2
Branch with paper.md (empty if default branch): joss
Version: v0.1.1
Editor: @mikemahoney218
Reviewers: @fernandomayer, @k-doering-NOAA, @fawda123
Archive: Pending
Status
Status badge code:
HTML: <a href="https://joss.theoj.org/papers/66fd932526762f8ccd8bd9c3954e0e3d"><img src="https://joss.theoj.org/papers/66fd932526762f8ccd8bd9c3954e0e3d/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/66fd932526762f8ccd8bd9c3954e0e3d/status.svg)](https://joss.theoj.org/papers/66fd932526762f8ccd8bd9c3954e0e3d)
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@fernandomayer & @k-doering-NOAA, your review will be checklist-based. Each of you will have a separate checklist that you should update as you carry out your review. First, you need to run this command in a separate comment to create the checklist:
@editorialbot generate my checklist
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @mikemahoney218 know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Software report:
github.com/AlDanial/cloc v 1.88 T=0.23 s (888.3 files/s, 208289.1 lines/s)
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
HTML 65 2240 198 14898
R 117 1673 5470 9003
JavaScript 4 2099 1928 7019
Markdown 6 223 0 929
TeX 2 94 0 782
XML 2 0 2 639
Rmd 2 171 561 216
C 2 12 22 45
YAML 3 12 2 44
SVG 1 0 1 11
CSS 1 0 5 1
JSON 1 0 0 1
-------------------------------------------------------------------------------
SUM: 206 6524 8189 33588
-------------------------------------------------------------------------------
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1982
👉 📄 Download article proof · 📄 View article proof on GitHub 👈
👋🏼 @michaeldumelle, @fernandomayer, @k-doering-NOAA, @fawda123: this is the review thread for the paper. Just about all of our communications will happen here from now on. 😄
As a reviewer, the first step is to create a checklist for your review by entering
@editorialbot generate my checklist
at the top of a new comment in this thread. For best results, don't include anything else in the comment!
These checklists contain the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. The first comment in this thread also contains links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria, rather than merely passing judgment on the submission. As such, reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#6389 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them than to wait until you've reviewed the entire package.
We aim for reviews to be completed within about 2-4 weeks. Please let me know if you require more time.
Please feel free to ping me (@mikemahoney218) if you have any questions/concerns. Thanks again so much for agreeing to review!
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1371/journal.pone.0282524 is OK
- 10.32614/RJ-2018-009 is OK
MISSING DOIs
- 10.1002/2015wr018349 may be a valid DOI for title: Spatial statistical network models for stream and river temperature in New England, USA
- 10.1016/j.jtherbio.2021.103028 may be a valid DOI for title: Integrating thermal infrared stream temperature imagery and spatial stream network models to understand natural spatial thermal variability in streams
- 10.1111/rec.13626 may be a valid DOI for title: Riparian vegetation shade restoration and loss effects on recent and future stream temperatures
- 10.1111/1752-1688.12372 may be a valid DOI for title: The Stream-Catchment (StreamCat) Dataset: A database of watershed metrics for the conterminous United States
- 10.1139/cjfas-2016-0247 may be a valid DOI for title: Scalable population estimates using spatial-stream-network (SSN) models, fish density surveys, and national geospatial database frameworks for streams
- 10.1371/journal.pone.0239237 may be a valid DOI for title: Preparing GIS data for analysis of stream monitoring data: The R package openSTARS
- 10.1086/710340 may be a valid DOI for title: Variation in stream network relationships and geospatial predictions of watershed conductivity
- 10.1016/j.scitotenv.2017.08.151 may be a valid DOI for title: Using spatial-stream-network models and long-term data to understand and predict dynamics of faecal contamination in a mixed land-use catchment
- 10.1371/journal.pone.0238422 may be a valid DOI for title: SSNdesign — An R package for pseudo-Bayesian optimal and adaptive sampling designs on stream networks
- 10.1016/j.cageo.2004.03.012 may be a valid DOI for title: Multivariable geostatistics in S: the gstat package
- 10.1890/08-1668.1 may be a valid DOI for title: A mixed-model moving-average approach to geostatistical modeling in stream networks
- 10.1038/s41598-019-43132-7 may be a valid DOI for title: A spatial stream-network approach assists in managing the remnant genetic diversity of riparian forests
- 10.1007/s10021-018-0311-8 may be a valid DOI for title: Estimating ecosystem metabolism to entire river networks
- 10.1111/1752-1688.12543 may be a valid DOI for title: Improving predictive models of in-stream phosphorus concentration based on nationally-available spatial data coverages
- 10.1111/1365-2664.13997 may be a valid DOI for title: Dendritic prioritization through spatial stream network modeling informs targeted management of Himalayan riverscapes under brown trout invasion
- 10.1016/j.cageo.2014.02.009 may be a valid DOI for title: rtop: An R package for interpolation of data with a variable spatial support, with an example from river networks
- 10.1198/jasa.2009.ap08248 may be a valid DOI for title: A moving average approach for spatial statistical models of stream networks
INVALID DOIs
- None
Thank you @mikemahoney218, @fernandomayer, and @k-doering-NOAA! If any questions come up that I can help with, please don't hesitate to reach out.
@editorialbot add @fawda123 as reviewer
All three reviewers I reached out to accepted, which is fantastic! While we can have a review with 2 reviewers, 3 is ideal, so I'm going ahead and adding @fawda123 as a reviewer as well. Thanks again so much for agreeing to review!
@fawda123 added to the reviewers list!
Thank you @fawda123 !
Hi all! Just wanted to bump this thread now that we're about two weeks into the review window.
@fernandomayer, @k-doering-NOAA, @fawda123 : note that when you post this comment on this thread:
@editorialbot generate my checklist
You'll get a checklist generated containing all the elements we're asking you to look over as part of your review. Please let me know if you've got any questions/comments/concerns regarding the review!
@michaeldumelle, I should have mentioned this earlier, but please take a look at those "MISSING DOIs" in the editorialbot message above. Assuming those DOIs correspond to your actual citations, please go ahead and add them to your BibTeX file (e.g., doi = {10.1002/2015wr018349}). No rush, but they'll need to be fixed before we'd accept the paper.
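For reference, a BibTeX entry with the `doi` field added might look something like the sketch below. The entry key is a placeholder and the other fields are assumptions; only the title and DOI are taken from the editorialbot message above:

```bibtex
@article{stream_temp_new_england,
  title = {Spatial statistical network models for stream and river
           temperature in New England, USA},
  doi   = {10.1002/2015wr018349}
}
```

editorialbot reads the `doi` field directly from the bibliography, so adding it this way should clear the corresponding "MISSING" flag on the next reference check.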
@mikemahoney218 I updated the BibTeX file, incorporating the aforementioned DOIs and adding a few more. I pushed the changes to the joss branch, and you can review the commit here. Please let me know if you need anything else from me. Thank you!
@editorialbot check references
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.1007/978-1-4614-7618-4 is OK
- 10.1002/9781119115151 is OK
- 10.1002/2015wr018349 is OK
- 10.1371/journal.pone.0282524 is OK
- 10.1016/j.jtherbio.2021.103028 is OK
- 10.1111/rec.13626 is OK
- 10.1111/1752-1688.12372 is OK
- 10.1002/2017WR020969 is OK
- 10.1139/cjfas-2016-0247 is OK
- 10.1371/journal.pone.0239237 is OK
- 10.18637/jss.v063.i19 is OK
- 10.1086/710340 is OK
- 10.1016/j.scitotenv.2017.08.151 is OK
- 10.1016/j.cageo.2004.03.012 is OK
- 10.32614/RJ-2018-009 is OK
- 10.1890/08-1668.1 is OK
- 10.18637/jss.v056.i02 is OK
- 10.1111/j.1523-1739.2012.01897.x is OK
- 10.1038/s41598-019-43132-7 is OK
- 10.1007/s10021-018-0311-8 is OK
- 10.48550/arxiv.2110.02507 is OK
- 10.1111/1752-1688.12543 is OK
- 10.1111/1365-2664.13997 is OK
- 10.1016/j.cageo.2014.02.009 is OK
- 10.1198/jasa.2009.ap08248 is OK
- 10.18637/jss.v056.i03 is OK
- 10.1007/978-0-387-98141-3 is OK
MISSING DOIs
- None
INVALID DOIs
- None
Review checklist for @fawda123
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/USEPA/SSN2?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@michaeldumelle) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@michaeldumelle I've finished my initial review of the package and paper, nice work! These package updates are critical to maintaining the relevance of "legacy" software as R continues to develop. It's nice to see the effort put into this work. I've added a few issues (https://github.com/USEPA/SSN2/issues/11, https://github.com/USEPA/SSN2/issues/12, https://github.com/USEPA/SSN2/issues/13, https://github.com/USEPA/SSN2/issues/14, https://github.com/USEPA/SSN2/issues/15) in the main repo for your consideration. I think the biggest ask is updates to your unit tests. Let me know if you have any questions!
@fawda123 thank you so much for the kind words about the software and for the thorough and helpful review! I really appreciate the time you put into this, and I look forward to incorporating all of your feedback once the remaining reviews come in. I will reach out if I have any clarifying questions.
@mikemahoney218 SSN2 currently has three branches: 1) main, which is up to date with CRAN; 2) a development branch, which is ahead of CRAN; and 3) joss, which is up to date with main but also contains the JOSS paper. When incorporating feedback on the software, can I push changes to the development branch with the understanding that they will be merged into main alongside the next CRAN update? And when incorporating feedback on the paper, can I push changes to the joss branch?
Life will be easier if you can merge the development branch into the JOSS branch -- in particular, I think it will be easier for reviewers if there's one branch containing all of the most up-to-date revisions to the code and the paper (and it will also be better when we move to accepting the package and need an archive and DOI). Is that possible?
@mikemahoney218 definitely! How does this plan sound? Once I get reviews, I will address code feedback in the development branch and paper feedback in the joss branch. Once I am done addressing code and paper feedback, I will merge the development branch into the JOSS branch so the joss branch has the most up-to-date revisions to both code and the paper.
So long as you can make it clear to reviewers what's living where -- especially as @fernandomayer and @k-doering-NOAA are still going to do their initial pass. It's normal for these reviews to involve a lot of back and forth and conversation, much much more so than a traditional peer review where the reviewers only look at the document when you're done working on it. We're fully expecting something closer to a co-production workflow, where reviewers are looking at your code while you're incorporating changes from their reviews and the other reviewers!
That's why I think it'd be better if you can have one branch that's up to date, rather than splitting changes out across branches. I'm not sure what advantage separating code and paper changes has, though!
@mikemahoney218 I had planned to leave the joss branch frozen with the software version at the time of the article's (hopeful) publication, which is why all the hassle with the separate branches (as the development branch will eventually be merged into main/CRAN). Do you recommend I add all the joss files to the development branch and eventually merge it into main, placing all the joss files in .Rbuildignore? Then I can delete the joss branch?
I personally think that will be the easiest for everyone to follow. We'll be capturing the software at time of acceptance as a Zenodo archive from your repository (associated with a new GitHub release), so you'll need to have everything in one place by then anyway, and you'll have the release tag as a permanent record on your repository of "the software as accepted by JOSS". So I think having everything in one branch, with the paper in .Rbuildignore, makes the most sense.
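As an aside, `.Rbuildignore` entries are Perl-compatible regular expressions matched against file paths relative to the package root, so keeping the JOSS paper out of the built package might look something like the following (the file and directory names below are illustrative assumptions, not the actual paths in SSN2):

```
^joss$
^paper\.md$
^paper\.bib$
```

Note that the patterns are anchored with `^` and `$` so they match whole paths rather than substrings; `usethis::use_build_ignore()` can add correctly escaped entries for you.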
@mikemahoney218 Sounds like a plan, and I appreciate the advice! I will start on these changes once I have the initial round of reviews back from everyone.
Review checklist for @k-doering-NOAA
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://github.com/USEPA/SSN2?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@michaeldumelle) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@mikemahoney218 I wanted to report one potential COI: Jay Ver Hoef and I are both employed by NOAA Fisheries, although at different offices (I'm at the Office of Science and Technology, while Jay is at the Alaska Fisheries Science Center). To my knowledge, we've never collaborated or even met.
Is it ok to continue my review?
I think so! The JOSS documentation says that we can waive a COI if:
> you and a submitter are both employed by the same very large organization but in different units without any knowledge of each other.
This sounds like a clear-cut case of that to me! If you're still able to make an impartial assessment of the work, then I think it makes total sense for you to continue your review.
@mikemahoney218 thanks - I thought it could be waived as well upon seeing that line in the JOSS documentation, but wanted to report just in case. I am able to make an impartial assessment of the work. I'll continue reviewing, then, thanks for confirming!
Fantastic! Thanks for reporting 😄
@michaeldumelle et al., fantastic work! A nicely written article and R package. I just completed the first pass through my checklist.
I did feel there were two small pieces of documentation missing, so I postponed checking 2 of the checkboxes for now. Once the missing documentation has been added, I can check them off.
I added edits to the paper and one suggestion, but these are non-blocking. I also made a few comments on issues posted by @fawda123, in case they can be of some help.
@k-doering-NOAA thanks so much for the kind words and the helpful feedback! I really appreciate your hard work on this and will start incorporating your suggestions once I get the final review back. I will reach out if I have any clarifying questions.