
[REVIEW]: ndbc-api: Accelerating oceanography and climate science research with Python

editorialbot opened this issue 1 year ago • 8 comments

Submitting author: @CDJellen (Chris Jellen)
Repository: http://github.com/cdjellen/ndbc-api
Branch with paper.md (empty if default branch): user/cjellen/joss-paper-submission
Version: v2024.08.31.1
Editor: @cheginit
Reviewers: @rwegener2, @ks905383
Archive: Pending

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/7650fcdcf5309f37067b9f271f12e438"><img src="https://joss.theoj.org/papers/7650fcdcf5309f37067b9f271f12e438/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/7650fcdcf5309f37067b9f271f12e438/status.svg)](https://joss.theoj.org/papers/7650fcdcf5309f37067b9f271f12e438)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@rwegener2 & @ks905383, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review. First, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @cheginit know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Checklists

📝 Checklist for @ks905383

editorialbot avatar Oct 24 '24 16:10 editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

editorialbot avatar Oct 24 '24 16:10 editorialbot

Software report:

github.com/AlDanial/cloc v 1.90  T=1.34 s (134.6 files/s, 685439.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
YAML                            39         374062             43         504544
Python                         135           1337            518           4983
Markdown                         2             69              0            214
Jupyter Notebook                 1              0          30859             61
TeX                              1              3              0             31
TOML                             1              3              0             27
INI                              1              0              0              4
-------------------------------------------------------------------------------
SUM:                           180         375474          31420         509864
-------------------------------------------------------------------------------

Commit count by author:

    95	CDJellen
    41	cdjellen
    16	Chris Jellen
     1	abdu558

editorialbot avatar Oct 24 '24 16:10 editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.25080/majora-92bf1922-00a is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: NDBC Web Data Guide
- No DOI given, and none found for title: NDBC Active Stations
- No DOI given, and none found for title: NetCDF4 Python Library

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

editorialbot avatar Oct 24 '24 16:10 editorialbot

Paper file info:

📄 Wordcount for paper.md is 600

✅ The paper includes a Statement of need section

editorialbot avatar Oct 24 '24 16:10 editorialbot

License info:

✅ License found: MIT License (Valid open source OSI approved license)

editorialbot avatar Oct 24 '24 16:10 editorialbot

👋🏼 @CDJellen, @rwegener2, and @ks905383, this is the review thread for the paper. All of our communications will happen here from now on.

As a reviewer, the first step, as mentioned in the first comment of this issue, is to create a checklist for your review by entering

@editorialbot generate my checklist

at the top of a new comment in this thread.

These checklists contain the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. The first comment in this thread also contains links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#7406 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them, instead of waiting until you've reviewed the entire package.

We aim for reviews to be completed within about 2-4 weeks. Please notify me if any of you require some more time. We can also use EditorialBot (our bot) to set automatic reminders if you know you'll be away for a known period of time.

Please don't hesitate to ping me (@cheginit) if you have any questions/concerns.

cheginit avatar Oct 24 '24 16:10 cheginit

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot avatar Oct 24 '24 16:10 editorialbot

Review checklist for @ks905383

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at http://github.com/cdjellen/ndbc-api?
  • [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@CDJellen) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • [x] Human and animal research: If the paper contains original research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

ks905383 avatar Oct 24 '24 16:10 ks905383

Review checklist for @rwegener2

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at http://github.com/cdjellen/ndbc-api?
  • [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@CDJellen) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • [x] Human and animal research: If the paper contains original research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1. Contribute to the software 2. Report issues or problems with the software 3. Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

rwegener2 avatar Oct 27 '24 11:10 rwegener2

👋🏼 @rwegener2, @ks905383 a friendly reminder for this review.

cheginit avatar Nov 11 '24 15:11 cheginit

Thanks for the reminder - will look at this this week.

ks905383 avatar Nov 12 '24 18:11 ks905383

[Screenshot: sample output showing two rows with the same timestamp and station but different WDIR, WSPD, and GST values]

@CDJellen Could you clarify something on the modes? The docs say that the different modes '[correspond] to the data formats provided by the NDBC data service'. I'm not quite sure how that maps onto the data. I see that some of the modes carry some of the same variables, but that the same timestamp + station can give different values for those variables (like WDIR, WSPD, and GST in the example above). Would a user more familiar with this dataset know the difference between those two rows, or should there be a flag / column (/ df index) specifying which data format each row came from? (apologies, I haven't worked with this specific dataset before)

Thanks!

ks905383 avatar Nov 13 '24 21:11 ks905383

Thank you so much for your thorough review @ks905383; the "modes" that the API supports map directly to the data modalities outlined in the web data guide. While I believe most users will have familiarity with these modes and formats, you raise an excellent point with respect to columns that are included in multiple formats. In cases where a user requests more than one mode through the get_data method, including the modality as a prefix or suffix seems appropriate. I will make this change over the next few days.
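For illustration, the prefix/suffix disambiguation described could look roughly like the following sketch; the mode names, column names, and values here are illustrative and not necessarily ndbc-api's actual schema:

```python
import pandas as pd

# Two hypothetical mode frames that share column names (e.g. two modes
# both reporting wind measurements for the same timestamp and station).
stdmet = pd.DataFrame({"WSPD": [5.1], "WDIR": [120]},
                      index=pd.to_datetime(["2024-01-01 00:00"]))
cwind = pd.DataFrame({"WSPD": [5.3], "WDIR": [118]},
                     index=pd.to_datetime(["2024-01-01 00:00"]))

# Suffix each column with its mode before joining, so the origin of
# every value is explicit in the combined frame.
combined = stdmet.add_suffix("_stdmet").join(cwind.add_suffix("_cwind"))
print(list(combined.columns))
# ['WSPD_stdmet', 'WDIR_stdmet', 'WSPD_cwind', 'WDIR_cwind']
```

With this layout a single timestamp + station yields one row, and the mode each measurement came from is carried in the column name rather than being ambiguous across duplicate rows.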

I also very much appreciate the issues you opened in the ndbc-api repository; the suggestions and samples were excellent.

CDJellen avatar Nov 18 '24 22:11 CDJellen

Great! I only have two minor comments left (listed below); otherwise I recommend acceptance. Thanks for the work - it's always great to improve access to the often very janky online datastores of climate datasets....

  1. Could you add conda/mamba install instructions to the README? Since it's on conda-forge anyways, might as well advertise it.
  2. Would you consider adding Python 3.13 support? I don't think much has changed that would break dependencies or tests (it would be easy to check by running the test suite on 3.13 as well), but fresh new environments are installing that Python version, and if ndbc-api isn't specified in the initial environment, it'll install an older version instead (presumably the last one that didn't explicitly specify Python requirements).

ks905383 avatar Nov 19 '24 15:11 ks905383

Thank you @ks905383 ! I've updated the README with conda instructions; great point there as this was a relatively recent change.

With respect to Python 3.13 support, I've spent some time investigating this, but it seems some dependencies (at least as reported through poetry) lack support for a wide enough version set. That said, I've updated CI to run tests against Python 3.12 and updated the package dependencies for 3.12 compatibility.

I will revisit support for 3.13 in a few weeks once the package dependencies have more recent updates.

Once again, thank you for taking the time to review and offer feedback on the package. Your suggestions were excellent; the migration to xarray is especially useful given how much cleaner that API is compared to netCDF4. Have an excellent rest of your day!

CDJellen avatar Nov 19 '24 22:11 CDJellen

Great, @cheginit that concludes my review, I recommend acceptance.

ks905383 avatar Nov 20 '24 18:11 ks905383

@ks905383 Thanks for your time and efforts in reviewing the submission and providing constructive comments, appreciate it!

cheginit avatar Nov 20 '24 19:11 cheginit

@cheginit thanks for the reminder and sorry for the delay. I'll wrap this up by end of day tomorrow.

rwegener2 avatar Nov 21 '24 12:11 rwegener2

Hi @CDJellen, this is a great package you've built! Python-based access to NDBC data will help lots of folks use their data more easily.

I have comments on two aspects of this project so far:

Tests

  • When I run the tests I get 21 passed, 99 skipped, and 1 warning. The only two test files that ran were tests/config/test_config.py and tests/test_ndbc_api.py. Is there a reason most of the tests are being skipped, or am I making a mistake in how I'm running them?

Paper

Well written paper! A few points of feedback:

  • Line 10: The phrase "file-based access methods" confused me a bit, but made more sense once I looked at your code. To make this clearer from the perspective of someone only reading the paper I suggest either 1) framing the challenge with the "file-based methods" as the fact that NDBC only provides an API but no Python wrapper and/or that the ASCII text files are not easy to parse, or 2) specifying in the next sentence that this package provides a user-friendly Python API that provides direct access to data as Python data structures.
  • Line 26: You're addressing an important problem here but I think it gets glazed over in "the mode of access adds cost and complexity to their workflows". I'd either clarify how the current system adds cost (or remove that part) and also specify how it adds complexity. Personally, I would also note that there isn't a Python interface, only an API (as I understand it), downloading otherwise happens in a GUI, and that it is difficult to parse the text files.
  • Line 29: typo "these critical gap"
  • Line 33: I struggle with the word "modalities" as it is quite vague. I suggest either 1) after the word modalities include examples or a definition (maybe: "... stations, data modalities (ex. wind speed, ocean temperature, etc.), ..." or 2) switching to another word/phrase such as 'data variables' or 'measurements'
  • Line 34: with the suggestion from the previous reviewer, should an xarray Dataset also be part of this list of objects?
  • Line 34: The pandas team requests that you include the zenodo repository in addition to the McKinney article when citing pandas https://pandas.pydata.org/about/citing.html
  • Line 47: typo, missing 'and' in "... their teams, their network ..."

rwegener2 avatar Nov 21 '24 18:11 rwegener2

Thank you so much for the feedback @rwegener2 ; I will work to address your comments on the paper over the next two days and update this PR once those changes are ready.

In terms of test coverage, some tests are expensive to run or exercise private methods. To get a good sense of the CI coverage, tests can be run with pytest using the --run-slow and --run-private flags; these gate the vast majority of the tests. Tests must pass in order for a PR to merge to main.
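For readers unfamiliar with this gating pattern, a minimal conftest.py implementing such flags typically looks like the sketch below. This is the common pytest idiom, not necessarily how ndbc-api's actual conftest.py is written:

```python
# conftest.py — gate expensive/private tests behind opt-in CLI flags.
import pytest


def pytest_addoption(parser):
    # Register the opt-in flags; both default to off.
    parser.addoption("--run-slow", action="store_true", default=False,
                     help="run tests marked as slow")
    parser.addoption("--run-private", action="store_true", default=False,
                     help="run tests that exercise private methods")


def pytest_collection_modifyitems(config, items):
    # Attach a skip marker to any gated test whose flag was not passed.
    skip_slow = pytest.mark.skip(reason="need --run-slow to run")
    skip_private = pytest.mark.skip(reason="need --run-private to run")
    for item in items:
        if "slow" in item.keywords and not config.getoption("--run-slow"):
            item.add_marker(skip_slow)
        if "private" in item.keywords and not config.getoption("--run-private"):
            item.add_marker(skip_private)
```

With something like this in place, `pytest --run-slow --run-private` runs the full suite, while a bare `pytest` runs only the fast public tests, which matches the passed/skipped counts described above.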

I very much appreciate your time and assistance in improving the paper; have an excellent rest of your day.

CDJellen avatar Nov 22 '24 19:11 CDJellen

Thanks for the clarification about tests @CDJellen.

I have left all my comments from the review in the PR/issue above. The software works really nicely, the comments are mostly about bumps I had as I was learning to use the software from the docs and example notebook. Feel free to ping me here or directly in the issues if anything is unclear!

rwegener2 avatar Nov 23 '24 01:11 rwegener2

> [Quoted the paper feedback items from the review comment above.]

Thank you for these comments; I agree with each of them and have made the suggested changes. One more change to call out is that between submitting the paper for review and today, the package migrated from the netCDF4 package to using xarray for opendap/netcdf data retrieval. I've adjusted the paper content and references to highlight this.
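As a small illustration of why the xarray API is convenient here (the variable names and values below are made up, not real NDBC data): an xarray Dataset keeps labeled dimensions and converts to a pandas DataFrame in one call, whereas netCDF4 exposes raw variables and leaves the assembly to the caller.

```python
import pandas as pd
import xarray as xr

# A tiny stand-in for an NDBC netCDF record; names and values are illustrative.
times = pd.date_range("2024-01-01", periods=3, freq="h")
ds = xr.Dataset(
    {"wind_spd": ("time", [5.1, 6.2, 4.8]), "wave_ht": ("time", [1.2, 1.4, 1.1])},
    coords={"time": times},
)

# Labeled conversion to pandas, preserving the time index and variable names.
df = ds.to_dataframe()
print(df.columns.tolist())  # ['wind_spd', 'wave_ht']
```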

I very much appreciate the feedback and review!

These changes were pushed to the user/cjellen/joss-paper-submission branch.

CDJellen avatar Nov 23 '24 03:11 CDJellen

Updates look good @CDJellen! The paper is really nice, the contributing.md is a solid addition, and I think the clarifications in the example notebook and readme will really help your users.

This package looks good to me. @cheginit I recommend acceptance.

rwegener2 avatar Nov 26 '24 15:11 rwegener2

@rwegener2 Thank you for your time and effort in reviewing this submission and providing constructive comments, appreciate it!

cheginit avatar Dec 02 '24 14:12 cheginit

Post-Review Checklist for Editor and Authors

Additional Author Tasks After Review is Complete

  • Double check authors and affiliations (including ORCIDs)
  • Make a release of the software with the latest changes from the review and post the version number here. This is the version that will be used in the JOSS paper.
  • Archive the release on Zenodo/figshare/etc and post the DOI here.
  • Make sure that the title and author list (including ORCIDs) in the archive match those in the JOSS paper.
  • Make sure that the license listed for the archive is the same as the software license.

Editor Tasks Prior to Acceptance

  • [ ] Read the text of the paper and offer comments/corrections (as either a list or a pull request)
  • [ ] Check that the archive title, author list, version tag, and the license are correct
  • [ ] Set archive DOI with @editorialbot set <DOI here> as archive
  • [ ] Set version with @editorialbot set <version here> as version
  • [ ] Double check rendering of paper with @editorialbot generate pdf
  • [ ] Specifically check the references with @editorialbot check references and ask author(s) to update as needed
  • [ ] Recommend acceptance with @editorialbot recommend-accept

cheginit avatar Dec 02 '24 14:12 cheginit

@editorialbot generate pdf

cheginit avatar Dec 02 '24 14:12 cheginit

@editorialbot check references

cheginit avatar Dec 02 '24 14:12 cheginit

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.25080/majora-92bf1922-00a is OK
- 10.5281/zenodo.3509134 is OK
- 10.5334/jors.148 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: NDBC Web Data Guide
- No DOI given, and none found for title: NDBC Active Stations

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

editorialbot avatar Dec 02 '24 14:12 editorialbot

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot avatar Dec 02 '24 14:12 editorialbot

@CDJellen Thanks for working with the reviewers and addressing their comments. Before I hand it over to the EiC for final publication, please address the following:

Paper:

  • [x] Please change "meterological" to "meteorological". Note that there are two instances.
  • [ ] There are several instances of the word "Python" that are not capitalized, e.g., Line 9. There is also one in the references.
  • [x] L10-14: This is a long sentence, please break it down to two.
  • [ ] In the references section, the Pandas item's title should be pandas-dev/pandas - Pandas and its author should be The pandas development team (you need to add additional curly braces to this item in the bib file).

Regarding the branch, I noticed that there are two JOSS-related branches. Once you have addressed my comments, please make sure to release a new version with all the commits that you want included in your JOSS publication. Then provide me with its version number and an archive DOI (e.g., on Zenodo).

cheginit avatar Dec 02 '24 14:12 cheginit