
[REVIEW]: floodlight - A high-level, data-driven sports analytics framework

editorialbot opened this issue 1 year ago · 23 comments

Submitting author: @draabe (Dominik Raabe)
Repository: https://github.com/floodlight-sports/floodlight
Branch with paper.md (empty if default branch): paper
Version: 0.3.2
Editor: @crvernon
Reviewers: @gagolews, @kanishkan91
Archive: Pending

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/4316da64910988e4d7b148fe3ed5db96"><img src="https://joss.theoj.org/papers/4316da64910988e4d7b148fe3ed5db96/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/4316da64910988e4d7b148fe3ed5db96/status.svg)](https://joss.theoj.org/papers/4316da64910988e4d7b148fe3ed5db96)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@gagolews, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review. First of all, you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @crvernon know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @gagolews

📝 Checklist for @kanishkan91

editorialbot avatar Jul 18 '22 16:07 editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

editorialbot avatar Jul 18 '22 16:07 editorialbot

Software report:

github.com/AlDanial/cloc v 1.88  T=0.12 s (894.2 files/s, 125603.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          49           1828           3627           6475
reStructuredText                41            616            696            589
Markdown                         8            160              0            488
TeX                              1             20              0            268
YAML                             5             21             10            122
TOML                             1              9              1             74
make                             1              4              7              9
CSS                              1              0              1              4
-------------------------------------------------------------------------------
SUM:                           107           2658           4342           8029
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

editorialbot avatar Jul 18 '22 16:07 editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1145/3475722.3482792 is OK
- 10.2165/00007256-200838030-00005 is OK
- 10.1080/02640410903503640 is OK
- 10.48550/ARXIV.1309.0238 is OK
- 10.1007/s40279-014-0144-3 is OK
- 10.1055/a-0592-7660 is OK
- 10.1080/17461391.2020.1747552 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1177/1747954119879350 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1080/02640414.2012.746720 is OK
- 10.25080/Majora-92bf1922-00a is OK
- 10.4324/9781351210164 is OK
- 10.4324/9781003160953 is OK
- 10.1007/s41060-017-0093-7 is OK
- 10.1073/pnas.88.6.2297 is OK
- 10.1186/s40064-016-3108-2 is OK
- 10.1055/s-0031-1301320 is OK
- 10.3390/data2010002 is OK
- 10.1038/s41592-019-0686-2 is OK

MISSING DOIs

- None

INVALID DOIs

- None

editorialbot avatar Jul 18 '22 16:07 editorialbot

Wordcount for paper.md is 1411

editorialbot avatar Jul 18 '22 16:07 editorialbot

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot avatar Jul 18 '22 16:07 editorialbot

👋 @draabe , @gagolews, and @kanishkan91 This is the review thread for the paper. All of our communications will happen here from now on.

Please read the "Reviewer instructions & questions" in the first comment above.

Both reviewers have checklists at the top of this thread (in that first comment) with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/4588 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for the review process to be completed within about 4-6 weeks, but please make a start well ahead of this: JOSS reviews are by their nature iterative, and any early feedback you may be able to provide to the author will be very helpful in meeting this schedule.

crvernon avatar Jul 18 '22 16:07 crvernon

@editorialbot add @kanishkan91 as reviewer

crvernon avatar Jul 18 '22 16:07 crvernon

@kanishkan91 added to the reviewers list!

editorialbot avatar Jul 18 '22 16:07 editorialbot

Review checklist for @gagolews

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at https://github.com/floodlight-sports/floodlight?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@draabe) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

gagolews avatar Jul 24 '22 06:07 gagolews

@draabe: Further remarks:

  1. The package description in the article is quite abstract; how about providing some more details, e.g., listing all the methods/algorithms implemented? I see that such a list is featured in the README file, so at least this fact should be mentioned in the article.

  2. For the example code on p.2, please include the results generated.

  3. Check all citations: in many places, Surname (2022) should read (Surname, 2022).

  4. Figure Figure 1 -> Figure 1

  5. line 102: scientistS

  6. That you have a few nice tutorials at https://floodlight.readthedocs.io/ should be mentioned explicitly in the paper. This should also be made more explicit in the README file.

Apart from the above, I recommend the paper be accepted for publication in JOSS.

gagolews avatar Jul 24 '22 06:07 gagolews

@editorialbot generate pdf

draabe avatar Aug 02 '22 13:08 draabe

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot avatar Aug 02 '22 13:08 editorialbot

@gagolews thanks a lot for the positive feedback and helpful remarks! We've updated the article addressing your points:

  1. Yes, totally makes sense to have a more comprehensive description of features in the article. We've added a (grouped) list of all features preceding the more narrative discussion of functionality. This way, there's both a quick summary and a more thorough explanation of the features included. We also created a new "Features" section for this, which we feel is a bit clearer, but that could be undone if it conflicts with the JOSS format.

  2. As the code produces an object containing an array of shape (9000, 16), we've added a short explanation and a code snippet that prints the most interesting part of the array and shows the result (see the generic sketch after this list for an illustration of such slicing).

  3. Fixed this; we had used the wrong separator to list multiple references.

  4. Fixed.

  5. Fixed.

  6. Agreed, we've included that in the feature list of the article (see 1). We also updated our README and introduced a dedicated "Quick Demo" section that shows the code snippet from the article and explicitly refers to the tutorials (see develop branch).
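
For readers of this thread, here is a minimal, generic sketch of what slicing and printing part of such an array could look like, assuming a plain NumPy array of shape (9000, 16) with alternating x/y columns; this is illustrative only and not the snippet from the paper:

```python
# Generic illustration only (not the snippet from the paper): an array of
# shape (9000, 16) holds 9000 frames with alternating x/y columns for 8
# tracked players; printing a small slice keeps the output readable.
import numpy as np

rng = np.random.default_rng(0)
positions = rng.uniform(0, 105, size=(9000, 16))  # placeholder tracking data

print(positions[:5, :4])  # first five frames, first two players (x/y pairs)
```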

draabe avatar Aug 02 '22 13:08 draabe

I am happy with the revised version and now recommend the paper be accepted for publication.

gagolews avatar Aug 03 '22 00:08 gagolews

@kanishkan91 will you give me an update to your timeline for completing your portion of the review? Thanks!

crvernon avatar Aug 04 '22 16:08 crvernon

@crvernon my apologies. I had to meet some other deadlines. Will complete this weekend.

kanishkan91 avatar Aug 04 '22 20:08 kanishkan91

No problem at all @kanishkan91 . Thank you!

crvernon avatar Aug 04 '22 21:08 crvernon

@draabe @crvernon I am mostly done with my review. This is clearly a very useful package that does an immense amount of data wrangling. I also learnt a lot about sports data analytics through this review, mainly thanks to the exhaustive documentation. However, there are a few points that I would like the authors to address before I approve this. I have added issues (linked above) that the authors can look at and link to relevant pull requests. A summary of the main issues follows:

  1. Installation instructions and workflow - I see that the authors have used poetry here to take care of environment issues. However, they recommend installation through pip install. I ran into an issue here where the installation defaulted to an older version of this package since my Python version was older than required. It took me a while to track down the cause of this issue. I recommend that the authors remove all references to pip install and instead just ask users to install this using poetry, namely poetry install. Secondly, the authors must explicitly state in the documentation that poetry is a replacement for setup.py. Otherwise, users will again be confused. An alternative would be to add setup.py in addition to poetry, but that may become a pain to maintain. So, I would recommend the documentation route.

  2. Documentation branch - I see that the documentation generated through Sphinx is being hosted on the main branch. This will be a problem and is not sustainable. I would suggest hosting the documentation on a separate branch (see my issue on this for details). The idea is that if users want to edit documentation, they should not edit the code in the main branch. This is standard practice as far as I know, and I would recommend it be implemented this way.

  3. Other issues - These are documented in relevant issues, so authors can look at them and respond accordingly.

Once these are addressed, I would be happy to approve. Once again, thank you for letting me review this, and let me know if you have questions.

kanishkan91 avatar Aug 06 '22 21:08 kanishkan91

Review checklist for @kanishkan91

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at https://github.com/floodlight-sports/floodlight?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@draabe) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

kanishkan91 avatar Aug 08 '22 15:08 kanishkan91

@draabe could you provide a status update whenever you have a minute? Thanks!

crvernon avatar Aug 15 '22 12:08 crvernon

@crvernon, I'm currently on vacation (and so are some of the contributors), but I think I can find some time by the end of August to address the points raised. Would that work?

@kanishkan91 thanks a lot for the comments! We will address each point in the issues. However, I have one major point I would like to clarify regarding the difficulties you've had installing the package. Can you provide me with more details on the issues you ran into, e.g., which Python version you were using and what the exact problems were?

In general, we rely on poetry for all dependency management during development. I'd argue that this is a fairly standard approach nowadays, which uses the PEP-approved pyproject.toml file and resolves some of the problems caused by setup.py. This has helped us create the same dev environment for all contributors cross-platform and has avoided any dependency issues. On the other hand, we deploy the built package on PyPI so that users can easily download and install the package with pip, also arguably a fairly standard approach (next to conda).

Summing up, you can either use pip install floodlight as an end-user to install (a specific version of) the package as documented in README.md, or, as a contributor, use the poetry install route to create a virtual environment on your machine that sets you up for development (as highlighted in CONTRIBUTING.md and the extended contributing guide). We're happy to add more documentation on the usage of pip or poetry (and the redundancy of setup.py)! It would be helpful for us, however, to get a more precise indication of where you ran into issues and where you think this documentation is currently lacking or would be most appropriate.

That said, we'd feel a bit uncomfortable completely deprecating installation via pip, as this would remove our "standard" way of deployment. It would force all users - whether they want to contribute or not - to manually install the package from source (which is much more difficult and could lead to all sorts of other problems). It would also prevent any permanent hosting on PyPI.

As this was your no. 1 point and you've mentioned this in multiple issues, it would be helpful for us to clarify this before addressing the resulting issues one by one!

draabe avatar Aug 17 '22 10:08 draabe

No problem @draabe . Thank you for the update!

crvernon avatar Aug 17 '22 12:08 crvernon

@draabe Thanks very much for the explanation above and apologies for the confusion.

The problem I ran into is that my Python version (3.5) was lower than the one required. Therefore, when I did pip install, the installation defaulted to a much older version of floodlight. I could not access many of the functions described in the documentation as a result. It also took me a while to figure out what was happening; I only found it when checking the installed packages. Hence the multiple issues, etc. I also recommended removing the pip install instructions, since it seems like poetry install was added specifically to address issues such as these. But I see what you mean about retaining the versions on PyPI.
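
For reference, here is a small diagnostic sketch (not from the floodlight docs; assumes Python 3.8+ for importlib.metadata) showing which floodlight release pip actually resolved, which would have surfaced the silent fallback:

```python
# Minimal diagnostic sketch (assumes Python 3.8+ for importlib.metadata):
# print the interpreter version and the floodlight release pip resolved,
# to spot a silent fallback to an older release.
import sys
from importlib.metadata import version

print(f"Python {sys.version_info.major}.{sys.version_info.minor}")
print(f"floodlight {version('floodlight')}")  # e.g. 0.2.1 instead of a documented 0.3.x
```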

I discussed this with the editor as well, and the recommended, easiest solution would be to make sure that pip install always defaults to the latest documented version of floodlight. This would ensure that the pip install and poetry install versions are consistent. Alternatively, deprecate the older versions on PyPI, or mention in the documentation that this could happen. Let me know what you think of this. Thanks again.

kanishkan91 avatar Aug 17 '22 13:08 kanishkan91

@kanishkan91 No problem, and thank you for the explanation, this makes a lot of sense now! For v0.2.0 and v0.2.1 of floodlight, we had the (lower) requirement for the Python version at >=3.7.1. As some dependencies did not like that, we updated that requirement to Python >=3.8 starting with the release of v0.3.0. I guess that explains why the installer fell back to a previous version of floodlight on your install (although I'm not entirely sure why it did that, as your Python version 3.5 was never actually supported).

In any case, I do appreciate and like the recommended solutions. I've checked PyPI, and in fact, they have a special option of yanking releases for exactly this scenario, even mentioning our case in the corresponding PEP notes! They recommend yanking over deletion, as this allows users that (have to) run an older version of the package to still get support, while preventing installers from falling back to these versions unless explicitly directed. Even though we don't have many releases or users of floodlight yet, I still think it's better to follow best practices here and keep a clean and accessible version history.

Therefore, I've yanked versions 0.2.0 and 0.2.1 on PyPI. This should remove those older versions from the hierarchy in any new install and default to the latest available version. The only exception would be if a user explicitly includes, e.g., floodlight==0.2.0 as an installer directive. In that case, they can still get that version, but pip will additionally show the following warning: "This version still supports Python 3.7 or below, which was deprecated in later versions of floodlight. We recommend updating to a newer version of floodlight and the corresponding Python distribution."
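
To double-check the outcome, here's a small sketch (my own, using PyPI's public JSON API; the exact output depends on the releases available at the time) that lists which floodlight releases are currently marked as yanked:

```python
# Sketch using PyPI's public JSON API (network access required): list all
# floodlight releases and flag those with files marked as yanked,
# which should now include 0.2.0 and 0.2.1.
import json
from urllib.request import urlopen

with urlopen("https://pypi.org/pypi/floodlight/json") as resp:
    data = json.load(resp)

for release, files in sorted(data["releases"].items()):
    yanked = any(f.get("yanked", False) for f in files)
    print(f"{release:>8}  {'yanked' if yanked else 'available'}")
```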

What do you think?

draabe avatar Aug 20 '22 13:08 draabe

@crvernon @kanishkan91 I've addressed all the points raised in the respective issues of the project's issue board. Let me know if you have further remarks or if some points are not to your satisfaction!

draabe avatar Aug 20 '22 17:08 draabe

@draabe Thanks for addressing all of those! I'll start taking a look and closing all issues. I think most of your responses look good to me. I'll just wait for the merge into master (you mention the changes are currently on the develop branch?). Though on the documentation, I would like the editor to just glance at it before I approve; even though your response makes total sense, I would just like to run it by them as well.

@crvernon I think I am ready to approve for publication. But could you take a look at this issue here: https://github.com/floodlight-sports/floodlight/issues/87. I think the response from the author makes sense, but I just want to make sure you look at this as well.

Otherwise, this is a great software package, and I'm excited that it is close to being approved.

kanishkan91 avatar Aug 20 '22 21:08 kanishkan91

@kanishkan91 Thanks a lot for the feedback!

@crvernon All changes resulting from the suggestions by @kanishkan91 and @gagolews are on the develop branch. We usually use the git-flow branching model, staging changes to the package on develop and then releasing a version by merging a bunch of changes into the main branch. If everybody approves, I could take all changes, merge into main, and release v0.3.3. Would that work or is there another preferred way (as technically v0.3.2 is being reviewed)?

draabe avatar Aug 20 '22 21:08 draabe

Using git-flow is fine by me. I'll take a full look at this today now that @kanishkan91 and @gagolews have finished up. Thanks!

crvernon avatar Aug 22 '22 12:08 crvernon

@editorialbot check references

crvernon avatar Aug 22 '22 12:08 crvernon

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1145/3475722.3482792 is OK
- 10.2165/00007256-200838030-00005 is OK
- 10.1080/02640410903503640 is OK
- 10.48550/ARXIV.1309.0238 is OK
- 10.1007/s40279-014-0144-3 is OK
- 10.1055/a-0592-7660 is OK
- 10.1080/17461391.2020.1747552 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1177/1747954119879350 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1080/02640414.2012.746720 is OK
- 10.25080/Majora-92bf1922-00a is OK
- 10.4324/9781351210164 is OK
- 10.4324/9781003160953 is OK
- 10.1007/s41060-017-0093-7 is OK
- 10.1073/pnas.88.6.2297 is OK
- 10.1186/s40064-016-3108-2 is OK
- 10.1055/s-0031-1301320 is OK
- 10.3390/data2010002 is OK
- 10.1038/s41592-019-0686-2 is OK

MISSING DOIs

- None

INVALID DOIs

- None

editorialbot avatar Aug 22 '22 12:08 editorialbot