[REVIEW]: Practical machine learning with PyTorch

Open editorialbot opened this issue 1 year ago • 19 comments

Submitting author: @jatkinson1000 (Jack Atkinson)
Repository: https://github.com/Cambridge-ICCS/ml-training-material
Branch with paper.md (empty if default branch): JOSE
Version: v1.0
Editor: @nicoguaro
Reviewers: @mnarayan, @dortiz5
Archive: Pending
Paper kind: learning module

Status

Status badge code:

HTML: <a href="https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592"><img src="https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592/status.svg"></a>
Markdown: [![status](https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592/status.svg)](https://jose.theoj.org/papers/fa9320a02f05c17eafa19d0204a51592)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@mnarayan & @manubastidas, your review will be checklist-based. Each of you will have a separate checklist that you should update as you carry out your review. First of all, you need to run this command in a separate comment to create your checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://openjournals.readthedocs.io/en/jose/reviewer_guidelines.html. If you have any questions or concerns, please let @nicoguaro know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Checklists

📝 Checklist for @mnarayan

📝 Checklist for @dortiz5

editorialbot avatar Mar 03 '24 21:03 editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

editorialbot avatar Mar 03 '24 21:03 editorialbot

Software report:

github.com/AlDanial/cloc v 1.88  T=0.12 s (277.4 files/s, 275475.6 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
SVG                              3              3              3          17506
CSS                              3           2487              9           5383
Jupyter Notebook                 8              0           4289           1282
Markdown                         5            137              0            512
HTML                             1             53              0            246
TeX                              2             23              0            224
Python                           3             65             98             94
Lua                              1              9              2             73
YAML                             4             13              9             70
JavaScript                       1             15             20             40
Sass                             1             13             11             37
TOML                             1              5              4             37
-------------------------------------------------------------------------------
SUM:                            33           2823           4445          25504
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

editorialbot avatar Mar 03 '24 21:03 editorialbot

Wordcount for paper.md is 2374

editorialbot avatar Mar 03 '24 21:03 editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1103/RevModPhys.91.045002 is OK
- 10.5281/zenodo.3960218 is OK
- 10.1098/rsta.2020.0093 is OK
- 10.1007/978-3-030-69128-8_12 is OK
- 10.5281/zenodo.5960048 is OK

MISSING DOIs

- None

INVALID DOIs

- None

editorialbot avatar Mar 03 '24 21:03 editorialbot

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

editorialbot avatar Mar 03 '24 21:03 editorialbot

@manubastidas, @mnarayan, this is the space where the review process takes place. There is a checklist for each of you; tick the boxes once you are satisfied that a criterion is met. You can generate your checklist with

@editorialbot generate my checklist

I will be here to answer the questions that you might have.

Let us use the first week of April as a tentative timeframe; is that OK for you?

nicoguaro avatar Mar 03 '24 21:03 nicoguaro

Review checklist for @mnarayan

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source for this learning module available at https://github.com/Cambridge-ICCS/ml-training-material?
  • [ ] License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • [ ] Version: Does the release version given match the repository release?
  • [ ] Authorship: Has the submitting author (@jatkinson1000) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • [ ] A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • [ ] Installation instructions: Is there a clearly stated list of dependencies?
  • [ ] Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • [ ] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • [ ] Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • [ ] Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • [ ] Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • [ ] Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • [ ] Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • [ ] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [ ] A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • [ ] Description: Does the paper describe the learning materials and sequence?
  • [ ] Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • [ ] Could someone else teach with this module, given the right expertise?
  • [ ] Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • [ ] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

mnarayan avatar Mar 05 '24 17:03 mnarayan

Hi All, and thanks for volunteering to edit/review. Please do let me know if you have any questions or if I can help at all.

We have just delivered another workshop using this material this week.

jatkinson1000 avatar Mar 26 '24 13:03 jatkinson1000

Hello @manubastidas and @mnarayan, is there any progress on this review? If we can help you with anything, please let us know.

nicoguaro avatar Apr 16 '24 16:04 nicoguaro

It seems that @manubastidas is not able to continue the review with us for personal reasons. Thank you for your willingness to help, and I hope we can count on you for future opportunities.

@dortiz5 has accepted to help us with the review.

nicoguaro avatar Apr 18 '24 18:04 nicoguaro

@editorialbot remove @manubastidas from reviewers

nicoguaro avatar Apr 18 '24 18:04 nicoguaro

@manubastidas removed from the reviewers list!

editorialbot avatar Apr 18 '24 18:04 editorialbot

@editorialbot add @dortiz5 to reviewers

nicoguaro avatar Apr 18 '24 18:04 nicoguaro

@dortiz5 added to the reviewers list!

editorialbot avatar Apr 18 '24 18:04 editorialbot

@dortiz5, this is the space where the review process takes place. There is a checklist for each reviewer; tick the boxes once you are satisfied that a criterion is met. You can generate your checklist with

@editorialbot generate my checklist

I will be here to answer the questions that you might have.

Let us use the second week of May as a tentative timeframe; is that OK for you?

nicoguaro avatar Apr 18 '24 19:04 nicoguaro

Review checklist for @dortiz5

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source for this learning module available at https://github.com/Cambridge-ICCS/ml-training-material?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • [x] Version: Does the release version given match the repository release?
  • [x] Authorship: Has the submitting author (@jatkinson1000) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • [x] A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • [x] Installation instructions: Is there a clearly stated list of dependencies?
  • [x] Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • [x] Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • [x] Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • [x] Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • [x] Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • [x] Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • [x] Description: Does the paper describe the learning materials and sequence?
  • [x] Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • [x] Could someone else teach with this module, given the right expertise?
  • [x] Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

dortiz5 avatar May 06 '24 16:05 dortiz5

Thanks for looking at this, @dortiz5!

I have just opened a Pull Request to add contribution guidelines here.

Once that has been reviewed, I'll merge it to main, rebase the paper branch off main, and let you know.

jatkinson1000 avatar May 06 '24 19:05 jatkinson1000

@jatkinson1000, thank you for the repository.

It is nicely written, well ordered, and tells a clear story, with clear instructions for the preparation and prerequisites needed to run the exercises. The local installation was also easy: I followed the four simple steps you described, and it works correctly.

From a pedagogical standpoint, the learning objectives are explicitly written and aligned with the content of the slides and the exercises. I liked that you offered different options for running the exercises and provided their solutions. The information is also concise and easy to follow. Finally, I found that the JOSE paper meets the checklist criteria.

dortiz5 avatar May 06 '24 21:05 dortiz5

Hi @dortiz5, there should now be contribution guidelines on both the main branch and the paper branch.

I'll look at sorting out a version and let you know once that's done.

@mnarayan, @nicoguaro, please do let me know if there is anything I can do to help you. Thanks all!

jatkinson1000 avatar May 09 '24 17:05 jatkinson1000

@dortiz5, thanks for the review. Would you recommend the work for publication?

@mnarayan, can you still provide a review for this submission? Please let us know if we can do something to help you move forward.

nicoguaro avatar May 11 '24 15:05 nicoguaro

Yes! I recommend the work for publication.

dortiz5 avatar May 13 '24 15:05 dortiz5

@dortiz5 Thank you so much for your time and kind words.

@nicoguaro and @mnarayan, please let me know if there is anything at all I can do to get this over the finish line and accepted, especially after @dortiz5's positive review. We are delivering this course again in July, and it would be great to be able to feature/link to JOSE at the workshop.

jatkinson1000 avatar May 17 '24 10:05 jatkinson1000

@editorialbot add @pdpino to reviewers

nicoguaro avatar May 19 '24 03:05 nicoguaro

@pdpino added to the reviewers list!

editorialbot avatar May 19 '24 03:05 editorialbot

@pdpino, this is the space where the review process takes place. There is a checklist for each reviewer; tick the boxes once you are satisfied that a criterion is met. You can generate your checklist with

@editorialbot generate my checklist

I will be here to answer the questions that you might have.

Let us use the end of May as a tentative timeframe; is that OK for you?

nicoguaro avatar May 19 '24 03:05 nicoguaro

@pdpino, thank you so much for agreeing to review our work. Please do let us know if you have any questions, or if there is anything we can do to help you here. :)

jatkinson1000 avatar May 29 '24 16:05 jatkinson1000

Review checklist for @pdpino

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source for this learning module available at https://github.com/Cambridge-ICCS/ml-training-material?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • [x] Version: Does the release version given match the repository release?
  • [x] Authorship: Has the submitting author (@jatkinson1000) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • [x] A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • [x] Installation instructions: Is there a clearly stated list of dependencies?
  • [x] Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • [x] Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • [x] Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • [x] Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • [x] Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • [x] Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • [x] Description: Does the paper describe the learning materials and sequence?
  • [x] Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • [x] Could someone else teach with this module, given the right expertise?
  • [x] Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

pdpino avatar May 29 '24 20:05 pdpino

Hello everyone, sorry for the late reply; I finally found some time to check this out.

Overall it looks really good! I think the slides and exercises are very pedagogical. I left some minor suggestions and changes (https://github.com/Cambridge-ICCS/ml-training-material/issues/53, https://github.com/Cambridge-ICCS/ml-training-material/pull/54, https://github.com/Cambridge-ICCS/ml-training-material/pull/55, https://github.com/Cambridge-ICCS/ml-training-material/issues/56), but these are non-blocking, and I recommend the work for publication.

@nicoguaro, what does the point about Version mean in this case? There are no releases in this repository, so I'd say it doesn't apply here. @jatkinson1000, though, I might suggest you version your releases in case you change this course in the future.

pdpino avatar May 29 '24 22:05 pdpino

Thanks @pdpino for your kind words.

I have accepted your contributions, and opened a couple of PRs for the issues you raised.

Once these are merged, I will apply a version number and release tag ready for publication!

jatkinson1000 avatar May 30 '24 08:05 jatkinson1000

@pdpino, thanks for your review.

@jatkinson1000, please let me know when you have the release version and have addressed the main issues so we can move forward.

nicoguaro avatar May 30 '24 13:05 nicoguaro