[REVIEW]: Mantik: A Workflow Platform for the Development of Artificial Intelligence on High-Performance Computing Infrastructures
Submitting author: @rico-berner (Rico Berner)
Repository: https://gitlab.com/mantik-ai/mantik
Branch with paper.md (empty if default branch):
Version: 0.4.2
Editor: @arfon
Reviewers: @zhaozhang, @gflofst, @acrlakshman
Archive: Pending
Status
Status badge code:
HTML: <a href="https://joss.theoj.org/papers/31e03b578e512eb5565baca1a21fe268"><img src="https://joss.theoj.org/papers/31e03b578e512eb5565baca1a21fe268/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/31e03b578e512eb5565baca1a21fe268/status.svg)](https://joss.theoj.org/papers/31e03b578e512eb5565baca1a21fe268)
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
Reviewer instructions & questions
@zhaozhang, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review. First of all you need to run this command in a separate comment to create the checklist:
@editorialbot generate my checklist
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @arfon know.
✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨
Checklists
Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.
For a list of things I can do to help you, just type:
@editorialbot commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@editorialbot generate pdf
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):
OK DOIs
- 10.5194/egusphere-egu21-9632 is OK
- 10.1145/3399579.3399867 is OK
- 10.1109/ACCESS.2023.3262138 is OK
MISSING DOIs
- None
INVALID DOIs
- None
Software report:
```
github.com/AlDanial/cloc v 1.88  T=0.19 s (1435.1 files/s, 113942.3 lines/s)
---------------------------------------------------------------------------
Language                      files      blank    comment       code
---------------------------------------------------------------------------
Python                          194       2451       1303      12424
Markdown                         47       1184          0       2909
YAML                              5         50         10        526
JSON                              9          5          0        219
TOML                              4         28          4        188
make                              2         18          4         71
Jupyter Notebook                  1          0        106         57
TeX                               1          2          0         32
HTML                              1          0          0         29
reStructuredText                  2         27         34         27
CSS                               1          1          0         24
SVG                               3          0          0          9
Bourne Shell                      2          0          1          7
Dockerfile                        1          1          0          2
Windows Module Definition         1          0          0          2
---------------------------------------------------------------------------
SUM:                            274       3767       1462      16526
---------------------------------------------------------------------------
```
gitinspector failed to run statistical information for the repository
Wordcount for paper.md is 1410
:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:
@zhaozhang — This is the review thread for the paper. All of our communications will happen here from now on.
Please read the "Reviewer instructions & questions" in the first comment above. Please create your checklist typing:
@editorialbot generate my checklist
As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/6136 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for the review process to be completed within about 4-6 weeks but please make a start well ahead of this as JOSS reviews are by their nature iterative and any early feedback you may be able to provide to the author will be very helpful in meeting this schedule.
@dghoshal-lbl @cc-a @ian-taylor @gflofst @thurber @acrlakshman — :wave: would you be willing to review this submission for JOSS? The submission under consideration is Mantik: A Workflow Platform for the Development of Artificial Intelligence on High-Performance Computing Infrastructures
The review process at JOSS is unique: it takes place in a GitHub issue, is open, and author-reviewer-editor conversations are encouraged. You can learn more about the process in these guidelines: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html
Based on your experience, and past submissions to JOSS, we think you might be able to provide a great review of this submission. Please let me know if you think you can help us out!
Many thanks Arfon
Sure. That's squarely in my area.
Sure.
@editorialbot add @gflofst as reviewer
@gflofst added to the reviewers list!
@editorialbot add @acrlakshman as reviewer
@acrlakshman added to the reviewers list!
@gflofst, @acrlakshman — This is the review thread for the paper. All of our communications will happen here from now on.
Please read the "Reviewer instructions & questions" in the first comment above. Please create your checklist typing:
@editorialbot generate my checklist
As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/6136 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We aim for the review process to be completed within about 4-6 weeks but please make a start well ahead of this as JOSS reviews are by their nature iterative and any early feedback you may be able to provide to the author will be very helpful in meeting this schedule.
Review checklist for @gflofst
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://gitlab.com/mantik-ai/mantik?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@rico-berner) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
:wave: @acrlakshman — just checking in here. Do you think you might be able to start your review soon?
I shall be working on it this week and provide an update.
Review checklist for @zhaozhang
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://gitlab.com/mantik-ai/mantik?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@rico-berner) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [x] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [x] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [x] Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [x] Installation: Does installation proceed as outlined in the documentation?
- [x] Functionality: Have the functional claims of the software been confirmed?
- [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [x] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
Review checklist for @acrlakshman
Conflict of interest
- [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.
Code of Conduct
- [x] I confirm that I read and will adhere to the JOSS code of conduct.
General checks
- [x] Repository: Is the source code for this software available at https://gitlab.com/mantik-ai/mantik?
- [x] License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
- [x] Contribution and authorship: Has the submitting author (@rico-berner) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
- [ ] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
- [ ] Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
- [ ] Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
- [ ] Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.
Functionality
- [ ] Installation: Does installation proceed as outlined in the documentation?
- [ ] Functionality: Have the functional claims of the software been confirmed?
- [ ] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)
Documentation
- [ ] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
- [ ] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
- [ ] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
- [ ] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
- [ ] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
- [ ] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Software paper
- [ ] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
- [ ] A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
- [ ] State of the field: Do the authors describe how this software compares to other commonly-used packages?
- [ ] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
- [ ] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@arfon I need to go through the technical details. At first glance, the submitting author (@rico-berner) did not make significant contributions to the software, however the author list looks complete and appropriately ordered. How would you prefer me handle the relevant checkbox in the general checks section.
@arfon @acrlakshman Thank you for handling and reviewing our paper. If I may, I would like to note that I have been involved foremost in the conception of the software features and coordination of development activities. I hope this helps you evaluating this point from the check list.
Thanks for flagging this @acrlakshman. It is not a requirement that all authors are major contributors to the actual code authoring part of the software. JOSS recognizes that there are many ways for individuals to contribute to the authorship of a work. To answer your question though, I believe you can mark that check list item as 'checked'.
There isn't dependency information, e.g., the minimum Python version.
I installed Mantik on TACC Frontera with Python 3.9.2. The installation was OK, but it reported this error:
```
We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default.
werkzeug 3.0.1 requires MarkupSafe>=2.1.1, but you'll have markupsafe 2.0.1 which is incompatible.
sqlalchemy 2.0.25 requires typing-extensions>=4.6.0, but you'll have typing-extensions 4.5.0 which is incompatible.
```
I am following this tutorial to run a project (https://mantik-ai.gitlab.io/mantik/tutorials/mantik-minimal-project/02.html#start-a-run). Then I ran into an error: Mantik is not able to find the tutorial repo. Please see attached.
Hi, the information on the Python version is provided here: https://pypi.org/project/mantik/. Regarding the installation, what command did you use to install?
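As a side note on the dependency conflict reported above: here is a small standalone diagnostic sketch (not part of mantik) that checks, in the current environment, the two pins pip complained about. The minimum versions are taken directly from pip's error message:

```python
# Check the two dependency pins from pip's error message above.
# Standalone diagnostic sketch; not part of mantik itself.
from importlib.metadata import PackageNotFoundError, version

PINS = {
    "MarkupSafe": (2, 1, 1),         # required by werkzeug 3.0.1
    "typing-extensions": (4, 6, 0),  # required by sqlalchemy 2.0.25
}

def parse(v: str) -> tuple:
    """Naive version parse, good enough for plain X.Y.Z strings."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

for pkg, minimum in PINS.items():
    try:
        installed = parse(version(pkg))
        status = "OK" if installed >= minimum else f"TOO OLD (< {minimum})"
    except PackageNotFoundError:
        status = "MISSING"
    print(f"{pkg}: {status}")
```

Installing mantik into a fresh virtual environment with an up-to-date pip (which uses the newer resolver by default) would likely avoid these conflicts in the first place.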
Going through the tutorial is of course exactly how it should work to get familiar with Mantik. Mantik is a platform that is actively under development. Unfortunately, the tutorial documentation got a bit outdated, and we noticed a bug in the backend that causes problems with using GitLab repositories. Both bugs will be fixed soon. The bug reports can be found here: https://gitlab.com/mantik-ai/mantik-api/-/issues/266 and https://gitlab.com/mantik-ai/mantik-api/-/issues/267
In order to make the minimal tutorial work, you could use the following GitHub repository, https://github.com/fabian4cast/minimal-mantik-project, and provide the path to the directory containing the MLproject file instead of the path to the file itself. For the new repo it should simply read "mlproject".
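For context, the MLproject file referenced above follows MLflow's project file format. A minimal sketch of what such a file might look like inside the mlproject directory (the entry-point parameters and command here are illustrative, not taken from the tutorial repo):

```yaml
# Minimal MLproject file (MLflow project format); content is illustrative.
name: minimal-mantik-project

entry_points:
  main:
    parameters:
      epochs: {type: int, default: 1}
    command: "python main.py --epochs {epochs}"
```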
Thank you for the clarification! @acrlakshman Would you need further information for the review?
No further clarification needed. I am currently on vacation so could not get back to it as soon as I expected. I will submit mine asap.
I tried to run after a seemingly successful install (Windows 11) and I get this:
```
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://api.cloud.mantik.ai/mantik/tokens/create
Traceback (most recent call last):
  File "C:\Users\gflofst\AppData\Local\Programs\Python\Python312\Lib\site-packages\mantik\utils\mantik_api\client.py", line 76, in send_request_to_mantik_api
    response.raise_for_status()
  File "C:\Users\gflofst\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://api.cloud.mantik.ai/mantik/tokens/create
```
My password has a * character in it. That might be the problem.
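If the `*` is indeed the culprit, one quick check is whether the character survives however the credentials are being passed (shell quoting and URL embedding are common places it gets mangled). A sketch, where the environment-variable name `MANTIK_PASSWORD` is a guess and not necessarily what mantik itself reads:

```python
# Sanity check: does a password containing '*' arrive intact, and what is
# its percent-encoded form? MANTIK_PASSWORD is a hypothetical variable name.
import os
from urllib.parse import quote

password = os.environ.get("MANTIK_PASSWORD", "s3cret*pw")
print(repr(password))              # confirm the '*' arrived intact
print(quote(password, safe=""))    # percent-encoded form: '*' becomes '%2A'
```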
References are not rendering with the markdown file.