
Submission to pyOpenSci

Open MBravoS opened this issue 2 years ago • 7 comments

I've been looking into the idea @AstroRobin and I had some time ago of submitting this package for publication. If I remember correctly, our original plan was to submit it to the Journal of Open Source Software, but looking at their stated scope I don't think SciCM would pass their selection criteria. Their detailed documentation suggests an alternative, pyOpenSci, and judging by its scope I think it's the better fit for SciCM. They do have some requirements the package needs to meet, so for now I'm working from their (currently incomplete) package guide. I think this should be the next goal for SciCM; once submitted (or ideally approved) we can move on to #8 and #9.

MBravoS avatar Dec 19 '22 21:12 MBravoS

@AstroRobin, what do you think of the updated README? It's missing the continuous integration badge... because we are missing continuous integration in SciCM, but everything else should be there.

MBravoS avatar Dec 19 '22 23:12 MBravoS

Looks good; I've just made some minor corrections, which should be ready to pull in.

AstroRobin avatar Dec 20 '22 02:12 AstroRobin

I started working on the CI in the devel branch, using GitHub Actions for the checks (which you can see in the Actions tab of the repo). Currently it only tests whether the code passes a flake8 lint check, which is what GitHub implicitly suggests, though I've already added the badge to the README. I still need to add proper tests before merging into the master branch.
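For reference, a lint-only workflow along these lines looks roughly like the Python starter workflow GitHub suggests. This is a sketch, not the actual contents of the file in the repo (the workflow name and Python version here are assumptions):

```yaml
# A minimal lint-only GitHub Actions sketch, modelled on GitHub's
# suggested Python starter workflow; the real file may differ.
name: test-scicm

on: [push, pull_request]

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install flake8
      # Fail the build on syntax errors and undefined names,
      # as in GitHub's suggested flake8 step.
      - run: flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
```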

Note that in the process I also pruned the number of branches in the repo.

MBravoS avatar Jan 12 '23 00:01 MBravoS

Ok, I've gotten a handle on how to do the testing (see test-scicm.yml and test.py in commit 20f4d67), but now the question is: what would be good tests to include? The obvious things to test seem to be:

  1. Whether the functions in the tool submodule work as expected, which would require checking:
     1.1. that the functions run without raising errors on "typical" inputs;
     1.2. that they produce the expected output from those inputs.
  2. Whether the colour maps are loaded under the correct names.

@AstroRobin ideas on this would be more than welcome.

MBravoS avatar Mar 22 '23 20:03 MBravoS

It's hard to know how you'd implement 1.2: how do you know what the correct answer for, say, trimming a colourmap is? Perhaps it would look something like checking that the hex code of the trimmed cmap is identical to the value obtained by extracting the colour at a given fraction?

Names are important, but that's more of a bookkeeping problem :)
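The fraction-matching idea can be sketched with a tiny stand-in colormap. Everything here (`sample`, `trim`, the linear-interpolation map) is illustrative, pure-Python scaffolding, not SciCM's actual API:

```python
# Sketch of the "compare hex codes at a given fraction" test idea.
# The colormap is a stand-in: a linear interpolation between anchor
# colours, not SciCM's actual implementation.

def sample(colors, frac):
    """Linearly interpolate an RGB tuple at fraction `frac` in [0, 1]."""
    pos = frac * (len(colors) - 1)
    i = min(int(pos), len(colors) - 2)
    t = pos - i
    return tuple(a + (b - a) * t for a, b in zip(colors[i], colors[i + 1]))

def to_hex(rgb):
    """Convert an RGB tuple in [0, 1] to a hex string."""
    return "#" + "".join(f"{round(c * 255):02x}" for c in rgb)

def trim(colors, lo, hi, n=5):
    """Resample the map between fractions lo and hi into n anchor colours."""
    return [sample(colors, lo + (hi - lo) * k / (n - 1)) for k in range(n)]

cmap = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 1.0)]  # black -> red -> white
trimmed = trim(cmap, 0.25, 0.75)

# The trimmed map's endpoints should match the original map sampled
# at the trim fractions:
assert to_hex(sample(trimmed, 0.0)) == to_hex(sample(cmap, 0.25))
assert to_hex(sample(trimmed, 1.0)) == to_hex(sample(cmap, 0.75))
```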

AstroRobin avatar Mar 23 '23 05:03 AstroRobin

> It's hard to know how you'd implement 1.2: how do you know what the correct answer for, say, trimming a colourmap is? Perhaps it would look something like checking that the hex code of the trimmed cmap is identical to the value obtained by extracting the colour at a given fraction?

What I was thinking was testing against simple mock examples, e.g. testing the crop function on a "fake" colour map with 6 colours, where we can compare the 4-colour output to the expected 4-colour map.
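That mock-data approach might look like the following. The `crop` helper here is a stand-in to show the shape of the test; SciCM's actual tool function may have a different name and signature:

```python
# Illustrative mock test: crop a "fake" 6-colour map down to the
# middle 4 colours and compare against the hand-written expectation.
# `crop` is a stand-in, not SciCM's actual tool function.

def crop(colors, start, stop):
    """Return the sub-list of anchor colours in [start, stop)."""
    return colors[start:stop]

fake_cmap = ["#000000", "#330000", "#660000", "#990000", "#cc0000", "#ff0000"]
expected = ["#330000", "#660000", "#990000", "#cc0000"]

# The 4-colour output should equal the hand-written 4-colour map.
assert crop(fake_cmap, 1, 5) == expected
```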

> Names are important, but that's more of a bookkeeping problem :)

Fair enough.

MBravoS avatar Mar 27 '23 21:03 MBravoS

It's taken a long time, but I think we're finished with the internal testing and source documentation 🥳. With that done, I think all that remains to complete the pyOpenSci checklist is to:

MBravoS avatar Jul 05 '24 23:07 MBravoS