
Start investigating test coverage

gomezzz opened this issue • 11 comments

Feature

Desired Behavior / Functionality

Currently, there are tests for a multitude of torchquad's components. However, we have never investigated test coverage. Doing so may be desirable, especially since, as the project grows over time, we are bound to face regressions if a large number of parts remain untested.

What Needs to Be Done

  • [x] Find a good way to investigate the test coverage of the current tests (see the sketch after this list).
  • [ ] Implement it and add to the docs how it can be used
  • [ ] Add some feedback on test coverage to CI if it makes sense
  • [ ] Improve test coverage (optional)
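
A minimal sketch of how the first item could be investigated locally, assuming coverage.py is used together with the existing pytest suite. The file name, the directory layout ("../integration"), and running it from inside the tests folder are assumptions, not details from the repository; the equivalent CLI workflow would be coverage run -m pytest followed by coverage report and coverage html:

# investigate_coverage.py - hypothetical helper, meant to be run from inside
# the tests folder (directory layout is an assumption, adjust as needed).
import coverage
import pytest

# Collect coverage only for the source package one level up.
cov = coverage.Coverage(source=["../integration"])
cov.start()

# Run the whole pytest suite in the current folder.
exit_code = pytest.main(["."])

cov.stop()
cov.save()

cov.report(show_missing=True)          # terminal summary with missed lines
cov.html_report(directory="htmlcov")   # line-by-line HTML report

raise SystemExit(exit_code)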

gomezzz avatar Aug 20 '21 13:08 gomezzz

Hi Pablo,

I have read your current test suite and would be interested in picking up this issue.

Would you be open to a pytest implementation of the tests? This would make investigating coverage much easier, and all tests could be run within pytest instead of being called in an if __name__ == "__main__" section at the end of each test script.

Cheers,

George

wirrell avatar Apr 13 '22 17:04 wirrell

Hi @wirrell ,

great to hear! Go ahead :)

Would you be open to a pytest implementation of the tests?

We are already using pytest, in fact! The if __name__ == "__main__" blocks are just there for convenience, in case you want to try out an individual test without pytest. You can run pytest in the tests folder, and that is what we use in the CI as well.
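
For illustration only, a test file following this pattern presumably looks roughly like the sketch below; the test body is a hypothetical placeholder, not actual torchquad test code:

import sys

# Make the source tree importable when the file is run from the tests folder.
sys.path.append("../")


def test_placeholder():
    # Stands in for a real integrator check.
    assert 1 + 1 == 2


if __name__ == "__main__":
    # Convenience entry point: run this single file directly without pytest.
    test_placeholder()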

gomezzz avatar Apr 14 '22 12:04 gomezzz

OK, great. I'll get started.

wirrell avatar Apr 15 '22 01:04 wirrell

Here is the coverage report for the main branch, as generated by coverage.py.

[screenshot: coverage.py report for the main branch]

and coverage for the current develop branch with all backends (some tests currently failing).

[screenshot: coverage.py report for the develop branch with all backends]

Attached are HTML reports which show coverage line by line: coverage_developbranch.zip, coverage_mainbranch.zip

wirrell avatar Apr 15 '22 14:04 wirrell

@gomezzz What did you have in mind for a docs addition? Something like an additional .rst page with a quick tutorial on using coverage.py to investigate coverage?

wirrell avatar Apr 15 '22 14:04 wirrell

Oh, nice reports! Thanks! :) Seems we lost quite a bit of coverage in some integrators in develop 🙈

What did you have in mind for a docs addition? Something like an additional .rst page with a quick tutorial on using coverage.py to investigate coverage?

Maybe we could add a GitHub Action to compute this automatically? I just found a nice example that runs it and automatically creates a comment on new PRs about changes in coverage, similar to this: https://github.com/ewjoachim/python-coverage-comment-action-example/pull/2#issuecomment-1003646299

Alternatively, a short instruction in the docs would also work, but automating it may be more convenient? :)

gomezzz avatar Apr 15 '22 19:04 gomezzz

Sounds good. I'll look into implementing a Github Action.

wirrell avatar Apr 21 '22 16:04 wirrell

@gomezzz Here's an update on this.

I have two actions running that generate coverage reports and post a comment on the PR. There was a clash between the way the current tests import the source code, the way coverage.py writes coverage reports, and the way python-coverage-comment-action reads those reports; it had to be circumvented using a .cfg file. Details on that below.

The new actions are get_test_coverage_on_push_or_PR.yml and post_coverage_comment.yml.

There is also a new setup.cfg with config options for coverage.py and pytest that are required to run the tests in a way that can interact with python-coverage-comment.

Here is an example PR with coverage comment.

One final note: the coverage generator runs all the tests that the existing run_tests.yml action runs. There could be some scope to merge these two, but I have kept them separate for now.

Coverage Reports clash info

The clash comes from the fact that, currently, pytest must be run in the torchquad/tests/ directory, because the tests are written to import the source code using sys.path.append to make it available within a test, e.g. in vegas_test.py:

import sys

sys.path.append("../")  # makes the source under torchquad/ importable from torchquad/tests/

This way of doing things (versus full imports, e.g. from torchquad.integration.vegas) means that we have to run pytest and generate the reports in the tests directory, but the python-coverage-comment action only reads .coverage files in the top-level directory, without any option to change that behavior.
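
For comparison, the two import styles would look roughly as follows; this is a sketch, and the VEGAS class name and the editable install are assumptions rather than details taken from the repository:

# Current style: only works when pytest is started inside torchquad/tests/,
# because "../" is resolved relative to the current working directory.
import sys
sys.path.append("../")
# from integration.vegas import VEGAS

# Full-import style: works from the repository root (or anywhere) once the
# package is importable, e.g. after an editable install with pip install -e .
from torchquad.integration.vegas import VEGAS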

To get around this, I have added a step in the new action that installs conda build and adds the source directory to the path, so that we can run pytest in the top-level directory. There is also a new setup.cfg file which directs pytest and coverage to run in a way conducive to python-coverage-comment.

wirrell avatar Jun 05 '22 18:06 wirrell

@gomezzz Also, please ignore the PR I just submitted. It was a mistake made while creating an example PR in my fork.

Let me know your thoughts on the use of setup.cfg, or if you think I could structure this any better.

wirrell avatar Jun 05 '22 18:06 wirrell

Hi @wirrell ,

thanks first off for the nice work, looks pretty neat!

post_coverage_comment.yml

That link doesn't work for me, 404 :)

The setup.cfg solution is fine, I would just maybe rename the file, if possible, to make clear it is only for the tests, so maybe test_setup.cfg?

Here is an example PR with coverage comment: https://github.com/wirrell/torchquad/pull/5

Looks good! I would just change some strings, e.g. call it 'Test coverage report'.

The coverage rate is 75%
The branch rate is 68%

I presume the coverage rate is for the original code before the PR? So maybe call it 'Old overall test coverage rate' or something? Similarly, 'branch', I reckon, refers to the current branch? So maybe 'PR branch test coverage'?

The coverage generator runs all the tests that the existing run_tests.yml action runs. Could be some scope to merge these two but I have kept them separate for now.

If you have an elegant solution in mind that avoids running the tests twice (does the coverage action fail if the tests fail? That would be easiest, I guess?), we could go for that.

The clash comes from the fact that, currently, pytest must be run in the torchquad/tests/ directory, because the tests are written to import the source code using sys.path.append to make it available within a test, e.g. in vegas_test.py

Yeah, this is a bit of a mistake we made early on with torchquad. If you have an easy fix for it in mind, we could also go for that. Otherwise, your solution is also OK! (But maybe we should create a separate issue to rework the imports in the tests so that this can be remedied? :thinking:)

Overall, feel free to create a PR for this, then I will have a look and give more concrete feedback!

Thanks!

gomezzz avatar Jun 07 '22 09:06 gomezzz

@gomezzz OK, great. I'll look into implementing these string changes and the file rename and then create a PR.

Re: tests running twice and imports - separate issues for both of those sounds good. I'd be happy to get to them after finishing this one.

wirrell avatar Jun 15 '22 21:06 wirrell