
Increase code unit test coverage

Open PGijsbers opened this issue 7 years ago • 5 comments

Not all code is currently covered by unit and/or system tests. In some cases this does not matter (e.g. not every ValueError scenario needs to be checked automatically, I think), but other functionality still needs coverage (e.g. the time-out behavior in evaluation.py).
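
For illustration, a minimal sketch of how time-out behavior could be unit tested in isolation; `slow_evaluation` is a hypothetical stand-in for a pipeline evaluation, not GAMA's actual evaluation.py code:

```python
# Sketch: assert that an evaluation exceeding its time budget gets terminated.
# `slow_evaluation` is a hypothetical stand-in, not GAMA's real evaluation.
import multiprocessing
import time


def slow_evaluation(seconds: float) -> None:
    """Pretend to be a pipeline evaluation that runs too long."""
    time.sleep(seconds)


def test_evaluation_is_terminated_on_timeout():
    process = multiprocessing.Process(target=slow_evaluation, args=(10,))
    process.start()
    process.join(timeout=0.5)  # the time budget for the evaluation
    timed_out = process.is_alive()
    if timed_out:
        process.terminate()
        process.join()
    assert timed_out, "a too-slow evaluation should hit the timeout"
```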

PGijsbers avatar Oct 02 '18 12:10 PGijsbers

In particular, coverage from unit tests needs to expand. With mocking, for example, ensembling and search algorithms can also be included in quick tests (see the sketch below the list). Missing coverage:

  • ~score~
  • cross_val_predict_score
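
A toy sketch of the mocking idea: `AutoML`, `search`, and `fit` here are hypothetical stand-ins, not GAMA's actual interface.

```python
# Sketch: replace the expensive search step with a mock so the surrounding
# logic can run in a fast unit test. All names here are hypothetical.
from unittest import mock


class AutoML:
    """Stand-in for the system under test."""

    def search(self, data):
        # In reality this would run an expensive search; unit tests mock it.
        raise RuntimeError("too slow for a unit test, mock me")

    def fit(self, data):
        return self.search(data)


def test_fit_returns_search_result():
    automl = AutoML()
    with mock.patch.object(AutoML, "search", return_value="best_pipeline"):
        assert automl.fit(data=None) == "best_pipeline"
```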

PGijsbers avatar Jul 09 '19 21:07 PGijsbers

It would also be desirable to create smaller test modules (e.g. separate tests for ARFF input or string labels).
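
For example, a focused module (say, a hypothetical tests/unit/test_arff_input.py) could exercise only ARFF parsing with string labels, here using scipy's ARFF reader as a stand-in for GAMA's own input handling:

```python
# Sketch: a small, self-contained test module covering only ARFF input with
# string (nominal) labels. scipy's reader stands in for GAMA's loader.
import io

from scipy.io import arff

ARFF_WITH_STR_LABELS = """\
@relation toy
@attribute f1 numeric
@attribute class {yes,no}
@data
1.0,yes
2.0,no
"""


def test_arff_string_labels_are_preserved():
    data, meta = arff.loadarff(io.StringIO(ARFF_WITH_STR_LABELS))
    assert set(meta.names()) == {"f1", "class"}
    assert [row["class"] for row in data] == [b"yes", b"no"]
```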

PGijsbers avatar Jul 09 '19 22:07 PGijsbers

Initial effort was merged in #50. I think the code in gama/genetic_programming/compilers/scikitlearn.py is reported as not covered because it is only executed in a separate process (in system tests).
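
For reference, a sketch of the coverage.py settings that make coverage follow code run in multiprocessing children (assuming a .coveragerc at the project root; the per-process data files are merged afterwards with `coverage combine`):

```ini
# .coveragerc sketch: measure multiprocessing children and write one data
# file per process, to be merged with `coverage combine`.
[run]
source = gama
parallel = True
concurrency = multiprocessing
```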

PGijsbers avatar Jul 10 '19 23:07 PGijsbers

More progress in #84. Coverage of subprocesses is definitely reported correctly now. Despite reaching 90% coverage, it is still not obvious to me how to properly test AutoML systems reliably and quickly.

PGijsbers avatar Apr 08 '20 12:04 PGijsbers

How to adequately test AutoML systems remains an open issue, especially how to do this well with unit tests rather than by simply running a (small) benchmark. I am updating the title to reflect that broader code coverage is (currently) not the goal; rather, the goal is to test for more potential issues in the unit test suite.

PGijsbers avatar Sep 16 '22 08:09 PGijsbers