
Test: Allow reporting results as JUnit

reneme opened this issue 1 year ago

Based on the refactoring in #3027, this adds an XmlReporter that emits JUnit5-compliant XML test results. The results are written to disk in a directory specified via ./botan-test --test-results-dir=..., as Botan-3.0.0-alpha0-tests-<current timestamp>.xml.

Eventually, we're planning to surface the reports in pull requests using external tools, such as:

  • https://www.check-run-reporter.com/
  • https://github.com/mikepenz/action-junit-report

Lessons Learned

  • it doesn't seem worth it to use <system-out> tags
    • also: putting result.test_note() output in there produces a lot of output
    • [ ] instead: emit the test_note() output only for failing tests (see the sketch below)
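
For clarity, a minimal sketch of what such a failing entry could look like, assuming the test_note() output goes into the body of a standard JUnit <failure> element (the test name, counts, and messages below are made up for illustration, not actual reporter output):

<testcase name="SomeCipher" assertions="30" time="0.001">
<failure message="1 of 30 assertions failed">
note: test_note() output recorded while running this test
</failure>
</testcase>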

Example Output

./botan-test --test-results-dir=test-results block

<?xml version="1.0" encoding="UTF-8"?>
<properties>
<property name="CPU flags" value="sse2 ssse3 sse41 sse42 avx2 avx512f avx512dq avx512bw avx512_icelake rdtsc bmi1 bmi2 adx aes_ni clmul rdrand rdseed intel_sha avx512_aes avx512_clmul" />
<property name="current test run" value="1" />
<property name="drbg_seed" value="17082F3088D3AC57" />
<property name="total test runs" value="1" />
</properties>
<testsuites tests="27" failures="0" time="0.213">
<testsuite name="block" tests="27" failures="0" timestamp="2022-08-04T17:52:28+0200" time="0.207">
<testcase name="AES-128" assertions="17235" timestamp="2022-08-04T17:52:28+0200" time="0.016" />
<testcase name="AES-192" assertions="20250" timestamp="2022-08-04T17:52:28+0200" time="0.020" />
<testcase name="AES-256" assertions="23130" timestamp="2022-08-04T17:52:28+0200" time="0.022" />
<testcase name="ARIA-128" assertions="30" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="ARIA-192" assertions="30" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="ARIA-256" assertions="30" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Blowfish" assertions="930" timestamp="2022-08-04T17:52:28+0200" time="0.005" />
<testcase name="CAST-128" assertions="720" timestamp="2022-08-04T17:52:28+0200" time="0.001" />
<testcase name="Camellia-128" assertions="90" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Camellia-192" assertions="45" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Camellia-256" assertions="75" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Cascade(Serpent,AES-256)" assertions="30" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Cascade(Serpent,CAST-128)" assertions="15" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Cascade(Serpent,Twofish)" assertions="45" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="DES" assertions="4815" timestamp="2022-08-04T17:52:28+0200" time="0.003" />
<testcase name="GOST-28147-89(R3411_94_TestParam)" assertions="270" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="GOST-28147-89(R3411_CryptoPro)" assertions="150" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="IDEA" assertions="16260" timestamp="2022-08-04T17:52:28+0200" time="0.018" />
<testcase name="Lion(SHA-160,RC4,64)" assertions="15" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Noekeon" assertions="15450" timestamp="2022-08-04T17:52:28+0200" time="0.011" />
<testcase name="SEED" assertions="60" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="SHACAL2" assertions="61260" timestamp="2022-08-04T17:52:28+0200" time="0.052" />
<testcase name="SM4" assertions="45" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="Serpent" assertions="47070" timestamp="2022-08-04T17:52:28+0200" time="0.040" />
<testcase name="Threefish-512" assertions="150" timestamp="2022-08-04T17:52:28+0200" time="0.000" />
<testcase name="TripleDES" assertions="840" timestamp="2022-08-04T17:52:28+0200" time="0.001" />
<testcase name="Twofish" assertions="16545" timestamp="2022-08-04T17:52:28+0200" time="0.017" />
</testsuite>
</testsuites>

reneme, Aug 04 '22 16:08

I tried mikepenz/action-junit-report and with a few more tweaks this can annotate failing test cases right into the code:

[Screenshot 2022-08-05 121419]

Obviously, this works only if the failure information contains code locations, i.e. we would need some macro tricks (e.g. capturing __FILE__/__LINE__ at the assertion site) to make this happen, and would have to touch all the test code to incorporate the macros. Also, this little hack would probably come back to bite us.
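
For illustration, a failure entry with an embedded code location could look roughly like this, assuming a macro captures __FILE__/__LINE__ at the assertion site and prepends it to the failure message (path, line number, and message are hypothetical, not current reporter output):

<testcase name="SomeCipher" assertions="30" time="0.001">
<failure message="src/tests/test_block.cpp:123: expected ciphertext did not match" />
</testcase>

An action like mikepenz/action-junit-report could then map such file/line references back onto the changed code as annotations.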

But even without that, it generates test results in the GitHub Actions summary view:
[Screenshot 2022-08-05 121751]

All of this comes at the expense of using two third-party GitHub Actions.

Open Questions

  • How to annotate test failures with the build configuration that failed?
  • Is there a better way to annotate test failures with code locations?

reneme, Aug 05 '22 10:08

Now for check-run-reporter.com: this produces a somewhat more comprehensive report as a single summary page behind a link in the "Checks" box of a pull request:

[Screenshot 2022-08-05 172553]
[Screenshot 2022-08-05 171710]

However, if the same test fails for more than one build configuration, it seems to overwrite the previous failure, which may or may not be a problem.

On the plus side, it's much easier to set up and works without a custom third-party action, i.e. one can manually curl -X POST the generated JUnit results to the service.

It has a free tier for open-source projects that is capped at 10,000 monthly reports and 1 GB of transfer. That may or may not be enough.

Bottom line: neither of these two services is really convincing, unfortunately. @randombit What do you think? Do you see this bringing enough added value?

Frankly, I'm somewhat feeling the urge to write a simple action tailored to our needs, particularly considering other yet-to-be-integrated reports from BoGo, the Python-based tests, and so on.

reneme, Aug 05 '22 15:08

Codecov Report

Base: 92.56% // Head: 92.60% // Increases project coverage by +0.04% :tada:

Coverage data is based on head (783a9d3) compared to base (4090574). Patch coverage: 89.60% of modified lines in pull request are covered.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #3028      +/-   ##
==========================================
+ Coverage   92.56%   92.60%   +0.04%     
==========================================
  Files         602      603       +1     
  Lines       70650    70854     +204     
  Branches     6662     6685      +23     
==========================================
+ Hits        65398    65616     +218     
+ Misses       5222     5208      -14     
  Partials       30       30              
Impacted Files Coverage Δ
src/tests/runner/test_runner.h 100.00% <ø> (ø)
src/tests/runner/test_reporter.cpp 87.09% <77.14%> (-12.91%) :arrow_down:
src/tests/runner/test_reporter.h 85.71% <83.33%> (+55.71%) :arrow_up:
src/tests/tests.cpp 83.21% <90.00%> (+0.12%) :arrow_up:
src/tests/runner/test_xml_reporter.cpp 92.02% <92.02%> (ø)
src/tests/main.cpp 57.77% <100.00%> (+1.96%) :arrow_up:
src/tests/runner/test_runner.cpp 72.05% <100.00%> (+0.41%) :arrow_up:
src/tests/tests.h 92.57% <100.00%> (+0.42%) :arrow_up:
src/fuzzer/ressol.cpp 85.71% <0.00%> (-7.15%) :arrow_down:
src/fuzzer/mode_padding.cpp 93.65% <0.00%> (-6.35%) :arrow_down:
... and 13 more


codecov-commenter, Sep 21 '22 14:09

Rebased onto https://github.com/randombit/botan/pull/3058 and added the code locations to the XML report. Hence, we should first review/merge the code locations and then rebase this again onto master.

Edit: done

reneme, Oct 11 '22 17:10

I know there are some unresolved questions here wrt reporting actions and how to improve the location reporting, but is there any reason this PR couldn't merge now?

randombit, Nov 14 '22 16:11

Indeed, there's no showstopper. The JUnit reports just won't be used as-is. But we can certainly come back to that later.

reneme, Nov 15 '22 07:11

I incorporated the changes and fixes I needed here: https://github.com/Rohde-Schwarz/botan/pull/10. Mainly, the addition of ./botan-test --report-properties=key:value,other_key:other_value to set additional properties from ci_build.py that can later be used when generating human-readable reports from the JUnit files.
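
As a rough illustration (the key/value pairs are taken from the example flag above, not actual ci_build.py settings), such properties would simply show up as additional entries in the report's <properties> block:

<properties>
<property name="key" value="value" />
<property name="other_key" value="other_value" />
</properties>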

From my side, this is ready to go in now.

reneme, Nov 23 '22 13:11