
Failure information for prior failing test executions is omitted from XML reports

Open eriwen opened this issue 2 years ago • 1 comment

Given a flaky test such as:

import random

def test_random():
    assert bool(random.getrandbits(1))

The JUnit XML output contains no <failure> information when a test case fails and then subsequently passes.

It looks like this:

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
<testsuite name="pytest" errors="0" failures="0" skipped="0" tests="2" time="0.040" timestamp="2021-10-12T17:31:35.919038" hostname="sans">
  <testcase classname="test_sample" name="test_random" file="test_sample.py" line="6" time="0.000" />
  <testcase classname="test_sample" name="test_random" file="test_sample.py" line="6" time="0.000" />
</testsuite>
</testsuites>
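As a stopgap (not a pytest feature; a minimal sketch assuming the report shape shown above), reruns of the same test show up as duplicated <testcase> entries, so candidate flaky tests can be detected by counting repeated classname/name pairs:

```python
# Sketch: find <testcase> entries that appear more than once in a JUnit XML
# report, which is how a fail-then-pass rerun shows up in the report above.
import xml.etree.ElementTree as ET
from collections import Counter

def find_rerun_candidates(xml_text):
    root = ET.fromstring(xml_text)
    counts = Counter(
        (tc.get("classname"), tc.get("name"))
        for tc in root.iter("testcase")
    )
    return [key for key, n in counts.items() if n > 1]

report = """<?xml version="1.0" encoding="utf-8"?>
<testsuites>
<testsuite name="pytest" errors="0" failures="0" skipped="0" tests="2" time="0.040">
  <testcase classname="test_sample" name="test_random" file="test_sample.py" line="6" time="0.000" />
  <testcase classname="test_sample" name="test_random" file="test_sample.py" line="6" time="0.000" />
</testsuite>
</testsuites>"""

print(find_rerun_candidates(report))  # [('test_sample', 'test_random')]
```

This only flags tests that were rerun; it cannot recover the original failure text, which is exactly the information this issue asks to have preserved.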

Other test frameworks and runners typically include the failure information, as shown below, so that automated tools can process flaky-test data in a more structured format than plain-text logs.

<failure message="assert False&#10; +  where False = bool(0)&#10; +    where 0 = &lt;built-in method getrandbits of Random object at 0x7f93c1078610&gt;(1)&#10; +      where &lt;built-in method getrandbits of Random object at 0x7f93c1078610&gt; = random.getrandbits">
        def test_random():
    &gt;       assert bool(random.getrandbits(1))
    E       assert False
    E        +  where False = bool(0)
    E        +    where 0 = &lt;built-in method getrandbits of Random object at 0x7f93c1078610&gt;(1)
    E        +      where &lt;built-in method getrandbits of Random object at 0x7f93c1078610&gt; = random.getrandbits

    test_sample.py:9: AssertionError</failure>

To avoid parsers mistaking the <failure> for a deterministic, non-flaky failure, some test runners use <flakyFailure>. Even better, some test frameworks go so far as to mark the <testcase> as flaky="true". I'm not sure whether this library has the level of control needed to achieve this, but if it does, all the better for devs trying to fix flaky tests.
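Pending support in the library itself, the flaky="true" convention can be approximated by post-processing the report. This is a hedged sketch (stdlib only, not a pytest API) that tags duplicated <testcase> entries so downstream tools can tell them apart from deterministic failures:

```python
# Sketch of a post-processing workaround: mark <testcase> entries that occur
# more than once (i.e. were rerun) with flaky="true" in the JUnit XML.
import xml.etree.ElementTree as ET
from collections import Counter

def mark_flaky(xml_text):
    root = ET.fromstring(xml_text)
    cases = list(root.iter("testcase"))
    counts = Counter((tc.get("classname"), tc.get("name")) for tc in cases)
    for tc in cases:
        if counts[(tc.get("classname"), tc.get("name"))] > 1:
            tc.set("flaky", "true")  # hypothetical attribute, per the convention above
    return ET.tostring(root, encoding="unicode")

report = """<testsuites><testsuite>
  <testcase classname="test_sample" name="test_random" />
  <testcase classname="test_sample" name="test_random" />
  <testcase classname="test_sample" name="test_stable" />
</testsuite></testsuites>"""

print(mark_flaky(report))  # both test_random entries gain flaky="true"
```

Whether consumers honor a flaky attribute depends entirely on the tool reading the report; the attribute name here is illustrative, not a standard.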

eriwen avatar Oct 13 '21 17:10 eriwen

Did you ever manage to resolve this @eriwen ?

shrihari-prakash avatar Jul 03 '23 09:07 shrihari-prakash