XML output lacks `pass`ed subtests info

Open stefano-ottolenghi opened this issue 2 years ago • 2 comments

(I recommend copy-pasting the XML content from this issue into files and then opening those in a web browser; the nesting of the tags will be much more apparent.)

Take this code:

import pytest

@pytest.mark.parametrize('n', [0,2,4,0,3,6,0,5,10])
class TestClass:
    def test_func(self, n):
        print(n)
        self.run_single(n)

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()

        assert n%2 == 0, 'n is odd'

When run with `pytest pytest-regular.py --junitxml=out-regular.xml`, the XML file it produces is the following:

<?xml version="1.0" encoding="utf-8"?><testsuites><testsuite name="pytest" errors="0" failures="2" skipped="2" tests="9" time="0.041" timestamp="2023-02-22T13:10:30.982095" hostname="stefano-XPS"><testcase classname="pytest-regular.TestClass" name="test_func[00]" time="0.001" /><testcase classname="pytest-regular.TestClass" name="test_func[2]" time="0.000"><skipped type="pytest.skip" message="Skipped">/home/stefano/git/neo4j-experiments/pytest-regular.py:11: Skipped</skipped></testcase><testcase classname="pytest-regular.TestClass" name="test_func[4]" time="0.000" /><testcase classname="pytest-regular.TestClass" name="test_func[01]" time="0.000" /><testcase classname="pytest-regular.TestClass" name="test_func[3]" time="0.001"><failure message="AssertionError: n is odd&#10;assert (3 % 2) == 0">self = &lt;pytest-regular.TestClass object at 0x7f7770985300&gt;, n = 3

    def test_func(self, n):
        print(n)
&gt;       self.run_single(n)

pytest-regular.py:7: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = &lt;pytest-regular.TestClass object at 0x7f7770985300&gt;, n = 3

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()
    
&gt;       assert n%2 == 0, 'n is odd'
E       AssertionError: n is odd
E       assert (3 % 2) == 0

pytest-regular.py:13: AssertionError</failure></testcase><testcase classname="pytest-regular.TestClass" name="test_func[6]" time="0.000" /><testcase classname="pytest-regular.TestClass" name="test_func[02]" time="0.000" /><testcase classname="pytest-regular.TestClass" name="test_func[5]" time="0.001"><failure message="AssertionError: n is odd&#10;assert (5 % 2) == 0">self = &lt;pytest-regular.TestClass object at 0x7f77709854b0&gt;, n = 5

    def test_func(self, n):
        print(n)
&gt;       self.run_single(n)

pytest-regular.py:7: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = &lt;pytest-regular.TestClass object at 0x7f77709854b0&gt;, n = 5

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()
    
&gt;       assert n%2 == 0, 'n is odd'
E       AssertionError: n is odd
E       assert (5 % 2) == 0

pytest-regular.py:13: AssertionError</failure></testcase><testcase classname="pytest-regular.TestClass" name="test_func[10]" time="0.000"><skipped type="pytest.skip" message="Skipped">/home/stefano/git/neo4j-experiments/pytest-regular.py:11: Skipped</skipped></testcase></testsuite></testsuites>
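For readability, here is the same XML hand-indented with the tracebacks elided. Note that every parametrized case gets its own testcase element, including the ones that passed:

<testsuite name="pytest" errors="0" failures="2" skipped="2" tests="9" ...>
    <testcase classname="pytest-regular.TestClass" name="test_func[00]" time="0.001" />
    <testcase classname="pytest-regular.TestClass" name="test_func[2]">
        <skipped type="pytest.skip" message="Skipped">...</skipped>
    </testcase>
    <testcase classname="pytest-regular.TestClass" name="test_func[4]" />
    <testcase classname="pytest-regular.TestClass" name="test_func[01]" />
    <testcase classname="pytest-regular.TestClass" name="test_func[3]">
        <failure message="AssertionError: n is odd ...">...</failure>
    </testcase>
    <testcase classname="pytest-regular.TestClass" name="test_func[6]" />
    <testcase classname="pytest-regular.TestClass" name="test_func[02]" />
    <testcase classname="pytest-regular.TestClass" name="test_func[5]">
        <failure message="AssertionError: n is odd ...">...</failure>
    </testcase>
    <testcase classname="pytest-regular.TestClass" name="test_func[10]">
        <skipped type="pytest.skip" message="Skipped">...</skipped>
    </testcase>
</testsuite>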

[Screenshot from 2023-02-22 13-41-56]

I tweaked that code to run the exact same test cases, but split into 3 tests of 3 subtests each:

import pytest

@pytest.mark.parametrize('start', [2,3,5])
class TestClass:
    def test_func(self, subtests, start):
        print(start)
        for multiplier in range(3):
            with subtests.test():
                n = start*multiplier
                print(n)
                self.run_single(n)

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()

        assert n%2 == 0, 'n is odd'

The resulting XML is:

<?xml version="1.0" encoding="utf-8"?><testsuites><testsuite name="pytest" errors="0" failures="2" skipped="2" tests="12" time="0.041" timestamp="2023-02-22T13:10:24.166299" hostname="stefano-XPS"><testcase classname="pytest-subtest.TestClass" name="test_func[2]" time="0.007"><skipped type="pytest.skip" message="Skipped">/home/stefano/git/neo4j-experiments/pytest-subtest.py:15: Skipped</skipped></testcase><testcase classname="pytest-subtest.TestClass" name="test_func[3]" time="0.017"><failure message="AssertionError: n is odd&#10;assert (3 % 2) == 0">self = &lt;pytest-subtest.TestClass object at 0x7f8123d1e530&gt;
subtests = SubTests(ihook=&lt;_pytest.config.compat.PathAwareHookProxy object at 0x7f8124cf4190&gt;, suspend_capture_ctx=&lt;bound method ...te='started' _in_suspended=False&gt; _capture_fixture=None&gt;&gt;, request=&lt;SubRequest 'subtests' for &lt;Function test_func[3]&gt;&gt;)
start = 3

    def test_func(self, subtests, start):
        print(start)
        for multiplier in range(3):
            with subtests.test():
                n = start*multiplier
                print(n)
&gt;               self.run_single(n)

pytest-subtest.py:11: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = &lt;pytest-subtest.TestClass object at 0x7f8123d1e530&gt;, n = 3

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()
    
&gt;       assert n%2 == 0, 'n is odd'
E       AssertionError: n is odd
E       assert (3 % 2) == 0

pytest-subtest.py:17: AssertionError</failure></testcase><testcase classname="pytest-subtest.TestClass" name="test_func[5]" time="0.004"><failure message="AssertionError: n is odd&#10;assert (5 % 2) == 0">self = &lt;pytest-subtest.TestClass object at 0x7f8123d1e410&gt;
subtests = SubTests(ihook=&lt;_pytest.config.compat.PathAwareHookProxy object at 0x7f8124cf4190&gt;, suspend_capture_ctx=&lt;bound method ...te='started' _in_suspended=False&gt; _capture_fixture=None&gt;&gt;, request=&lt;SubRequest 'subtests' for &lt;Function test_func[5]&gt;&gt;)
start = 5

    def test_func(self, subtests, start):
        print(start)
        for multiplier in range(3):
            with subtests.test():
                n = start*multiplier
                print(n)
&gt;               self.run_single(n)

pytest-subtest.py:11: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = &lt;pytest-subtest.TestClass object at 0x7f8123d1e410&gt;, n = 5

    def run_single(self, n):
        if n == 2 or n == 10:
            pytest.skip()
    
&gt;       assert n%2 == 0, 'n is odd'
E       AssertionError: n is odd
E       assert (5 % 2) == 0

pytest-subtest.py:17: AssertionError</failure><skipped type="pytest.skip" message="Skipped">/home/stefano/git/neo4j-experiments/pytest-subtest.py:15: Skipped</skipped></testcase></testsuite></testsuites>
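Hand-indented in the same way, the subtests run boils down to this. Only the three test functions appear, and only their failed/skipped subtests leave a nested tag; the passed subtests leave no trace at all:

<testsuite name="pytest" errors="0" failures="2" skipped="2" tests="12" ...>
    <testcase classname="pytest-subtest.TestClass" name="test_func[2]">
        <skipped type="pytest.skip" message="Skipped">...</skipped>
    </testcase>
    <testcase classname="pytest-subtest.TestClass" name="test_func[3]">
        <failure message="AssertionError: n is odd ...">...</failure>
    </testcase>
    <testcase classname="pytest-subtest.TestClass" name="test_func[5]">
        <failure message="AssertionError: n is odd ...">...</failure>
        <skipped type="pytest.skip" message="Skipped">...</skipped>
    </testcase>
</testsuite>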

[Screenshot from 2023-02-22 13-41-44]

In the subtests version, the XML lacks any information about passed subtests. Failures and skips do show up as tags nested inside a testcase, but while the non-subtests version lists every test in its own testcase tag, the subtests version only lists the tests and subtests with a special status. This can throw off CI tools that count testcase tags. We have a few tens of tests that each spawn hundreds of subtests (in a scenario that makes sense, unlike the toy example here), and

  • we get a full test failure/skip if one subtest fails or is skipped;
  • we get a single pass if all subtests in a test pass.

If we run 3 tests with 3 subtests each, and 2 subtests are skipped and one fails, my expectation would be for the CI to report 1 failure, 2 skips, 7 passes (or 9 if we also count the tests themselves, I don't care). Instead, depending a bit on where the tests fail/skip, I can now get 1 failure, 2 skips, 0 passes.
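For concreteness, this is roughly what a CI tool that simply walks the testcase tags would compute from the subtests XML above (a sketch only; real tools differ in their heuristics, and the file name is assumed):

import xml.etree.ElementTree as ET

# Parse the JUnit XML produced by the subtests run (assumed file name).
root = ET.parse("out-subtest.xml").getroot()

passed = failed = skipped = 0
for tc in root.iter("testcase"):
    if tc.find("failure") is not None or tc.find("error") is not None:
        failed += 1
    elif tc.find("skipped") is not None:
        skipped += 1
    else:
        passed += 1

print(f"passed={passed} failed={failed} skipped={skipped}")
# Only 3 <testcase> elements exist and none of them is "clean", so a tally
# like this reports 0 passes even though most subtests actually passed.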

Is there scope for improving on this?

stefano-ottolenghi avatar Feb 22 '23 12:02 stefano-ottolenghi

Hi @stefano-ottolenghi,

I did not take a deep look, but I believe this would need a tighter integration with the builtin junitxml plugin, which is not easy for an external plugin to do. This can probably be addressed more easily once we integrate pytest-subtests into the core, as then SubTestReport will be an official report, which the other plugins can handle accordingly.
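In the meantime, a possible stopgap is to tally passed subtests yourself from a conftest.py hook. This is only a rough, untested sketch: it identifies subtest reports by the internal SubTestReport class name, which is an implementation detail and may change between versions.

# conftest.py
# Count passed subtest reports ourselves, since they never become
# <testcase> entries in the JUnit XML.
passed_subtests = 0

def pytest_runtest_logreport(report):
    global passed_subtests
    # Relying on the class name of pytest-subtests' report object is an
    # assumption about plugin internals, not a supported API.
    if type(report).__name__ == "SubTestReport" and report.when == "call" and report.passed:
        passed_subtests += 1

def pytest_terminal_summary(terminalreporter):
    terminalreporter.write_line(f"passed subtests (not in the XML): {passed_subtests}")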

For clarification:

my expectation would be for the CI to report 1 failure, 2 skips, 7 passes (or 9 if we also count the tests themselves, I don't care). Instead, depending a bit on where the tests fail/skip, I can now get 1 failure, 2 skips, 0 passes.

Here you mean the report generated by your CI from reading the XML file, not from pytest's output summary, correct?

nicoddemus avatar Mar 01 '23 13:03 nicoddemus

Yeah, it will be great when pytest-subtests makes it into core.

Here you mean the report generated by your CI from reading the XML file, not from pytest's output summary, correct?

Correct :)

stefano-ottolenghi avatar Mar 02 '23 07:03 stefano-ottolenghi