
SugarVerbose: Print more verbose testcase reports for all outcomes

Open mitzkia opened this issue 6 years ago • 17 comments

Signed-off-by: Andras Mitzki [email protected]

I have found that pytest-sugar can display each failed testcase, which is a useful feature. For me it would be even better if it could display this for every pytest outcome. The original idea came from error outcomes, which can also happen in the setup or teardown phases, and I would like to know in which phase they occurred. Maybe the option name (--sugar-verbose) is not the best; I am fine with changing it.

Update: the functionality no longer uses the "--sugar-verbose" arg; it is now enabled by the "native" pytest arg "-ra".

To try this feature, run for example:

$ python3 -m pytest -ra t.py 

Before

Results (0.05s):
       5 passed
       3 failed
         - t.py:4 test_B
         - t.py:7 test_C
         - t.py:10 test_D
       1 error

After

Results (0.05s):
       5 passed
         - t.py:1 test_A (call)
         - t.py:18 test_E (call)
         - t.py:23 test_F[3+5-8] (call)
         - t.py:23 test_F[2+4-6] (call)
         - t.py:23 test_F[6*9-42] (call)
       3 failed
         - t.py:4 test_B (call)
         - t.py:7 test_C (call)
         - t.py:10 test_D (call)
       1 error
         - t.py:18 test_E (teardown)
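
For reference, the "(call)" / "(teardown)" suffixes come from pytest's own per-phase test reports. A minimal conftest.py sketch (not pytest-sugar's actual implementation) that collects the phase alongside the outcome could look like this:

import pytest

collected = []

@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    # report.when is "setup", "call" or "teardown"; a failing finalizer
    # produces a failed "teardown" report, which is what shows up as the
    # "1 error ... (teardown)" entry above.
    collected.append((item.nodeid, report.when, report.outcome))

def pytest_terminal_summary(terminalreporter, exitstatus, config):
    for nodeid, when, outcome in collected:
        terminalreporter.write_line(f"{outcome:<7} {nodeid} ({when})")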

mitzkia avatar Jun 01 '19 08:06 mitzkia

Just a quick idea: maybe this should be based on / coupled with pytest's -r option (reportopts), i.e. -ra would then enable it here as well.
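
A hedged sketch of what that coupling could look like in a plugin's pytest_configure hook: "reportchars" is the dest pytest uses for its -r option, and "a" selects "all except passes" (the attribute name used to store the flag is hypothetical here):

def pytest_configure(config):
    # Hypothetical: enable the verbose per-phase summary only when the user
    # already asked pytest for an extended summary via -r / -ra.
    reportchars = getattr(config.option, "reportchars", "") or ""
    config._sugar_verbose_summary = "a" in reportchars or "A" in reportchars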

blueyed avatar Jun 01 '19 08:06 blueyed

Codecov Report

Merging #175 into master will decrease coverage by 3.47%. The diff coverage is 52.94%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #175      +/-   ##
==========================================
- Coverage   85.32%   81.85%   -3.48%     
==========================================
  Files           2        2              
  Lines         477      496      +19     
  Branches       84       92       +8     
==========================================
- Hits          407      406       -1     
- Misses         44       57      +13     
- Partials       26       33       +7
Impacted Files     Coverage Δ
pytest_sugar.py    76.32% <52.94%> (-4.56%) ↓

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update a210316...066454a.

codecov-io avatar Jun 01 '19 08:06 codecov-io

Thanks @blueyed, I will check it.

mitzkia avatar Jun 01 '19 08:06 mitzkia

I have checked it and found some issues:
1. It did not display where the testcase error occurred (in setup or in teardown).
2. The output is very different depending on whether pytest-sugar is installed or not (for the same testcase).

Report output when pytest-sugar is not installed:

$ python3 -m pytest -rA t.py
platform linux -- Python 3.6.7, pytest-4.5.0, py-1.6.0, pluggy-0.11.0
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest/.hypothesis/examples')
rootdir: /home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest
plugins: xdist-1.28.0, repeat-0.8.0, icdiff-0.2, forked-1.0.2, hypothesis-4.23.4
collected 8 items  
...
============================================================ short test summary info =============================================================
ERROR t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
FAILED t.py::test_B - AssertionError: assert equals failed
FAILED t.py::test_C - Exception: ('spam', 'eggs')
FAILED t.py::test_D - ZeroDivisionError: division by zero
PASSED t.py::test_A
PASSED t.py::test_E
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[6*9-42]

Report output when pytest-sugar is installed:

$ python3 -m pytest -rA t.py
Test session starts (platform: linux, Python 3.6.7, pytest 4.5.0, pytest-sugar 0.9.2)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest/.hypothesis/examples')
rootdir: /home/micek/na_akkor_ujra/backup/playground/python/play_with_pytest
plugins: xdist-1.28.0, sugar-0.9.2, repeat-0.8.0, icdiff-0.2, forked-1.0.2, hypothesis-4.23.4
collecting ... 
...
============================================================ short test summary info =============================================================
FAILED t.py::test_B - AssertionError: assert equals failed
FAILED t.py::test_C - Exception: ('spam', 'eggs')
FAILED t.py::test_D - ZeroDivisionError: division by zero
FAILED t.py::test_E - AssertionError: assert 'aaa' == 'bbb'
PASSED t.py::test_A
PASSED t.py::test_A
PASSED t.py::test_A
PASSED t.py::test_B
PASSED t.py::test_B
PASSED t.py::test_C
PASSED t.py::test_C
PASSED t.py::test_D
PASSED t.py::test_D
PASSED t.py::test_E
PASSED t.py::test_E
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[3+5-8]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[2+4-6]
PASSED t.py::test_F[6*9-42]
PASSED t.py::test_F[6*9-42]
PASSED t.py::test_F[6*9-42]

Results (0.05s):
       5 passed
       3 failed
         - t.py:4 test_B
         - t.py:7 test_C
         - t.py:10 test_D
       1 error

My example test file:

import pytest

def test_A():
    assert "aa" == "aa"

def test_B():
    assert "aa" == "bb"  # fails in the call phase

def test_C():
    raise Exception('spam', 'eggs')  # fails in the call phase

def test_D():
    a = 7
    b = 0
    a / b  # ZeroDivisionError in the call phase

def stop():
    assert "aaa" == "bbb"  # failing finalizer -> error in the teardown phase

def test_E(request):
    request.addfinalizer(stop)  # the call itself passes

@pytest.mark.parametrize("test_input,expected", [("3+5", 8), ("2+4", 6), ("6*9", 42)])
def test_F(test_input, expected):
    assert True

mitzkia avatar Jun 01 '19 08:06 mitzkia

I can understand why there are so many testcases in the report: it counts call, setup and teardown for each testcase. But there is also an outcome change from ERROR t.py::test_E - AssertionError: assert 'aaa' == 'bbb' to FAILED t.py::test_E - AssertionError: assert 'aaa' == 'bbb', which I think is more problematic.
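
One way to address both problems (the duplicated PASSED lines and the ERROR/FAILED mix-up) is to filter on report.when, mirroring how pytest's built-in terminal reporter categorizes reports: only the call phase counts as a pass or a failure, while a failed setup/teardown report is an error. A minimal sketch, assuming the summary code receives the TestReport objects:

def categorize(report):
    # Passes from the setup/teardown phases are not interesting on their own,
    # so only count the call phase; otherwise every test shows up three times.
    if report.passed:
        return "passed" if report.when == "call" else None
    # A failure outside the call phase is what pytest reports as ERROR.
    if report.failed:
        return "failed" if report.when == "call" else "error"
    if report.skipped:
        return "skipped"
    return None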

mitzkia avatar Jun 01 '19 08:06 mitzkia

Oh, now I understand you: maybe we do not need this new "--sugar-verbose" option at all and can just enable this feature when -ra is given in the arguments. I will think about this.

mitzkia avatar Jun 01 '19 08:06 mitzkia

@blueyed Thanks for the note, I have removed the option; the feature can now be enabled with the native "-ra" arg.

mitzkia avatar Jun 01 '19 09:06 mitzkia

Correct me if I am wrong: without fixing the CI failures, my PR cannot be merged. As far as I saw, there are already some PR(s) to fix those issues. My question is: how can I help move my PR forward? Can I do some reviewing or testing (and which PR(s) would fix the CI issues)?

mitzkia avatar Jun 09 '19 09:06 mitzkia

@mitzkia we have to wait for @frozenball here I assume.

blueyed avatar Jun 09 '19 10:06 blueyed

Ok, thank you for the answer.

mitzkia avatar Jun 09 '19 10:06 mitzkia

Actually I can merge things here, and CI should be fixed after merging https://github.com/Frozenball/pytest-sugar/pull/156 (which was approved / got no more feedback).

In general I am not using pytest-sugar myself by default... (so do not expect too much help / reviewing from me here)

blueyed avatar Jun 09 '19 10:06 blueyed

... so apparently https://github.com/Frozenball/pytest-sugar/pull/156 was stalled for too long, and CI is still broken with it now: https://travis-ci.org/Frozenball/pytest-sugar/builds/543349387

Do you feel like fixing it? (from a quick look it appears there is an issue with xdist, and some other dep compatibility at least) From my point of view we would not really have to support pytest30 for example (if that's an issue), but it would be great to get feedback / opinion from @Frozenball as the owner here.

blueyed avatar Jun 09 '19 10:06 blueyed

Thanks for your answer. I would say let's wait a while for @Frozenball's answer. Soon I will check the broken CI and #156.

mitzkia avatar Jun 09 '19 10:06 mitzkia

I am trying to fix it quickly in https://github.com/Frozenball/pytest-sugar/pull/177 already.

blueyed avatar Jun 09 '19 10:06 blueyed

Merged https://github.com/Frozenball/pytest-sugar/pull/177, so you are able to rebase your PR(s) at least for now.

blueyed avatar Jun 09 '19 10:06 blueyed

Thanks :), I will do it.

mitzkia avatar Jun 09 '19 10:06 mitzkia

Thanks again, I will fix the CI fail.

mitzkia avatar Jun 09 '19 10:06 mitzkia

Hey 👋

Thank you for your PR! I am going through the old PRs in this repository and closing them since a long time has passed since the opening and they may not be that relevant anymore. If you still wish to continue with this pull request, let me know and we can re-open it.

Teemu

Teemu avatar Nov 08 '22 15:11 Teemu