
In the "**Runs**" tab, the real status for tests with the **@mark.xfail** mark is not displayed

BohdanObruch opened this issue 2 years ago • 14 comments

**Describe the bug**
In the "Runs" tab, the real status for tests with the `@mark.xfail` mark is not displayed.

**Precondition**
Create a test and add the `@mark.xfail` pytest mark to it.
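
A minimal sketch of such a test (the test body and reason text are illustrative placeholders, not taken from the original issue):

```python
import pytest


# A test marked as an expected failure; if it unexpectedly passes,
# pytest reports it as XPASS rather than a plain pass.
@pytest.mark.xfail(reason="known issue, expected to fail")
def test_known_issue():
    assert 1 + 1 == 2
```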

**To Reproduce**
Steps to reproduce:

  1. Run the test with the mark @mark.xfail
  2. Check the test result in the console
  3. Go to the testomat.io website
  4. Go to the Runs tab
  5. Open the last launched test
  6. Check the display of the status of the test result

**Expected behavior**
The test status should be "xpassed", together with the reason why, similar to how the reason is displayed for failed tests.

**Screenshots**
(two screenshots attached in the original issue)

Desktop (please complete the following information):

  • Python 3.10.4
  • Pytest 7.4.3
  • PyCharm 2023.2.3 (Professional Edition)

BohdanObruch avatar Nov 01 '23 07:11 BohdanObruch

@BohdanObruch what did you use to report the results to testomat.io? Please give an example command from the terminal.

poliarush avatar Nov 02 '23 14:11 poliarush

I used the pytest-analyzer plugin; `mark.skip` appears in the report, but `mark.xfail` does not (see screenshot).

BohdanObruch avatar Nov 02 '23 14:11 BohdanObruch

I use the `pytest --analyzer sync` command.

BohdanObruch avatar Nov 02 '23 14:11 BohdanObruch

And what is somewhat interesting: if I run the tests from the console with `set TESTOMATIO=api_key; pytest --analyzer sync`, for some reason the TESTOMATIO variable is not picked up and an error is shown; it only works when I add it manually to the PC's environment variables.
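
A hedged workaround sketch for the environment-variable problem, assuming the plugin reads TESTOMATIO from the process environment (the `api_key` value is a placeholder): setting the variable from Python before invoking pytest avoids shell-specific `set`/`export` syntax.

```python
import os

import pytest

# Placeholder value; the plugin is assumed to read TESTOMATIO from the environment.
os.environ["TESTOMATIO"] = "api_key"

# Equivalent to running `pytest --analyzer sync` from the terminal.
pytest.main(["--analyzer", "sync"])
```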

BohdanObruch avatar Nov 02 '23 14:11 BohdanObruch

Will check and fix on the weekend.

Ypurek avatar Nov 02 '23 17:11 Ypurek

@BohdanObruch sorry for the delay, I was sick. I've looked through your issue and cannot figure out how to help you. There is no such status in testomat.io as "xpassed"; there are only 3. @poliarush pls correct me if I am wrong.

Ypurek avatar Nov 20 '23 14:11 Ypurek

Hello, thank you for your reply, Oleksii. I think you mean these statuses, but what about this pytest mark? Displaying it with the "Passed" status in testomat.io is not correct (see screenshot).

BohdanObruch avatar Nov 20 '23 14:11 BohdanObruch

Do you suggest failing this test in testomat.io?

Ypurek avatar Nov 20 '23 15:11 Ypurek

(screenshot) PyCharm considers it as passed.

Ypurek avatar Nov 20 '23 15:11 Ypurek

  1. In the screenshot you ran the test by clicking the launch icon (the triangle next to the test name) or by clicking on the file, but I will show you the command for running the tests through the console: "pytest ..."
  2. I added a photo of what this status looks like in Allure; there it is clear what kind of status it is and why (the mark displayed is @pytest.mark.xfail) (see screenshot)

BohdanObruch avatar Nov 20 '23 16:11 BohdanObruch

With a launch like the one you showed there, regardless of the pytest marks in the test, only a status of either passed or failed will be displayed.

BohdanObruch avatar Nov 20 '23 16:11 BohdanObruch

@Ypurek @BohdanObruch from testomat.io's point of view there are only 3 statuses:

  • passed
  • failed
  • skipped

The tricky part is how to treat xfail and xpass:

  • xfail = failed or passed ?!
  • xpass = passed or failed ?!

I think the mapping from pytest's statuses to testomat.io's statuses should be like this (a rough sketch of the mapping follows after the list):

  • pass = passed
  • fail = failed
  • skip = skipped
  • skipif = skipped
  • xfail = failed
  • xpass = passed
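
A rough sketch of the proposed mapping as a lookup table (illustrative only, not the plugin's actual code; the status names follow the list above):

```python
# Proposed mapping from pytest outcomes to testomat.io's three statuses.
PYTEST_TO_TESTOMATIO = {
    "pass": "passed",
    "fail": "failed",
    "skip": "skipped",
    "skipif": "skipped",
    "xfail": "failed",
    "xpass": "passed",
}


def map_status(pytest_status: str) -> str:
    """Translate a pytest status into a testomat.io status (sketch)."""
    return PYTEST_TO_TESTOMATIO[pytest_status]
```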

However, I suggest putting a message on the test case for the following statuses so it's visible in the list (see the example test file after the list below):

(two screenshots attached)

  • skip, reason if available, example `pytest.mark.skip("all tests still WIP")`
  • skipif, reason if available, example `pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")`
  • xfail, reason if available, example "Expected failure with `pytest.mark.xfail(sys.platform == "win32", reason="bug in a 3rd party library")`"
  • xpass, example "Test passed but expected to fail with `pytest.mark.xfail(reason="bug in a 3rd party library")`"
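
For reference, a minimal pytest file exercising the marks from the examples above (the reasons and the platform check are taken from that list; the test bodies are illustrative):

```python
import sys

import pytest


@pytest.mark.skip(reason="all tests still WIP")
def test_skipped():
    assert True  # never executed, reported as skipped


@pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")
def test_skipped_on_windows():
    assert True


@pytest.mark.xfail(sys.platform == "win32", reason="bug in a 3rd party library")
def test_expected_failure_on_windows():
    # Expected to fail on Windows; reported as xfailed when it does.
    assert sys.platform != "win32"


@pytest.mark.xfail(reason="bug in a 3rd party library")
def test_unexpected_pass():
    # Passes even though it is marked xfail, so pytest reports it as xpassed.
    assert True
```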

@DavertMik I think we need to introduce labels and custom fields for testrun

poliarush avatar Nov 23 '23 20:11 poliarush

@poliarush thank you, I agree with you

BohdanObruch avatar Nov 24 '23 02:11 BohdanObruch

related to https://github.com/testomatio/app/issues/881

@BohdanObruch also, I've created a new item to introduce labels and custom fields for testruns, so you can assign the necessary labels to testruns and filter by them with some statistics.

poliarush avatar Nov 24 '23 18:11 poliarush

Closed in favor of https://github.com/testomatio/app/issues/881

DavertMik avatar Aug 19 '24 10:08 DavertMik

related to https://github.com/testomatio/moonactive-issues/issues/208

poliarush avatar Aug 19 '24 12:08 poliarush