In the "**Runs**" tab, the real status for tests with the **@mark.xfail** mark is not displayed
**Describe the bug**
In the "Runs" tab, the real status of tests with the `@mark.xfail` mark is not displayed.

**Precondition**
Create a test and mark it with `@mark.xfail`.

**To Reproduce**
Steps to reproduce:
- Run the test with the mark @mark.xfail
- Check the test result in the console
- Go to the testomat.io website
- Go to the Runs tab
- Open the last launched test
- Check how the status of the test result is displayed
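A minimal test to reproduce this would look like the following (hypothetical test name; the reason string is the one from the examples later in this thread):

```python
import pytest

# A test that is expected to fail; since its body actually passes,
# pytest reports it as XPASS ("xpassed") instead of plain "passed".
@pytest.mark.xfail(reason="bug in a 3rd party library")
def test_known_bug():
    assert 1 + 1 == 2  # passes, so pytest reports XPASS
```

Running `pytest -rX` on this file shows the `XPASS` summary line in the console, while the "Runs" tab shows the test as plain "Passed".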
**Expected behavior**
The test status should be "xpassed", along with a display of the reason, similar to how the reason is displayed for failed tests.
**Desktop:**
- Python 3.10.4
- Pytest 7.4.3
- PyCharm 2023.2.3 (Professional Edition)
@BohdanObruch what did you use to report the results to testomat.io? Please give a command example from the terminal.
I used the pytest-analyzer plugin;
`mark.skip` appears in the report, but `mark.xfail` does not.
I use the `pytest --analyzer sync` command.
And what is somewhat interesting: if I run the tests from the console with the command
`set TESTOMATIO=api_key; pytest --analyzer sync`, the TESTOMATIO variable set this way is for some reason not picked up and an error is shown; it only works when I add the variable manually to the PC environment variables.
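As an aside on that one-liner: how the variable gets set depends on the shell, and which shell produced the error isn't stated here, so the following is a sketch of the common variants (the first line uses `printf` in place of the real `pytest --analyzer sync` call, just to demonstrate the value):

```shell
# POSIX sh / bash / Git Bash: the prefix form exports the variable
# for this single command only
TESTOMATIO=api_key sh -c 'printf "TESTOMATIO=%s\n" "$TESTOMATIO"'

# cmd.exe: a semicolon is NOT a command separator, so
#   set TESTOMATIO=api_key; pytest --analyzer sync
# assigns "api_key; pytest --analyzer sync" as the variable's value and
# never runs pytest. Chain with && instead (no space before &&, or the
# trailing space becomes part of the value):
#   set TESTOMATIO=api_key&& pytest --analyzer sync

# PowerShell: `set` does not set environment variables; use $env:
#   $env:TESTOMATIO = "api_key"; pytest --analyzer sync
```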
Will check and fix on the weekend.
@BohdanObruch sorry for the delay, I was sick. I've looked through your issue and cannot figure out how to help you. There is no such status in testomat.io as "xpassed"; there are only 3. @poliarush please correct me if I am wrong.
Hello, thank you for your reply Oleksii.
I think you mean these statuses, but what about this pytest mark? Displaying it with the "Passed" status in testomat.io is not correct.
Do you suggest failing this test in testomat.io?
PyCharm considers it passed.
- In the photo, you run the test by clicking the launch icon (the triangle next to the test name) or by clicking on the file; I will show you the command to run the tests through the console: `pytest ...`
- I added a photo of what this status looks like in Allure; there it is clear what kind of status it is and why exactly (the mark displayed is `@pytest.mark.xfail`)

With the launch method you indicated, only the passed or failed status will be displayed, regardless of the pytest marks in the test.
@Ypurek @BohdanObruch from the testomat.io point of view there are only 3 statuses:
- passed
- failed
- skipped
The tricky part is how to treat xfail and xpass:
- xfail = failed or passed?!
- xpass = passed or failed?!
I think the mapping from pytest statuses to testomat.io statuses should be:
- pass = passed
- fail = failed
- skip = skipped
- skipif = skipped
- xfail = failed
- xpass = passed
However, I suggest putting a message on the test case for the following statuses so it's visible in the list:
- skip, reason if available, example `pytest.mark.skip("all tests still WIP")`
- skipif, reason if available, example `pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")`
- xfail, reason if available, example "Expected failure with `pytest.mark.xfail(sys.platform == "win32", reason="bug in a 3rd party library")`"
- xpass, example "Test passed but expected to fail with `pytest.mark.xfail(reason="bug in a 3rd party library")`"
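The proposed mapping plus messages could be sketched roughly like this (hypothetical function name; the inputs are the pytest report outcome and the reason string pytest stores in the `wasxfail` attribute on reports of `@pytest.mark.xfail` tests):

```python
# Sketch: map a pytest outcome to one of testomat.io's 3 statuses,
# plus an optional message for the statuses listed above.
# Assumptions: `outcome` is the pytest report outcome ("passed",
# "failed", "skipped"); `wasxfail` is the xfail reason string, or None
# for tests without the xfail mark; `skip_reason` is the skip/skipif
# reason, if any.
def map_status(outcome, wasxfail=None, skip_reason=None):
    if wasxfail is not None:
        if outcome == "skipped":  # pytest reports an xfailed test as a skip
            return "failed", f"Expected failure: {wasxfail}"
        # the test ran and passed despite the xfail mark -> xpass
        return "passed", f"Test passed but expected to fail: {wasxfail}"
    if outcome == "skipped" and skip_reason:
        return "skipped", skip_reason
    return outcome, None
```

For example, `map_status("skipped", wasxfail="bug in a 3rd party library")` yields the `failed` status with an "Expected failure: ..." message, matching the xfail row of the table above.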
@DavertMik I think we need to introduce labels and custom fields for testrun
@poliarush thank you, I agree with you
related to https://github.com/testomatio/app/issues/881
@BohdanObruch also, I've created a new item to introduce labels and custom fields for test runs, so you can then assign the necessary labels to test runs and filter by them with some statistics.
Closed in favor of https://github.com/testomatio/app/issues/881
related to https://github.com/testomatio/moonactive-issues/issues/208