pytest-relaxed
Two tests fail when warnings are present in the test run
The following two tests fail if there are any warnings in the test run:
test_shows_tests_nested_under_classes_without_files
test_tests_are_colorized_by_test_result
with failure output:
> assert "== 1 failed, 4 passed, 1 skipped in " in output
E AssertionError: assert '== 1 failed, 4 passed, 1 skipped in ' in '============================= test session starts ==============================\nplatform freebsd13 -- Python.../en/latest/warnings.html\n========== 1 failed, 4 passed, 1 skipped, 1 warnings in 0.28 seconds ==========='
=================================== FAILURES ===================================
_________________________ OtherBehaviors.behavior_four _________________________
self = <other_behaviors.OtherBehaviors instance at 0x80467f9e0>
def behavior_four(self):
> assert False
E AssertionError
other_behaviors.py:13: AssertionError
=============================== warnings summary ===============================
behaviors.py::Behaviors::behavior_one
/usr/local/lib/python2.7/site-packages/pytest_relaxed/reporter.py:79: UserWarning: Argument(s) ('config',) which are declared in the hookspec can not be found in this hook call
cat, letter, word = status_getter(report=report)
-- Docs: https://docs.pytest.org/en/latest/warnings.html
========== 1 failed, 4 passed, 1 skipped, 1 warnings in 0.28 seconds ===========
I tried --disable-warnings for the test run, but it does not appear to apply to the captured output.
The tests should either account for the possibility of N warnings appearing in the output, or ignore warnings entirely by dropping the trailing " in " from the asserted summary string.
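If you'd rather account for warnings than ignore them, one possible approach (a sketch, not code from the plugin's test suite; the `assert_summary` helper name is hypothetical) is to match the summary line with a regex that tolerates an optional warnings count:

```python
import re

# Matches "== 1 failed, 4 passed, 1 skipped in ..." with or without an
# intervening ", N warnings" segment, as pytest emits when warnings occur.
SUMMARY = re.compile(r"== 1 failed, 4 passed, 1 skipped(, \d+ warnings?)? in ")

def assert_summary(output):
    # Hypothetical replacement for the literal substring assertion.
    assert SUMMARY.search(output)
```

This keeps the stricter check that a timing suffix follows the counts, which the patch below gives up.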
The following patch fixes the issue for me, if you decide to go that way (ignoring warnings rather than accounting for them):
--- tests/test_display.py.orig 2019-06-14 18:05:29 UTC
+++ tests/test_display.py
@@ -18,7 +18,7 @@ def _expect_regular_output(testdir):
assert "== FAILURES ==" in output
assert "AssertionError" in output
# Summary
- assert "== 1 failed, 4 passed, 1 skipped in " in output
+ assert "== 1 failed, 4 passed, 1 skipped" in output
class TestRegularFunctions:
@@ -170,7 +170,7 @@ OtherBehaviors
assert "== FAILURES ==" in output
assert "AssertionError" in output
# Summary
- assert "== 1 failed, 4 passed, 1 skipped in " in output
+ assert "== 1 failed, 4 passed, 1 skipped" in output
def test_tests_are_colorized_by_test_result( # noqa: F811,E501
self, testdir, environ
@@ -225,7 +225,7 @@ OtherBehaviors
assert "== FAILURES ==" in output
assert "AssertionError" in output
# Summary
- assert "== 1 failed, 4 passed, 1 skipped in " in output
+ assert "== 1 failed, 4 passed, 1 skipped" in output
def test_nests_many_levels_deep_no_problem(self, testdir):
testdir.makepyfile(