
`TestWatcher.testSkipped()` should be invoked for tests not executed due to failures in `@BeforeAll`

Open joerg1985 opened this issue 11 months ago • 6 comments

I am using the TestWatcher to collect information for reporting the status of tests by implementing testDisabled, testSuccessful, testAborted, and testFailed. This works fine as long as there is no failure inside a @BeforeAll or postProcessTestInstance method; when such a failure occurs, none of the TestWatcher methods is called and the test is effectively lost.
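For context, a minimal sketch of such a watcher (the class name and the reporting backend are made up for illustration):

```java
import java.util.Optional;

import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.api.extension.TestWatcher;

// Hypothetical reporting extension: records one status per executed test.
class ReportingWatcher implements TestWatcher {

    @Override
    public void testDisabled(ExtensionContext context, Optional<String> reason) {
        report(context, "DISABLED", reason.orElse(""));
    }

    @Override
    public void testSuccessful(ExtensionContext context) {
        report(context, "SUCCESSFUL", "");
    }

    @Override
    public void testAborted(ExtensionContext context, Throwable cause) {
        report(context, "ABORTED", String.valueOf(cause));
    }

    @Override
    public void testFailed(ExtensionContext context, Throwable cause) {
        report(context, "FAILED", String.valueOf(cause));
    }

    private void report(ExtensionContext context, String status, String detail) {
        // Placeholder for a real reporting backend.
        System.out.println(context.getDisplayName() + " -> " + status + " " + detail);
    }
}
```

When @BeforeAll or postProcessTestInstance fails, none of these callbacks is invoked for the test methods of that class.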

To detect this state, a lot of other interfaces must be implemented, and information has to be combined. It would be better to have a complete view of the test results using only the TestWatcher.

Deliverables

  • [ ] TestWatcher.testSkipped(ExtensionContext context, ExtensionContext origin, Throwable reason) should be invoked for tests not executed at all
    • ExtensionContext origin argument with details about the failed ExtensionContext causing the test to get skipped
    • Throwable reason argument with details about the failure
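A rough sketch of what the proposed callback could look like (this only illustrates the deliverable above; it is not existing JUnit 5 API, and the name and placement are up for discussion):

```java
import org.junit.jupiter.api.extension.ExtensionContext;

// Illustration of the proposed addition to TestWatcher; this interface does
// not exist in JUnit 5 today.
interface ProposedTestWatcherAddition {

    /**
     * Would be invoked for tests that are never executed because an earlier
     * lifecycle callback (e.g. @BeforeAll) or extension failed.
     *
     * @param context the context of the test that was not executed
     * @param origin  the context in which the causing failure occurred
     * @param reason  the failure that caused the test to be skipped
     */
    void testSkipped(ExtensionContext context, ExtensionContext origin, Throwable reason);
}
```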

joerg1985 avatar Jan 07 '25 13:01 joerg1985

You are referring to testDisabled, not testSkipped, right?

If so, this works as documented:

https://github.com/junit-team/junit5/blob/ee52c64336ad3a079d598bd1c3884bafbc285047/junit-jupiter-api/src/main/java/org/junit/jupiter/api/extension/TestWatcher.java#L29-L36

Have you considered registering a TestExecutionListener instead of using TestWatcher? It would receive a call to executionFinished with a failed result for the test class and you could then inspect its children via the TestPlan supplied to testPlanExecutionStarted.
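For completeness, a hedged sketch of that approach (the class name and reporting output are illustrative; it assumes no child was executed before the container failed, which holds for @BeforeAll failures):

```java
import org.junit.platform.engine.TestExecutionResult;
import org.junit.platform.launcher.TestExecutionListener;
import org.junit.platform.launcher.TestIdentifier;
import org.junit.platform.launcher.TestPlan;

// When a container (e.g. a test class whose @BeforeAll failed) finishes with
// a failed result, report its descendant tests as not executed.
class SkippedChildrenListener implements TestExecutionListener {

    private TestPlan testPlan;

    @Override
    public void testPlanExecutionStarted(TestPlan testPlan) {
        this.testPlan = testPlan;
    }

    @Override
    public void executionFinished(TestIdentifier identifier, TestExecutionResult result) {
        if (identifier.isContainer() && result.getStatus() == TestExecutionResult.Status.FAILED) {
            // The container itself failed, so its children never ran.
            testPlan.getDescendants(identifier).stream()
                    .filter(TestIdentifier::isTest)
                    .forEach(test -> System.out.println(
                            test.getDisplayName() + " was not executed: "
                                    + result.getThrowable().map(Throwable::toString).orElse("unknown failure")));
        }
    }
}
```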

marcphilipp avatar Jan 08 '25 10:01 marcphilipp

@marcphilipp I am not referring to testDisabled; the feature request is about a new method testSkipped (or better, testBlocked?). Using a TestExecutionListener is possible and is my current workaround, but it adds a lot of complexity.

Implementing a TestWatcher is much easier and should, in my mind, cover all the states of all tests; otherwise people might miss some tests, which are then, e.g., not included in their reporting.

joerg1985 avatar Jan 08 '25 12:01 joerg1985

Team decision: Waiting for additional interest from the community

marcphilipp avatar Jan 17 '25 11:01 marcphilipp

@sureshg @Artem034 @BlueIce @DanmyWei @jmrg82 @vyarosh @simonovdenis @serrss @tcfurrer @xkamil @sbrannen @igor-dmitriev @sureshg You all liked (👍) the original feature, so you might be interested in this change to it? If so, it would be great to have some feedback. Thanks in advance.

joerg1985 avatar Jan 28 '25 21:01 joerg1985

It would be clearer and more helpful to have tests in the report that were expected to run but were not executed due to issues in @BeforeAll. I would also like to see them marked as skipped in the report.

Artem034 avatar Feb 03 '25 11:02 Artem034

The issue description provided by @joerg1985 perfectly matches our use case.

We launch a cluster of Apache Ignite nodes in @BeforeAll, which can fail because of configuration errors or insufficient resources on a CI runner. Since we rely on a custom test reporting solution that uses a TestWatcher under the hood, it’s challenging to report these failures properly. Adding testSkipped() would certainly make our lives easier in this scenario.
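To make the scenario concrete, a stripped-down illustration (class and method names are made up; ReportingWatcher refers to the hypothetical watcher sketched earlier in this thread):

```java
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

// When startCluster() throws, neither test method is executed and the
// registered TestWatcher receives no callback for them, so both tests are
// missing from the custom report.
@ExtendWith(ReportingWatcher.class)
class ClusterIT {

    @BeforeAll
    static void startCluster() {
        // e.g. launching Apache Ignite nodes; may fail due to configuration
        // errors or insufficient resources on a CI runner
        throw new IllegalStateException("could not start cluster");
    }

    @Test
    void readsData() {
    }

    @Test
    void writesData() {
    }
}
```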

@marcphilipp @joerg1985, I’m ready to take a stab at this to help move it forward.

vdmitrienko avatar Feb 23 '25 13:02 vdmitrienko

@sbrannen @marcphilipp I stumbled into this issue again in another project, and to me it looks like it has some interest from the community.

Would you like to discuss this internally again or is this issue open for PRs?

joerg1985 avatar Aug 29 '25 09:08 joerg1985

@vdmitrienko @joerg1985 Thanks for offering to work on this. However, since this issue has a single upvote, we're not looking for a PR at this point.

marcphilipp avatar Sep 06 '25 11:09 marcphilipp