test-drive
Adding output to JUnit.xml with JUnit schema files.
I have attached two changed source files (replacing src/testdrive.f90 and test/main.f90) as well as the resulting JUnit.xml file that is written along with the screen output. This xml does the job for us in an Azure DevOps pipeline, based on the same data that is also written to the screen. The main.f90 uses four additional external subroutines from testdrive.f90: two to open and close the JUnit.xml file and two to open and close the enclosing xml tag (see the sketch below). The rest of the xml output is handled internally in testdrive.f90. I assume that you would prefer to avoid these additional subroutines, but I chose a simple approach with minimal code changes instead of a major rework of the module structure.
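For illustration, a minimal sketch of how the driver in test/main.f90 might call the four additional subroutines. The names junit_open, junit_close, testsuite_open and testsuite_close (and the example suite test_suite1/collect_suite1) are placeholders for this sketch, not necessarily the identifiers used in the attached files; run_testsuite, new_testsuite and testsuite_type are the existing test-drive API:

```fortran
program tester
  use, intrinsic :: iso_fortran_env, only : error_unit
  use testdrive, only : run_testsuite, new_testsuite, testsuite_type
  !> Hypothetical additional entry points for the JUnit output (placeholder names)
  use testdrive, only : junit_open, junit_close, testsuite_open, testsuite_close
  use test_suite1, only : collect_suite1
  implicit none
  integer :: stat, is
  type(testsuite_type), allocatable :: testsuites(:)

  stat = 0
  testsuites = [new_testsuite("suite1", collect_suite1)]

  call junit_open("JUnit.xml")                  ! XML header and opening <testsuites> tag
  do is = 1, size(testsuites)
    call testsuite_open(testsuites(is)%name)    ! opening <testsuite> tag
    call run_testsuite(testsuites(is)%collect, error_unit, stat)
    call testsuite_close()                      ! closing </testsuite> tag
  end do
  call junit_close()                            ! closing </testsuites> tag and the file

  if (stat > 0) then
    write(error_unit, '(i0, 1x, a)') stat, "test(s) failed!"
    error stop
  end if
end program tester
```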
In addition, the JUnit.xml file references an xml schema (JUnit.xsd from https://github.com/windyroad/JUnit-Schema). In theory the JUnit.xml should conform to this schema, but it does not, for several reasons:
- There are missing (required) xml tag attributes, like the number of all/failed/succeeded/skipped tests per testsuite; these are summarized data that are not available at the time the corresponding opening xml tag is written.
- There are missing xml tag attributes, like execution time, that are not stored in the code (as far as I could see).
- There is no corresponding xml tag in the schema file for the additional stdout/stderr output per testcase. Therefore, I added corresponding xml tags (<system-out>/<system-err>) per testcase, which are, following the xsd, only allowed once per testsuite (see the sketch at the end of this comment).

Fortunately, Azure DevOps does not care about the discrepancy, so we are fine with this setup (but it is not a generic solution). I have also attached another xml schema file (jenkins-junit.xsd from https://github.com/junit-team/junit5, i.e. https://junit.org/junit5/). To be honest, I am not sure what should be regarded as the standard for a JUnit.xml. In fact, I do not care much, since the Azure DevOps test pipeline works fine. Others, however, may have a more specific opinion on the subject.
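For illustration only (not the literal content of the attached file), the generated document roughly has the following shape. The per-testcase <system-out> elements are the part that deviates from the windyroad xsd, which allows them only once per testsuite, and the summary attributes required by the xsd are the ones that are missing:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <!-- required summary attributes such as tests="..." failures="..." time="..." are missing -->
  <testsuite name="suite1">
    <testcase classname="suite1" name="valid">
      <system-out>Starting valid ... (1/2) [PASSED]</system-out>
    </testcase>
    <testcase classname="suite1" name="invalid">
      <failure message="Custom check failed"/>
      <system-out>Starting invalid ... (2/2) [FAILED]</system-out>
    </testcase>
  </testsuite>
</testsuites>
```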
Let me know if you want to discuss my changes or whether I can support you. Thank you for considering my changes.
Interesting. The JUnit.xml file could actually also be used in the GitLab CI/CD. The currently generated xml file is already recognized by GitLab (there are still a few issues with the suite name not being recognized by GitLab). Execution time would be useful too.
Thanks, this looks pretty straightforward to add and I would be happy to accept this in test-drive. One general comment: the generation of the JUnit.xml is incremental, which means that if the test run is aborted, the partly generated JUnit.xml file is always invalid. Could we have a process which guarantees that the JUnit.xml file can only be written completely?
Good point. Originally, I tried to use a FINAL statement for the module to make sure that the final closing tag is written and the xml file is closed, even in the error case. However, I failed; my final subroutine was never executed. In any case, the nested structure of the xml file requires either tracking the current depth when ABORT, ERROR, ... occurs, or using one FINAL statement per interface of the module. What do you think?
FYI: https://fortran-lang.discourse.group/t/when-a-final-subroutine-is-called/3246
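For illustration, a minimal sketch of what such a FINAL-based cleanup could look like (the type and procedure names here are made up, not taken from the actual code). As the linked thread explains, module variables are not finalized at program termination, which would match the observation above:

```fortran
module junit_output
  implicit none
  private
  public :: junit_file

  type :: junit_file
    integer :: unit = -1
    integer :: depth = 0          ! nesting depth of currently open tags
  contains
    final :: close_junit
  end type junit_file

contains

  !> Close any still-open tags and the file when the object is finalized.
  !> Note: finalization is triggered when an object would become undefined
  !> (e.g. a local variable leaving its scope); module variables are not
  !> finalized at program termination, so this alone does not cover aborts.
  subroutine close_junit(self)
    type(junit_file), intent(inout) :: self
    integer :: i
    if (self%unit /= -1) then
      do i = self%depth, 1, -1
        write(self%unit, '(a)') '</testsuite>'
      end do
      write(self%unit, '(a)') '</testsuites>'
      close(self%unit)
      self%unit = -1
    end if
  end subroutine close_junit

end module junit_output
```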
@awvwgk: With respect to your change request:
Thanks, this looks pretty straightforward to add and I would be happy to accept this in test-drive. One general comment: the generation of the JUnit.xml is incremental, which means that if the test run is aborted, the partly generated JUnit.xml file is always invalid. Could we have a process which guarantees that the JUnit.xml file can only be written completely?
A delayed output of the file at the end of the application is imho not a reasonable option (see https://github.com/fortran-lang/test-drive/pull/27#discussion_r1315170193). I can try to delete the file if it could not be written completely in case of an error. Please let me know your thoughts.
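A minimal sketch of the delete-on-error idea, assuming the writer keeps the logical unit of the open JUnit.xml around (junit_abort is a made-up name, not part of the current code):

```fortran
!> Abandon the JUnit output: close the unit and remove the incomplete file,
!> so an aborted test run never leaves an invalid JUnit.xml behind.
subroutine junit_abort(unit)
  integer, intent(in) :: unit
  logical :: opened

  inquire(unit=unit, opened=opened)
  if (opened) close(unit, status="delete")
end subroutine junit_abort
```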
My proposal to store the XML output in an optional variable and write it to a file after running all tests could solve this issue.
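As a rough sketch of that idea (module and procedure names are invented for illustration): accumulate the XML fragments in an allocatable character buffer while the tests run and only open and write JUnit.xml once everything has finished, so an aborted run simply never produces the file:

```fortran
module junit_buffer
  implicit none
  private
  public :: append_xml, flush_xml

  !> The whole document is collected here instead of being written incrementally
  character(len=:), allocatable :: buffer

contains

  !> Accumulate one line of XML output in memory
  subroutine append_xml(line)
    character(len=*), intent(in) :: line
    if (.not. allocated(buffer)) buffer = ""
    buffer = buffer // line // new_line("a")
  end subroutine append_xml

  !> Write the complete document in one go after all tests have run
  subroutine flush_xml(filename)
    character(len=*), intent(in) :: filename
    integer :: unit
    if (.not. allocated(buffer)) return
    open(newunit=unit, file=filename, status="replace", action="write")
    write(unit, '(a)', advance="no") buffer
    close(unit)
  end subroutine flush_xml

end module junit_buffer
```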