junit-plugin
java.lang.OutOfMemoryError: Java heap space when publishing large surefire reports
Jenkins and plugins versions report
Environment
Jenkins 2.263.4
What Operating System are you using (both controller, and any agents involved in the problem)?
RHEL 7
Reproduction steps
- Create a project that generates very large surefire reports
- Try to publish these using the junitPublisher
Expected Results
A successful run with the test results attached
Actual Results
A "successful" run. However, when looking into the console logs, we see this stacktrace:
ERROR: [withMaven] WARNING Exception executing Maven reporter 'Junit Publisher' / org.jenkinsci.plugins.pipeline.maven.publishers.JunitTestsPublisher. Please report a bug associated for the component 'pipeline-maven-plugin' at https://issues.jenkins-ci.org
java.io.IOException: Remote call on *** (***) failed
at hudson.remoting.Channel.call(Channel.java:1007)
at hudson.FilePath.act(FilePath.java:1157)
at hudson.FilePath.act(FilePath.java:1146)
at hudson.tasks.junit.JUnitParser.parseResult(JUnitParser.java:107)
at hudson.tasks.junit.JUnitResultArchiver.parse(JUnitResultArchiver.java:149)
at hudson.tasks.junit.JUnitResultArchiver.parseAndAttach(JUnitResultArchiver.java:180)
at org.jenkinsci.plugins.pipeline.maven.publishers.JunitTestsPublisher.executeReporter(JunitTestsPublisher.java:324)
at org.jenkinsci.plugins.pipeline.maven.publishers.JunitTestsPublisher.process(JunitTestsPublisher.java:211)
at org.jenkinsci.plugins.pipeline.maven.MavenSpyLogProcessor.processMavenSpyLogs(MavenSpyLogProcessor.java:153)
at org.jenkinsci.plugins.pipeline.maven.WithMavenStepExecution2$WithMavenStepExecutionCallBack.finished(WithMavenStepExecution2.java:1097)
at org.jenkinsci.plugins.workflow.steps.GeneralNonBlockingStepExecution$TailCall.lambda$onSuccess$0(GeneralNonBlockingStepExecution.java:140)
at org.jenkinsci.plugins.workflow.steps.GeneralNonBlockingStepExecution.lambda$run$0(GeneralNonBlockingStepExecution.java:77)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.OutOfMemoryError: Java heap space
at com.sun.org.apache.xerces.internal.xni.XMLString.append(XMLString.java:246)
at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.scanData(XMLEntityScanner.java:1330)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanCDATASection(XMLDocumentFragmentScannerImpl.java:1653)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3013)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:601)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:112)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:504)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:841)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:770)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:642)
at org.dom4j.io.SAXReader.read(SAXReader.java:494)
at org.dom4j.io.SAXReader.read(SAXReader.java:392)
at hudson.tasks.junit.SuiteResult.parse(SuiteResult.java:178)
at hudson.tasks.junit.TestResult.parse(TestResult.java:378)
at hudson.tasks.junit.TestResult.parsePossiblyEmpty(TestResult.java:308)
at hudson.tasks.junit.TestResult.parse(TestResult.java:224)
at hudson.tasks.junit.TestResult.parse(TestResult.java:196)
at hudson.tasks.junit.TestResult.<init>(TestResult.java:151)
at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:144)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:3313)
at hudson.remoting.UserRequest.perform(UserRequest.java:211)
at hudson.remoting.UserRequest.perform(UserRequest.java:54)
at hudson.remoting.Request$2.run(Request.java:375)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:73)
... 4 more
Anything else?
Note that the pipeline-maven-plugin catches all exceptions and logs them to the console. As a result, the build is reported as successful even though the junitPublisher failed and there could, in reality, be failed tests.
Have you got an example that can reproduce this?
I don't think this is something any program can prevent: if you feed too large an input into a parser, at some point the parser will occupy too much memory. The only solution to this problem is to increase the heap size of your Maven step.
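As a sketch of the heap increase suggested above (`mavenOpts` is a documented `withMaven` option; 4g is an arbitrary example value, and note that the stack trace shows the parse happening inside a remote `FilePath.act` call, so the agent JVM's own heap may be the relevant knob rather than Maven's):

```groovy
// Sketch: raise Maven's heap inside the withMaven step
withMaven(mavenOpts: '-Xmx4g') {
    sh 'mvn verify'
}

// The surefire XML is parsed on the agent, not inside the Maven process,
// so the agent JVM may also need a larger heap when it is launched, e.g.:
//   java -Xmx4g -jar agent.jar ...
```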
@timja I couldn't get a working minimal example using withMaven. However, this triggers a similar "Java heap space" error:
pipeline {
    agent { label "rhel7" }
    stages {
        stage('Generate surefire report') {
            steps {
                sh """
                    mkdir -p target/surefire-reports
                    echo '<?xml version="1.0" encoding="UTF-8"?>
                    <testsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="https://maven.apache.org/surefire/maven-surefire-plugin/xsd/surefire-test-report-3.0.xsd" version="3.0" name="junit-oom-test" time="171.077" tests="1" errors="0" skipped="0" failures="0">
                    <testcase name="runCucumber" classname="junit-oom-test" time="171.075">
                    <system-out><![CDATA[' > target/surefire-reports/surefire-report.xml
                    base64 /dev/urandom | head -c 1000000000 >> target/surefire-reports/surefire-report.xml
                    echo '
                    ]]></system-out>
                    </testcase>
                    </testsuite>' >> target/surefire-reports/surefire-report.xml
                """
            }
        }
    }
    post {
        always {
            junit 'target/**/*.xml'
        }
    }
}
@uhafner That depends on how the parser works. A parser that works in a streaming fashion should be able to deal with large files.
I understand, however, that in this case an OOM is difficult to prevent, since the report contains one big string in a single XML node ("system-out").
Reducing the amount of logging on our end would help, but this is not always acceptable, unfortunately.
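As a stopgap, oversized reports can at least be detected before the junit step runs. A sketch (the 100 MB threshold and the `surefire-reports` path are illustrative assumptions):

```shell
# List surefire report files above an arbitrary size threshold (100 MB here),
# so reports with huge system-out blocks can be spotted before publishing.
find . -path '*/surefire-reports/*.xml' -size +100M -print
```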
I was facing this problem due to a very large number of files to find in the workspace. I was able to work around it by running a shell command to find all my TEST*.xml files, copying them into a single folder, and running junit against that. This was not a pipeline job but a freestyle job, and the error did kill the job.
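The copy-to-one-folder workaround described above can be sketched like this (the module directory names and sample files are purely illustrative):

```shell
# Illustrative workspace with reports scattered across modules
mkdir -p module-a/target/surefire-reports module-b/target/surefire-reports
echo '<testsuite/>' > module-a/target/surefire-reports/TEST-Foo.xml
echo '<testsuite/>' > module-b/target/surefire-reports/TEST-Bar.xml

# Workaround: collect every TEST*.xml into a single folder, then point the
# junit step (or freestyle publisher) at 'collected-reports/*.xml'
mkdir -p collected-reports
find . -name 'TEST*.xml' -not -path './collected-reports/*' -exec cp {} collected-reports/ \;

# collected-reports now contains TEST-Bar.xml and TEST-Foo.xml
ls collected-reports
```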
Related issue / enhancement: https://github.com/jenkinsci/junit-plugin/issues/478