OOM exception when running the buildHealth task with version 3.5.0
Hello and thanks for the great work. I found a possible issue after updating to the latest version of the plugin.
Plugin version: 3.5.0
Gradle version: 9.2.1
JDK version: 21
(Optional) Kotlin and Kotlin Gradle Plugin (KGP) version: kotlin("jvm") version "2.2.21"
Describe the bug
OOM exception when running the buildHealth task after updating to the latest version of the plugin (3.5.0) and running ./gradlew clean buildHealth. The task that fails is :synthesizeDependenciesTest. Reverting to version 3.4.1 makes it work again. I also checked that raising the heap size to org.gradle.jvmargs=-Xmx1024M seems to work. Nevertheless, I would not expect to have to adapt the heap size after a version upgrade.
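For reference, the heap-size workaround is this one-line change in gradle.properties (1024M is just the value I tried; other projects may need a different number):

    # gradle.properties: raise the Gradle daemon heap (workaround, not a fix)
    org.gradle.jvmargs=-Xmx1024M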
Additional context
Error while receiving file changes
net.rubygrapefruit.platform.NativeException: Caught java.lang.OutOfMemoryError with message: Java heap space
    at org.gradle.fileevents.internal.AbstractNativeFileEventFunctions$NativeFileWatcher.executeRunLoop0(Native Method)
[ERROR] [system.err] java.lang.OutOfMemoryError: Java heap space
2025-11-20T11:08:16.122+0200 [DEBUG] [sun.rmi.transport.tcp] RMI TCP Connection(13)-127.0.0.1: (port 59703) connection closed
2025-11-20T11:08:16.123+0200 [ERROR] [system.err]     at org.gradle.fileevents.internal.AbstractNativeFileEventFunctions$NativeFileWatcher.executeRunLoop0(Native Method)
2025-11-20T11:08:16.122+0200 [DEBUG] [sun.rmi.transport.tcp] RMI Scheduler(0): close connection, socket: Socket[addr=localhost/127.0.0.1,port=17615,localport=60340]
2025-11-20T11:08:16.122+0200 [DEBUG] [org.gradle.internal.operations.DefaultBuildOperationRunner] Completing Build operation 'com.autonomousapps.tasks.SynthesizeDependenciesTask$SynthesizeDependenciesWorkAction'
2025-11-20T11:08:16.123+0200 [DEBUG] [org.gradle.internal.operations.DefaultBuildOperationRunner] Build operation 'com.autonomousapps.tasks.SynthesizeDependenciesTask$SynthesizeDependenciesWorkAction' completed
2025-11-20T11:08:16.123+0200 [ERROR] [system.err]     at org.gradle.fileevents.internal.AbstractNativeFileEventFunctions$NativeFileWatcher.executeRunLoop(AbstractNativeFileEventFunctions.java:32)
Thanks for the report. My guess is that it has to do with this new feature in v3.5.0:
[Feat]: check binary compatibility between consumer and producer code.
You can test this, as I've added an opt-out for it, since it seemed plausible it might be expensive. Please run your build again, but this time with -Pdependency.analysis.check-binary-compat=false. If that succeeds, then that's good evidence the new feature is leading to increased heap usage.
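For example, reusing the invocation from the report above (adjust the task to whatever you normally run):

    ./gradlew clean buildHealth -Pdependency.analysis.check-binary-compat=false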
Additionally, would you mind sharing a heap dump with me? I'd love to take a look to see if I can find ways to reduce the resource consumption of the plugin. You could email it to me if you're concerned about sharing a heap dump in the open.
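If it helps, one low-effort way to capture a dump is to let the JVM write one automatically when the OOM occurs; this is a sketch that assumes the failing work runs in a process that picks up org.gradle.jvmargs, and the dump path is only an example:

    # gradle.properties: append to your existing org.gradle.jvmargs
    org.gradle.jvmargs=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/gradle-oom.hprof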
Adding -Pdependency.analysis.check-binary-compat=false worked for me as I'm facing the same issue.
I'd prefer not to adjust my heap size as this has traditionally caused unrelated issues. Do you anticipate there being any other options?
Adding -Pdependency.analysis.check-binary-compat=false worked for me as I'm facing the same issue. I'd prefer not to adjust my heap size as this has traditionally caused unrelated issues. Do you anticipate there being any other options?
Once someone sends me a heap dump, I can analyze it to see where I might be able to make performance improvements.
If the results of the performance tuning aren't fully satisfactory, I will likely make the new feature opt-in via the DSL.
Hi, we are facing this issue too; we were thinking about increasing our Xmx from 4G to 6G.
I've tried out -Pdependency.analysis.check-binary-compat=false; it seems to make matters a little better, but I still managed to provoke OOMs.
Unfortunately I'm unable to share a heap dump since the project is proprietary.
Maybe the output of gradle build -Pdependency.analysis.check-binary-compat=false will give at least a little insight?
Next I'll try to provoke it with 3.4.1.
Gradle output
> Task :<project>:explodeCodeSourceTest
Exception in thread "Daemon client event forwarder" java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:1668)
at java.base/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:460)
at org.gradle.launcher.daemon.server.exec.DaemonConnectionBackedEventConsumer$ForwardEvents.getNextEvent(DaemonConnectionBackedEventConsumer.java:72)
at org.gradle.launcher.daemon.server.exec.DaemonConnectionBackedEventConsumer$ForwardEvents.run(DaemonConnectionBackedEventConsumer.java:59)
> Task :<project>:computeActualUsageMain FAILED
> Task :<project>:synthesizeProjectViewTest
> Task :<project>:discoverDuplicationForCompileTest
> Task :<project>:explodeJarTest
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':<project>:computeActualUsageMain'.
> A failure occurred while executing com.autonomousapps.tasks.ComputeUsagesTask$ComputeUsagesAction
> Could not create an instance of type com.autonomousapps.tasks.ComputeUsagesTask$ComputeUsagesAction.
> Java heap space
@autonomousapps Yes, using -Pdependency.analysis.check-binary-compat=false it works. I would be glad to help, but unfortunately I am not allowed to share a heap dump.
@patrick-dedication and @akoufa I would be more than happy to sign an NDA if that would help work around the problem of sharing a proprietary heap dump. If this is something you might want to pursue, feel free to contact me using the contact info on my profile page, and we can discuss further.
Unfortunately, a stack trace isn't sufficient here. In parallel, I might work on a DSL option that makes this new feature opt-in, to unblock your upgrades, but I would really like to see what's causing the increased heap usage if at all possible.
@autonomousapps Quick update: I've now captured a few heap dumps, with and without binary compat enabled. I'm currently running them through MAT to see if I can spot something. If you have any specific questions, I'm happy to run the analysis and share the output. As far as the NDA goes, I don't even know who to ask...
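In case it helps anyone else trying to capture a dump, the JDK's own tooling can dump a running daemon on demand (assuming the analysis tasks run inside the daemon process rather than an isolated worker):

    jps -l                                   # find the Gradle daemon PID (org.gradle.launcher.daemon.bootstrap.GradleDaemon)
    jcmd <daemon-pid> GC.heap_dump /tmp/daemon-heap.hprof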
Seems to fail at synthesizeDependenciesIntegrationTest
> Task :<project>:explodeJarMain

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':<project>:synthesizeDependenciesIntegrationTest'.
> A failure occurred while executing com.autonomousapps.tasks.SynthesizeDependenciesTask$SynthesizeDependenciesWorkAction
   > Java heap space
With my limited knowledge, MAT doesn't really show anything obvious. One thing I noted is that there is always 1G of unreachable objects, which might be OK since they will be GCed eventually. I didn't have a deep look at the code, but might it be that findReflectiveAccesses creates too much memory pressure due to the nested loops? As far as I understand, I have 54 explodedJars.
Leak suspect analysis finds DefaultExternalModuleComponentGraphResolveState as the top suspect, taking up 23% (700M) of the heap.
This is interesting but unlikely to be related to the changes in 3.5.0.* I wonder if this is more of a death by 1000 cuts situation?
*is that 700M only present in the failing case?
As far as the NDA goes, I don't even know who to ask...
I'd start with your manager :D
Hi, I just had time to look at the dump without the binary compat check enabled. This dump also shows 700MB for DefaultExternalModuleComponentGraphResolveState. I think this is just the straw that broke the camel's back. Let me try to get a dump with version 3.4.1. We updated to Gradle 9 almost simultaneously, so this might be a regression in our setup or in Gradle too.
I'll try to work around the red tape :)
I just saw this comment. I feel like the GitHub UI was hiding it yesterday?
I also suspect too much object creation, particularly with the nested loops as you suggest. That algorithm... I wrote it to solve the problem, knowing it wasn't optimal, with the hope I could return to it later. I'll see if I can do anything about it, especially since I have good test coverage.
Quick update: we have been running with v3.4.1 for almost 24 hours without a repro. Since turning binary compat off doesn't seem to help a lot, I'd assume it's a change made in 3.5.0 that is outside of the path triggered by binary compat.
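For anyone wanting to do the same, reverting is just pinning the plugin version again; a sketch in the Kotlin DSL, assuming the plugin is applied via the plugins block with its usual id:

    // build.gradle.kts (root project, or wherever the plugin is applied)
    plugins {
        id("com.autonomousapps.dependency-analysis") version "3.4.1"
    }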
Since turning binary compat off doesn't seem to help a lot
Oh I'm confused. Several of the comments above said that the OOM issue went away when disabling the binary compat check. Is that not the case?
For me that is not the case :) See https://github.com/autonomousapps/dependency-analysis-gradle-plugin/issues/1604#issuecomment-3565839169. I reproduced it more than once with 3.5.0 with binary compat disabled.
I also had a try with -Ddependency.analysis.cache.max=0 because I saw the cache takes up >100MB in the dumps.
I didn't think that would fix it, but wanted to try it anyway; it didn't.
I'm thinking about what the next experiment could be.
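For reference, the cache experiment amounts to passing the system property mentioned above on the command line (or, equivalently for the build JVM, systemProp.dependency.analysis.cache.max=0 in gradle.properties):

    ./gradlew buildHealth -Ddependency.analysis.cache.max=0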
Re-enabled the cache and disabled binary-compat again, just to make sure we still get an OOM.
Reproduced it in the first pipeline with -Pdependency.analysis.check-binary-compat=false.