Very high augmentation times with Quarkus 3.27 and Google libraries
Describe the bug
Quarkus is used in Keycloak, and Keycloak 26.4.x uses Quarkus 3.27. This version shows very high augmentation times that were never noticed before.
Here is a comparison
In this case, with only com.google.cloud:google-cloud-pubsub included as a dependency, server boot-up times are as follows:
start - 2025-11-19T04:44:07.515633012Z end - 2025-11-19T04:45:08.576030679Z
And Quarkus augmentation completed in 54652ms
Without this library included
start - 2025-11-19T04:22:18.029449006Z end - 2025-11-19T04:22:01.033748387Z
And Quarkus augmentation completed in 10788ms
This was tried with Pub/Sub library version 1.137.0 and also with the latest versions of the library.
The originally reported issue is here: https://github.com/keycloak/keycloak/issues/44310
Expected behavior
Quarkus augmentation should return to its previous levels with Google libraries included, and boot-up time should drop accordingly.
Actual behavior
Extremely slow augmentation times
How to Reproduce?
The originally reported issue is here:
https://github.com/keycloak/keycloak/issues/44310
Output of uname -a or ver
No response
Output of java -version
21.0.9
Quarkus version or git rev
3.27
Build tool (ie. output of mvnw --version or gradlew --version)
No response
Additional information
No response
/cc @Sanne (core), @aloubyansky (core), @gsmet (core), @radcortez (core)
I can not reproduce this in a simple Quarkus application
@geoand just in case, have you tried re-augmenting a mutable-jar? I think this is how it got noticed.
@vinod-afero could you provide a detailed way to reproduce this? Otherwise we will be shooting in the dark. Thanks!
> @geoand just in case, have you tried re-augmenting a mutable-jar? I think this is how it got noticed.
That's correct
I can not reproduce this in a simple Quarkus application
> I can not reproduce this in a simple Quarkus application

> @vinod-afero could you provide a detailed way to reproduce this? Otherwise we will be shooting in the dark. Thanks!
To re-clarify, this is part of a Keycloak implementation, and the latest version of Keycloak (26.4.x) has upgraded Quarkus to 3.27. This issue was not noticed in prior versions of Keycloak (for example Keycloak 26.3.x, which uses Quarkus 3.20.3), as mentioned in the original issue: https://github.com/keycloak/keycloak/issues/44310.
I can try to elaborate a bit. We have some custom SPIs, one of which uses the Google Pub/Sub libraries, and this was observed with only this SPI deployed. The way to reproduce this would probably be:
- Get the latest version of Keycloak (26.4.5 as we speak).
- Implement and deploy a custom Keycloak SPI module with the following runtime Gradle dependency and deploy it (we deploy it here as a fat jar); a minimal provider sketch is shown after the dependency tree below.
dependencies {
    implementation "com.google.cloud:google-cloud-pubsub:${pubSubVersion}"
}
This gives the following Gradle runtime dependency tree for this module:
\--- com.google.cloud:google-cloud-pubsub:1.137.0
+--- io.grpc:grpc-api:1.69.0
+--- io.grpc:grpc-stub:1.69.0
+--- io.grpc:grpc-protobuf:1.69.0
+--- io.grpc:grpc-protobuf-lite:1.69.0
+--- com.google.api:api-common:2.44.0
+--- com.google.j2objc:j2objc-annotations:3.0.0
+--- com.google.protobuf:protobuf-java:3.25.5
+--- com.google.api.grpc:proto-google-common-protos:2.52.0
+--- com.google.auth:google-auth-library-oauth2-http:1.32.1
+--- com.google.auth:google-auth-library-credentials:1.32.1
+--- com.google.http-client:google-http-client-gson:1.46.1
+--- com.google.api.grpc:proto-google-cloud-pubsub-v1:1.119.0
| +--- com.google.auto.value:auto-value-annotations:1.11.0
| +--- com.google.code.findbugs:jsr305:3.0.2
| +--- javax.annotation:javax.annotation-api:1.3.2
| +--- com.google.errorprone:error_prone_annotations:2.36.0
| +--- com.google.j2objc:j2objc-annotations:3.0.0
| +--- com.google.guava:failureaccess:1.0.2
| +--- com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
| \--- org.checkerframework:checker-qual:3.49.0
+--- com.google.api.grpc:proto-google-iam-v1:1.47.0
+--- com.google.guava:guava:33.4.0-jre
+--- com.google.guava:failureaccess:1.0.2
+--- com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
+--- org.checkerframework:checker-qual:3.49.0
+--- com.google.api:gax:2.61.0
+--- com.google.api:gax-grpc:2.61.0
+--- io.grpc:grpc-inprocess:1.69.0
+--- io.grpc:grpc-core:1.69.0
+--- com.google.android:annotations:4.1.1.4
+--- org.codehaus.mojo:animal-sniffer-annotations:1.24
+--- io.grpc:grpc-alts:1.69.0
+--- io.grpc:grpc-grpclb:1.69.0
+--- org.conscrypt:conscrypt-openjdk-uber:2.5.2
+--- io.grpc:grpc-auth:1.69.0
+--- io.grpc:grpc-netty-shaded:1.69.0
+--- io.grpc:grpc-util:1.69.0
+--- io.perfmark:perfmark-api:0.27.0
+--- io.grpc:grpc-googleapis:1.69.0
+--- io.grpc:grpc-xds:1.69.0
+--- io.grpc:grpc-services:1.69.0
+--- com.google.re2j:re2j:1.7
+--- com.google.api:gax-httpjson:2.61.0
+--- com.google.code.gson:gson:2.12.1
+--- com.google.protobuf:protobuf-java-util:3.25.5
+--- org.threeten:threetenbp:1.7.0
+--- com.google.code.findbugs:jsr305:3.0.2
+--- io.opencensus:opencensus-api:0.31.1
+--- io.grpc:grpc-context:1.69.0
+--- com.google.errorprone:error_prone_annotations:2.36.0
+--- com.google.auto.value:auto-value-annotations:1.11.0
+--- com.google.http-client:google-http-client:1.46.1
+--- org.apache.httpcomponents:httpclient:4.5.14
+--- commons-codec:commons-codec:1.18.0
+--- org.apache.httpcomponents:httpcore:4.4.16
+--- io.opencensus:opencensus-contrib-http-util:0.31.1
+--- io.opentelemetry:opentelemetry-api:1.47.0
+--- io.opentelemetry:opentelemetry-context:1.47.0
+--- io.opentelemetry:opentelemetry-semconv:1.26.0-alpha
\--- javax.annotation:javax.annotation-api:1.3.2
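For reference, a provider along these lines is enough to pull google-cloud-pubsub and its transitive dependencies into the Keycloak providers directory. This is only a minimal sketch with made-up class, project and topic names (not the actual SPI from this report); the matching EventListenerProviderFactory and META-INF/services registration are omitted.

```java
package com.example.keycloak.pubsub;

import java.io.IOException;

import org.keycloak.events.Event;
import org.keycloak.events.EventListenerProvider;
import org.keycloak.events.admin.AdminEvent;

import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

// Hypothetical event listener that publishes Keycloak events to a Pub/Sub topic.
public class PubSubEventListenerProvider implements EventListenerProvider {

    private final Publisher publisher;

    public PubSubEventListenerProvider(String projectId, String topicId) {
        try {
            this.publisher = Publisher.newBuilder(TopicName.of(projectId, topicId)).build();
        } catch (IOException e) {
            throw new RuntimeException("Unable to create Pub/Sub publisher", e);
        }
    }

    @Override
    public void onEvent(Event event) {
        // Publish the user event type as a simple text payload.
        publisher.publish(PubsubMessage.newBuilder()
                .setData(ByteString.copyFromUtf8(event.getType().toString()))
                .build());
    }

    @Override
    public void onEvent(AdminEvent event, boolean includeRepresentation) {
        // Publish the admin operation type.
        publisher.publish(PubsubMessage.newBuilder()
                .setData(ByteString.copyFromUtf8(event.getOperationType().toString()))
                .build());
    }

    @Override
    public void close() {
        publisher.shutdown();
    }
}
```

The important part for this issue is not what the provider does, but that the deployed provider jar puts the whole Google/gRPC dependency tree above on the augmentation classpath.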
Are you saying that the provider jar contains google-cloud-pubsub and all of its transitive dependencies? How large is that jar?
> Are you saying that the provider jar contains google-cloud-pubsub and all of its transitive dependencies? How large is that jar?
Yes, it was a fat jar with dependencies included, ~40 MB. But even using a non-fat distribution results in the same slow augmentation times, in fact slower:
keycloak 2025-11-23 16:31:55,676 INFO [io.quarkus.deployment.QuarkusAugmentor] (main) Quarkus augmentation completed in 88406ms
and gave this warning:
keycloak 2025-11-23 16:30:31,648 WARN [io.quarkus.arc.deployment.SplitPackageProcessor] (build-57) Detected a split package usage which is considered a bad practice and should be avoided. Following packages were […]
keycloak - "com.google.longrunning" found in [/opt/keycloak/lib/../providers/gax-grpc-2.61.0.jar, /opt/keycloak/lib/../providers/proto-google-common-protos-2.52.0.jar]
Keycloak 26.3.x, which uses quarkus.version 3.20.3, doesn't have this issue with the same deployment:
keycloak 2025-11-23 16:53:04,090 INFO [io.quarkus.deployment.QuarkusAugmentor] (main) Quarkus augmentation completed in 8541ms
@vinod-afero Could you pls try to collect the build metrics of your app, i.e. with -Dquarkus.debug.dump-build-metrics=true? See https://github.com/quarkusio/quarkus/blob/main/TROUBLESHOOTING.md#analysing-build-steps-execution-time for more info about configuration.
You can also use https://github.com/mkouba/quarkus-build-report to generate a simple HTML report from the JSON file.
@mkouba: This is the report from KC 26.3
This is the report from KC 26.4 (the slow one): report.24.4.html
The file names are wrong, please ignore them.
So apparently the problem is in io.quarkus.deployment.steps.ReflectiveHierarchyStep#build. This build step takes 101096 ms and if you look at the Build Steps Concurrent Execution Chart, you can see that it blocks any concurrent execution.
In order to identify the real issue we'll need to profile this build step. But my guess would be that something registers too many ReflectiveHierarchyBuildItems for some complex class hierarchies.
FYI I've updated the quarkus-build-report to include the number of build items in the report.
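To make that guess concrete: a ReflectiveHierarchyBuildItem is what a build step produces when it wants an entire type hierarchy (superclasses, fields, type parameters) registered for reflection, and ReflectiveHierarchyStep#build is the step that walks all of them. The sketch below shows the general shape of such a producer; the processor class is hypothetical and the builder calls are written from memory, so treat it as an illustration of the mechanism rather than actual Quarkus 3.27 code.

```java
import org.jboss.jandex.DotName;

import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.builditem.CombinedIndexBuildItem;
import io.quarkus.deployment.builditem.nativeimage.ReflectiveHierarchyBuildItem;

// Illustration only: a build step that asks for whole hierarchies to be registered.
class ExampleReflectionProcessor {

    @BuildStep
    void registerForReflection(CombinedIndexBuildItem index,
            BuildProducer<ReflectiveHierarchyBuildItem> reflectiveHierarchy) {
        // Hypothetical marker type; extensions typically do this for every type that
        // shows up in REST endpoint signatures, serialized payloads, etc.
        DotName marker = DotName.createSimple("com.example.Payload");
        index.getIndex().getAllKnownImplementors(marker).forEach(classInfo ->
                reflectiveHierarchy.produce(
                        // Builder API approximated from memory; each produced item triggers
                        // a recursive walk of the class, its superclasses and field types.
                        ReflectiveHierarchyBuildItem.builder(classInfo.name())
                                .index(index.getIndex())
                                .source("ExampleReflectionProcessor")
                                .build()));
    }
}
```

If the registered roots are protobuf/gRPC generated classes (as in the Pub/Sub dependency tree above), each item can expand into a large graph, which would match the long single-threaded stretch visible in the report.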
> The way to reproduce this would probably be:
> - Get the latest version of Keycloak (26.4.5 as we speak).
> - Implement and deploy a custom Keycloak SPI module with the following runtime Gradle dependency and deploy it (we deploy it here as a fat jar).
@vinod-afero Could you pls try to prepare a simple reproducer app in a github repo? I have zero knowledge of keycloak 🤷.
> I can not reproduce this in a simple Quarkus application

> The way to reproduce this would probably be:
> - Get the latest version of Keycloak (26.4.5 as we speak).
> - Implement and deploy a custom Keycloak SPI module with the following runtime Gradle dependency and deploy it (we deploy it here as a fat jar).

> @vinod-afero Could you pls try to prepare a simple reproducer app in a github repo? I have zero knowledge of keycloak 🤷.
I think it needs a KC environment. Ref. the previous remark by geoand.
@vinod-afero please provide us a zip containing a project reproducing the issue. Otherwise we are shooting in the dark. Thanks!
@gsmet: If I share a Keycloak project, would you be in a position to build and try it locally? It needs some familiarity with Keycloak (and a Keycloak environment).
Yes that + some instructions on how to build and we will be able to reproduce.
@vinod-afero @mkouba shouldn't similar conditions be created with one of the Quarkus quickstarts and setting the quarkus.package.jar.user-providers-directory to the one with the uber jar?
Keycloak is not really doing much beyond basic augmentation support.
> @vinod-afero @mkouba shouldn't similar conditions be created with one of the Quarkus quickstarts and setting the quarkus.package.jar.user-providers-directory to the one with the uber jar?
> Keycloak is not really doing much beyond basic augmentation support.
Well, we need to find the build step that produces the problematic ReflectiveHierarchyBuildItems. I have no idea what quarkus.package.jar.user-providers-directory does but if you find a way to reproduce the problem with a quickstart we can definitely take a look...
@mkouba @gsmet
Since io.quarkus.deployment.steps.ReflectiveHierarchyStep#build is the culprit, could you point me to what could potentially cause delays in this step? I have a fair bit of code base, so narrowing it down would be useful (I can give you a zip based on that).
So far I've tried commenting out some annotations and static blocks.
> Since io.quarkus.deployment.steps.ReflectiveHierarchyStep#build is the culprit, could you point me to what could potentially cause delays in this step? I have a fair bit of code base, so narrowing it down would be useful (I can give you a zip based on that).
> So far I've tried commenting out some annotations and static blocks.
You can try to run a profiler; a flamegraph would be very useful. You can take a look at https://github.com/quarkusio/quarkus/blob/main/TROUBLESHOOTING.md#profiling-application-dev-mode-with-async-profiler to see how to enable profiling in the dev mode with async-profiler. Alternatively, just use any other tool (such as VisualVM).
@vinod-afero BTW as mentioned in https://github.com/quarkusio/quarkus/issues/51120#issuecomment-3574190278 the report now includes the number of build items - so I would start here, i.e. what is the number of produced ReflectiveHierarchyBuildItems?
> @vinod-afero BTW as mentioned in #51120 (comment) the report now includes the number of build items - so I would start here, i.e. what is the number of produced ReflectiveHierarchyBuildItems?
@mkouba: I can't see the build items section (I guess you might not have committed the template into your repo).
Here are the top results from the latest grab:
{
  "count": 12529,
  "class": "io.quarkus.deployment.builditem.nativeimage.ReflectiveClassBuildItem"
},
{
  "count": 1470,
  "class": "io.quarkus.deployment.builditem.nativeimage.ReflectiveHierarchyBuildItem"
},
{
  "count": 1207,
  "class": "io.quarkus.deployment.builditem.GeneratedClassBuildItem"
},
...
"itemsCount": 16625
> @mkouba: I can't see the build items section (I guess you might not have committed the template into your repo).
It's there - https://github.com/mkouba/quarkus-build-report/commit/49ef947b091bf331a488d75d9d0bbded465b74c4 - but you need to rebase/sync the repo.
> Here are the top results from the latest grab:
So there are 1470 ReflectiveHierarchyBuildItems. The number itself does not look that high but ofc if the hierarchy is complex it could take time.
I wonder if we should improve our build metrics to include some information about the producing build step etc.
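To illustrate why 1470 items can still be expensive even though the count looks modest: each ReflectiveHierarchyBuildItem fans out into the whole superclass/field-type graph of the registered type, so deeply nested generated types (protobuf or gRPC messages, for example) multiply into many visited classes. The snippet below is not the actual ReflectiveHierarchyStep code (which works on the Jandex index and applies ignore rules); it is just a simplified, plain-reflection illustration of that fan-out.

```java
import java.lang.reflect.Field;
import java.util.HashSet;
import java.util.Set;

// Simplified model of the work: one registered root type pulls in its whole
// superclass/interface/field-type graph.
public class HierarchyFanOut {

    public static Set<Class<?>> walk(Class<?> root) {
        Set<Class<?>> seen = new HashSet<>();
        walk(root, seen);
        return seen;
    }

    private static void walk(Class<?> type, Set<Class<?>> seen) {
        if (type == null || type == Object.class || type.isPrimitive() || !seen.add(type)) {
            return; // nothing to do, or already visited
        }
        walk(type.getSuperclass(), seen);
        for (Class<?> iface : type.getInterfaces()) {
            walk(iface, seen);
        }
        for (Field field : type.getDeclaredFields()) {
            walk(field.getType(), seen);
        }
    }

    public static void main(String[] args) {
        // A protobuf-generated message class would reach far more types; String is
        // used here only so the example runs standalone.
        System.out.println(walk(String.class).size() + " classes reachable from String");
    }
}
```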
@vinod4dev do you have keycloak preview features enabled, or http optimized serializers enabled specifically?
If so that also sets quarkus.rest.jackson.optimization.enable-reflection-free-serializers=true on the latest keycloak releases. Not sure if that's related here.
> @mkouba: I can't see the build items section (I guess you might not have committed the template into your repo).
> It's there - mkouba/quarkus-build-report@49ef947 - but you need to rebase/sync the repo.
Ah, you're right. Wrong commit 🤦 .
@vinod-afero You can try to build your app with https://github.com/quarkusio/quarkus/pull/51396 and compare the number of produced ReflectiveHierarchyBuildItems, or even the top producers of this build item in the Dev UI.
> @vinod4dev do you have keycloak preview features enabled, or http optimized serializers enabled specifically?
> If so that also sets quarkus.rest.jackson.optimization.enable-reflection-free-serializers=true on the latest keycloak releases. Not sure if that's related here.
@shawkins: No preview features enabled, and using default options.
> @vinod-afero You can try to build your app with #51396 and compare the number of produced ReflectiveHierarchyBuildItems, or even the top producers of this build item in the Dev UI.
Aahh... looks like I will have to make a custom build of KC 26 to try this out?
> Aahh... looks like I will have to make a custom build of KC 26 to try this out?
I have no idea how Keycloak is built 🤷