Bump Kotlin versions to 2.0.0-Beta4
- Bumps KGP to 2.0.0-Beta4
- Bumps kotlin-compiler to 2.0.0-Beta4 (only K1)
Some context on the konanDistribution hack
The solution in #3147 uses an internal property from KGP with @Suppress("INVISIBLE_MEMBER").
In KGP <= 1.9.20 the type of this property is String (Project.konanHome: String), so everything works as expected.
In KGP 2.0.0 the type of the property was changed to File: https://github.com/JetBrains/kotlin/commit/7165d15d59f5c01083b8ad9a37f3da7799c5569d#diff-c0ac64e93aa7f4c033b27d586757c6493f31699ca28d0b5873a60d9693fefa2cL35-R38
This property is accessed only if the version of KGP in the user's project is <= 1.9.20. However, if we compile Dokka 2.0.0 against KGP 2.0.0 (in which konanHome has a different type) and run it with KGP <= 1.9.20, we'll get:
java.lang.NoSuchMethodError: org.jetbrains.kotlin.compilerRunner.NativeToolRunnersKt.getKonanHome(Lorg/gradle/api/Project;)Ljava/io/File;
at org.jetbrains.dokka.gradle.kotlin.KotlinNativeDistributionAccessor.<init>(KotlinNativeDistributionAccessor.kt:31)
This happens because Dokka was compiled against konanHome: File (2.0.0), but at runtime only konanHome: String (1.9.20) is present.
This hack adds reflection access to the expected 1.9.20 property (String). The code for alternativeKonanHome is taken from #3145 as a backup option - everything seems to work without it, but you never know.
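For reference, here's a minimal sketch of what such a reflection fallback can look like. This is an illustration rather than the exact Dokka code; it assumes the KGP <= 1.9.20 shape of the property, i.e. a static getKonanHome(Project): String on NativeToolRunnersKt (which matches the NoSuchMethodError above):

```kotlin
import org.gradle.api.Project
import java.io.File

// Illustration only: reflectively read the KGP <= 1.9.20 konanHome extension property,
// which compiles to a static getKonanHome(Project): String on NativeToolRunnersKt.
internal fun konanHomeViaReflection(project: Project): File? = try {
    val runnersKt = Class.forName("org.jetbrains.kotlin.compilerRunner.NativeToolRunnersKt")
    val getter = runnersKt.getMethod("getKonanHome", Project::class.java)
    (getter.invoke(null, project) as? String)?.let(::File)
} catch (e: ReflectiveOperationException) {
    // The property is missing or has a different signature (e.g. KGP 2.0.0's File variant):
    // fall back to alternativeKonanHome instead of crashing with NoSuchMethodError.
    null
}
```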
I've also added a project property org.jetbrains.dokka.classpath.excludeKonanPlatformDependencyFiles in case both solutions return null and start failing user builds. If a user is unable to update KGP to 2.0.0 to fix this problem, but for some reason needs to update Dokka to 2.0.0, they will still be able to generate documentation, albeit incomplete.
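Purely for illustration, this is roughly how such an opt-out property could be consulted in the plugin (a sketch with an assumed shape, not the actual Dokka code):

```kotlin
import org.gradle.api.Project

// Sketch: if both konanHome lookups return null, skip the K/N platform dependency files
// only when the user has explicitly opted in via the project property, instead of failing the build.
internal fun shouldExcludeKonanPlatformDependencyFiles(project: Project): Boolean =
    project.providers
        .gradleProperty("org.jetbrains.dokka.classpath.excludeKonanPlatformDependencyFiles")
        .map(String::toBoolean)
        .getOrElse(false)
```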
The integration tests are failing with:
Execution failed for task ':commonizeNativeDistribution'.
> Could not resolve all files for configuration ':kotlinNativeBundleConfiguration'.
> Could not find org.jetbrains.kotlin:kotlin-native-prebuilt:2.0.0-Beta4.
At first I thought that it was an incompatible change in Kotlin / KGP (https://youtrack.jetbrains.com/issue/KT-64181), but after manually checking it, I realized it could be specific to our integration tests, and indeed this configuration might be causing the failure:
https://github.com/Kotlin/dokka/blob/d9f1efe6d2bad585eba5de0b85d1ac2d9bd4c6ab/dokka-integration-tests/gradle/projects/template.settings.gradle.kts#L102-L130
@adam-enko could you please help here? It's a recent change from #3492, so hopefully it's still fresh in memory
Sure thing, it's a simple fix.
Explanation of the situation:
- KGP 1.x automatically added a custom Ivy repo for downloading K/N artifacts.
- template.settings.gradle.kts sets repositoriesMode.set(RepositoriesMode.PREFER_SETTINGS) (which is a good thing - preventing Gradle plugins from adding random repositories is more secure/stable/predictable), which blocked KGP from automatically adding the Ivy repo, so instead it had to be specified manually.
- In KGP 2.x the K/N artifacts are uploaded to Maven Central, so no custom repo is necessary.
- BUT the workaround I added used an exclusiveContent {} filter (to help Gradle performance, so it wouldn't unnecessarily try to search the Ivy repo for non-K/N content).
- So when KGP 2.0 tried to download K/N artifacts, Gradle would only look in the custom Ivy repo, which doesn't contain K/N 2.0 content.
I made a commit to just remove the exclusiveContent {} filter, so Gradle will look for K/N artifacts in either the Ivy repo or Maven Central.
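For context, here's a rough sketch of the problematic shape (placeholders, not the real template.settings.gradle.kts; the repo name, URL and filter are assumptions): with exclusiveContent {}, anything matched by the filter may only be resolved from the wrapped repository, so Maven Central was never consulted for kotlin-native-prebuilt:2.0.0-Beta4.

```kotlin
// settings.gradle.kts (sketch)
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.PREFER_SETTINGS)
    repositories {
        mavenCentral() // hosts K/N 2.x artifacts, but is ignored for anything the filter matches
        exclusiveContent {
            forRepository {
                ivy {
                    name = "KotlinNativeDistributions" // placeholder name
                    setUrl("https://download.jetbrains.com/kotlin/native/builds") // assumed K/N 1.x location
                    metadataSources { artifact() }
                }
            }
            // Placeholder filter: routes kotlin-native-prebuilt exclusively to the Ivy repo,
            // which only hosts K/N 1.x distributions - hence the Beta4 resolution failure.
            filter { includeModule("org.jetbrains.kotlin", "kotlin-native-prebuilt") }
        }
    }
}
```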
How important is the failing SequentialTasksExecutionStressTest?
https://teamcity.jetbrains.com/buildConfiguration/KotlinTools_Dokka_DokkaIntegrationTestsK2_virtual_Batch_1_1/4567095?buildTab=overview&hideTestsFromDependencies=false&hideProblemsFromDependencies=false&expandBuildTestsSection=true&expandBuildDeploymentsSection=false&expandBuildProblemsSection=true
Theoretically, it could mean that there is some kind of memory leak. But it could also just mean that more memory is needed. After a short investigation (with a profiler/memory graphs), I haven't found anything suspicious, so maybe some more in-depth checks should be performed.
I think it's certainly worth investigating why the test is failing so consistently, and for K2 only: it's relatively stable with K1 and in the master branch.
As Oleg has pointed out, we've had instances where updating the Kotlin version led to memory leaks, but we've also had situations where the new Kotlin version required just a bit more memory than the previous one, in which case the allocated memory should be increased. We just have to rule memory leaks out as the result of the investigation, because if there's a problem somewhere in Kotlin, it's worth reporting it upstream as it's likely to affect other Kotlin users too.
We'll try to take a look at it as part of this PR, so I guess we can halt the review.
I've done some additional checks on how much memory is used; here are some graphs:
Test with K2 analysis built with Kotlin 2.0.0-RC1: current PR; this is what is failing on CI, but works fine locally
Test with K2 analysis built with Kotlin 1.9.22: current master
Test with K1 analysis built with Kotlin 2.0.0-RC1: current PR, doesn't fail on CI
Test with K1 analysis built with Kotlin 1.9.22: current master
Note: the integration test was run locally with the same Gradle (8.7) and KGP (2.0.0-RC1) versions.
Overall, all the graphs look fine, and most likely there are no memory leaks. Still, K2 analysis uses slightly more metaspace memory (±60 MB), which is most likely why this test starts to fail at some point: we reinitialize the ClassLoader for Dokka execution in the Gradle plugin for every task execution. While it works fine locally (no crash), CI may have fewer resources available, so at some point the ClassLoader for Dokka execution cannot be loaded and the build fails with an OOM.
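To make the metaspace angle concrete, this is roughly the pattern described above (an illustrative sketch, not the actual Dokka Gradle plugin code; the entry-point class name is a placeholder):

```kotlin
import java.io.File
import java.net.URLClassLoader

// Sketch: each Dokka task execution builds a fresh, isolated ClassLoader for the Dokka
// runtime classpath. Class metadata loaded by previous executions stays in metaspace until
// those loaders become unreachable and are collected, so peak metaspace usage grows with
// the analysis footprint - and K2 analysis needs roughly 60 MB more of it than K1.
fun executeDokkaTask(runtimeClasspath: List<File>) {
    val urls = runtimeClasspath.map { it.toURI().toURL() }.toTypedArray()
    URLClassLoader(urls, /* parent = */ null).use { isolatedLoader ->
        val bootstrapClass = isolatedLoader.loadClass("org.example.DokkaBootstrap") // placeholder
        // ... instantiate the bootstrap reflectively and run the generation ...
    }
}
```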
I would propose increasing -XX:MaxMetaspaceSize in this specific test from 400 to 500 and seeing what happens.
If it works, then most likely there are no leaks.
Update: integration tests are green with MaxMetaspaceSize=500!
I've performed additional stress tests for this test (SequentialTasksExecutionStressTest) with an increased number of tasks (500 vs 100).
Here are some results:
Test with K2 analysis built with Kotlin 2.0.0-RC1: current PR
Test with K2 analysis built with Kotlin 1.9.22: current master
Test with K1 analysis built with Kotlin 2.0.0-RC1: current PR
Test with K1 analysis built with Kotlin 1.9.22: current master
Note: each test took ±30 minutes on my side. The heap increases in each test in a similar way: this PR behaves similarly to master for both K1 and K2. So I do think that we can safely merge this PR.
Additionally, this could mean one of two things:
- we do have some memory leak somewhere in Dokka (Engine or Gradle plugin)
- we don't have memory leaks and it's just normal behaviour for Gradle, as we do start 500 tasks.
For now, after some profiling (not just looking at memory graphs), I don't have any proof that we have a memory leak somewhere in Dokka. Still, it's possible, as profiling Dokka execution in Gradle across multiple tasks is not that informative, as there is a lot of Gradle machinery involved under the hood...
I agree.
Just writing down what's on my mind; I don't have any action planned:
This performance test also makes the CI tests slower (it shouldn't need to be run every time; scheduling it to run e.g. once a day should be fine), and it makes the CI tests much more sensitive to limited resources (I suspect that without the performance test we could probably just run all of the tests on one machine, without limiting parallelism or needing to batch the tests).
I think it'd make sense to create a separate subproject for profiling tests.