morphir-elm
Spark tests fail if run with Java 18
**Describe the bug**

If the installed Java version is Java 18 (e.g. `java -version` prints `openjdk version "18.0.2.1" 2022-08-18`), then running `gulp test` or `mill spark.test` (from `tests-integration`) makes the Spark tests fail with the error:

```
java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x4d5650ae) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x4d5650ae
```

This appears to be related to a change since Java 16 that enforces strong encapsulation of JDK internals. On Java 16 the command-line option `--illegal-access=permit` must be included explicitly for this code not to fail, and from Java 17 onwards that option was removed, so it always fails.
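On Java 17+, where that option no longer exists, the standard JDK mechanism is to export the offending internal package explicitly via `--add-exports`. The flag below follows directly from the error message above; whether `mill` or `gulp` forwards JVM options to the Spark test runner in this repo is an assumption, not something verified here.

```shell
# The error names the module and package involved ("module java.base
# does not export sun.nio.ch"), which maps directly onto a JDK
# --add-exports flag re-exporting it to code on the classpath.
JVM_FLAG="--add-exports=java.base/sun.nio.ch=ALL-UNNAMED"
echo "$JVM_FLAG"
```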
**To Reproduce**

Steps to reproduce the behavior:

1. Use a platform with Java version 16+.
2. Run `gulp test` (it's quicker to go to the `tests-integration` subdirectory and run `mill spark.test`).
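To check up front which major Java version is on the `PATH`, the `java -version` banner can be parsed; a small sketch, using the banner from this report as a captured sample so it runs without a JDK installed:

```shell
# Extract the major version number from a `java -version` banner.
# In practice the banner would come from: java -version 2>&1 | head -n 1
banner='openjdk version "18.0.2.1" 2022-08-18'
major=$(echo "$banner" | sed -E 's/.*version "([0-9]+).*/\1/')
echo "$major"
```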
**Expected behavior**

The tests are expected to pass without printing an error and stack trace.
**Desktop (please complete the following information):**

- OS: Debian GNU/Linux 11 (bullseye)
**Additional context**

I have tried to solve this by using newer versions of Spark: editing `tests-integration/build.sc` to set the `spark-core` and `spark-sql` versions to `3.3.0` instead of `3.2.1`. I still got the error with those versions set, so it's possible that other imported libraries are also accessing internal classes they shouldn't.
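The version bump just described can be scripted; the sketch below applies it to a stand-in file, since the dependency line shown is illustrative and not copied from the real `tests-integration/build.sc`:

```shell
# Bump the Spark version 3.2.1 -> 3.3.0 in a build file, using a
# temporary stand-in file so the snippet is self-contained.
f=$(mktemp)
printf 'ivy"org.apache.spark::spark-sql:3.2.1"\n' > "$f"
sed -i 's/3\.2\.1/3.3.0/g' "$f"
bumped=$(cat "$f")
echo "$bumped"
rm -f "$f"
```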
I could solve it by specifying that a different Java runtime be used. On my Debian-based system, Java runtimes are installed under `/usr/lib/jvm`, with a Java 11 JRE at `/usr/lib/jvm/java-11-openjdk-amd64`. Setting the environment variable `JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64` forces Java 11 to be used instead of the default.
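That workaround, as shell commands (the JVM path is the Debian one from this report; other distros install runtimes under different paths):

```shell
# Point JAVA_HOME at the Java 11 runtime and put its bin directory
# first on PATH, so build tools pick it up instead of Java 18.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
echo "$JAVA_HOME"
```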