
java.lang.IllegalAccessError when running with Spark 2.4.0

Open · amorskoy opened this issue 5 years ago · 4 comments

Before building, I changed the Spark version in pom.xml (spark-submit was run in local[*] mode):

<spark.version>2.4.0</spark.version>
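For context, the submission was presumably along these lines (the application jar, main class, and Splash jar path below are placeholders; the shuffle-manager class name is taken from the stack trace that follows):

spark-submit \
  --master "local[*]" \
  --class com.example.MyApp \
  --jars /path/to/splash-shaded.jar \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.SplashShuffleManager \
  my-app.jar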

java.lang.IllegalAccessError: tried to access class org.apache.spark.shuffle.sort.ShuffleInMemorySorter from class org.apache.spark.shuffle.sort.SplashUnsafeSorter
    at org.apache.spark.shuffle.sort.SplashUnsafeSorter.<init>(SplashUnsafeSorter.scala:77)
    at org.apache.spark.shuffle.sort.SplashUnsafeShuffleWriter.<init>(SplashUnsafeShuffleWriter.scala:56)
    at org.apache.spark.shuffle.SplashShuffleManager.getWriter(SplashShuffleManager.scala:84)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:98)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
    at org.apache.spark.scheduler.Task.run(Task.scala:121)
    at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

amorskoy commented on Oct 01 '19 09:10

This error looks strange. These classes live in the same package. I checked the code of Spark 2.4.0: ShuffleInMemorySorter is a class with package-level visibility, so we should be able to access it. I have also run the tests with the following command:

mvn package -Dspark.version=2.4.0

It finishes successfully. Would you please try the same command and see if you have any issues accessing this class?
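For illustration only, a minimal sketch of the same-package access pattern in question; every name below is made up and does not come from Spark or Splash:

// Package-private access demo. All names are made up for illustration.
package org.example.shuffle {

  // Package-level visibility, analogous to Spark's ShuffleInMemorySorter.
  private[shuffle] class InMemorySorter {
    def insert(record: Long): Unit = println(s"inserted $record")
  }

  // Declared in the same package, so the package-private class is accessible.
  class UnsafeSorter {
    private val sorter = new InMemorySorter
    def add(record: Long): Unit = sorter.insert(record)
  }
}

object PackageAccessDemo extends App {
  new org.example.shuffle.UnsafeSorter().add(42L)
}

This compiles and runs cleanly, which matches the successful mvn package run. Note that at run time the JVM checks the runtime package, which includes the defining class loader, so an IllegalAccessError like the one reported can still surface if the two classes end up in different class loaders even though they share a package name.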

jealous commented on Oct 09 '19 10:10

Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x2a265ea9) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x2a265ea9
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
    at SimpleApp$.main(SimpleApp.scala:6)
    at SimpleApp.main(SimpleApp.scala)

The above error occurs while running a Spark application.
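This trace points at the JDK module system rather than at Splash: recent JDKs no longer export sun.nio.ch to the unnamed module, while Spark's StorageUtils touches sun.nio.ch.DirectBuffer. A commonly used workaround, sketched here with a placeholder application jar, is either to run on a JDK version supported by that Spark release or to export the package explicitly:

spark-submit \
  --master "local[*]" \
  --conf "spark.driver.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  my-app.jar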

gmcaps commented on Dec 23 '22 07:12

@gmcaps I am facing a similar issue as well; were you able to resolve it?

amareshb commented on Apr 07 '23 18:04

@gmcaps I was able to solve the issue you mentioned by setting the JAVA_HOME environment variable and adding it to PATH.
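For anyone hitting this later, a sketch of that environment change; the JDK path is a placeholder, and the point is that JAVA_HOME (and the java found on PATH) resolve to a JDK version supported by the Spark release in use:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk   # placeholder; point at a JDK your Spark version supports
export PATH="$JAVA_HOME/bin:$PATH"
java -version                                   # confirm the expected JDK is picked up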

amareshb commented on Apr 13 '23 17:04