Class not found exception

Open · guptanitin15 opened this issue on Jul 05 '16 · 5 comments

Hi there,

While running my Spark HBase application, I get the exception below:

Exception in thread "main" java.lang.NoClassDefFoundError: it/nerdammer/spark/hbase/package$
    at SparkHBaseExample$.main(SparkHBaseExample.scala:29)
    at SparkHBaseExample.main(SparkHBaseExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: it.nerdammer.spark.hbase.package$
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

I have added import it.nerdammer.spark.hbase._ to my Scala code, and I have added libraryDependencies += "it.nerdammer.bigdata" % "spark-hbase-connector_2.10" % "1.0.3" to my build.sbt.
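
For reference, a minimal build.sbt consistent with that dependency might look like the sketch below; the project name, Spark version, and "provided" scope are assumptions taken from later in this thread, and the key point is that scalaVersion must match the _2.10 binary suffix of the connector artifact:

name := "spark-hbase-example"   // hypothetical project name

scalaVersion := "2.10.6"        // must match the _2.10 suffix of the connector below

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

libraryDependencies += "it.nerdammer.bigdata" % "spark-hbase-connector_2.10" % "1.0.3"

Note that "provided" dependencies are on the classpath at compile time but are not packaged with the application, which is how a build can compile cleanly and still be missing classes at launch.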

guptanitin15 avatar Jul 05 '16 05:07 guptanitin15

It seems like a project configuration error: once you import the library, it should be on the classpath.

Maybe your IDE is causing the problem. Try to compile and launch the application through sbt to see if the problem persists.
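
For example, a minimal check from the project root (assuming a standard sbt layout with a single main class):

sbt clean compile
sbt run

If sbt run succeeds but spark-submit fails, the jar handed to spark-submit is likely missing the connector classes rather than the code being wrong.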

nicolaferraro avatar Jul 05 '16 08:07 nicolaferraro

Hi, I am getting the same error. The sbt-based compilation went through successfully. Please help me solve this.

16/08/05 06:52:31 ERROR Executor: Exception in task 0.0 in stage 3.0 (TID 71)
java.lang.NoClassDefFoundError: it/nerdammer/spark/hbase/package$
    at testPartition$$anonfun$main$1$$anonfun$apply$1.apply(testPartition.scala:33)
    at testPartition$$anonfun$main$1$$anonfun$apply$1.apply(testPartition.scala:29)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: it.nerdammer.spark.hbase.package$
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

balaji-krishnan72 avatar Aug 05 '16 11:08 balaji-krishnan72

This seems to happen when the application is running. Can you provide details on the launch environment and the scripts/arguments you used?

nicolaferraro avatar Aug 05 '16 12:08 nicolaferraro

@balaji-krishnan72 you can bundle all libraries (except spark-core) into your resulting jar; I had the same issue today. This can be done with the sbt-assembly plugin, for example:

In project/plugins.sbt add:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

And in build.sbt:

//....
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

// Note: the connector's _2.10 binary suffix must match scalaVersion; with 2.11.8
// above, use a _2.11 build of the connector if one is available, or set
// scalaVersion to 2.10.x, otherwise the jars are binary-incompatible.
libraryDependencies += "it.nerdammer.bigdata" % "spark-hbase-connector_2.10" % "1.0.3"

assemblyMergeStrategy in assembly := {
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "google", xs @ _*) => MergeStrategy.last
  case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
  case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
  case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
  case "about.html" => MergeStrategy.rename
  case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
  case "META-INF/mailcap" => MergeStrategy.last
  case "META-INF/mimetypes.default" => MergeStrategy.last
  case "plugin.properties" => MergeStrategy.last
  case "log4j.properties" => MergeStrategy.last
  case x =>
    // fall back to the plugin's default strategy for everything else
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
//....

Then package your jar with sbt assembly.
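
For example (the main class and jar path below are placeholders; sbt assembly prints the actual output path when it finishes):

sbt assembly
spark-submit --class SparkHBaseExample target/scala-2.11/myapp-assembly-1.0.jar

Because the connector is bundled inside the assembled jar, no extra classpath arguments are needed on the driver or the executors.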

theres avatar Aug 18 '16 09:08 theres

I'm having the same issue. It compiles fine, then gives the error when run via spark-submit. Nothing is passed to spark-submit besides the jar file:

spark-submit /path/to/jar/application.jar
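
A sketch of a possible workaround in this situation, assuming the connector is the only dependency missing from application.jar: let spark-submit resolve it by its Maven coordinates (the ones quoted earlier in the thread), so both the driver and the executors receive the classes:

spark-submit --packages it.nerdammer.bigdata:spark-hbase-connector_2.10:1.0.3 /path/to/jar/application.jar

The --packages option fetches the given coordinates (and their transitive dependencies) from Maven Central and adds them to the driver and executor classpaths.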

mstrofbass avatar Nov 27 '17 17:11 mstrofbass