
java.lang.ClassNotFoundException: scala.Product$class

Open · barel166 opened this issue 4 years ago · 4 comments

I'm running spark locally. Spark v3.0.1 with Scala 2.12.10. I've also made sure that my local version of scala is 2.12. Does the RestDataSource package need to be recompiled for Scala 2.12?

I ran: spark-shell --jars spark-datasource-rest_2.11-2.1.0-SNAPSHOT.jar --packages org.scalaj:scalaj-http_2.12:2.3.0

This is the line that yields the error:

    scala> val weatherDF = spark.read.format("org.apache.dsext.spark.datasource.rest.RestDataSource").options(parmg).load()
    java.lang.NoClassDefFoundError: scala/Product$class
      at org.apache.dsext.spark.datasource.rest.RESTRelation.<init>(RestRelation.scala:41)
      at org.apache.dsext.spark.datasource.rest.RestDataSource.createRelation(RestDataSource.scala:42)
      at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:344)
      at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:297)
      at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:286)
      at scala.Option.getOrElse(Option.scala:189)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:286)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:221)
      ... 47 elided
    Caused by: java.lang.ClassNotFoundException: scala.Product$class
      at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
      at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
      at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
      ... 55 more
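Editor's note (context not stated in the thread itself): `scala.Product$class` is part of the Scala 2.11 trait encoding, which was removed in Scala 2.12, so this error is the classic symptom of loading a jar compiled for 2.11 into a 2.12 runtime. The Scala binary version a jar targets is conventionally encoded in the `_<version>` suffix of its artifact name, which can be checked with a quick one-liner (jar name taken from the command above):

```shell
jar="spark-datasource-rest_2.11-2.1.0-SNAPSHOT.jar"
# Pull out the "_<scala.binary.version>" suffix from the artifact name.
echo "$jar" | sed -E 's/.*_([0-9]+\.[0-9]+)-.*/\1/'
# → 2.11, which does not match the Scala 2.12 that Spark 3.0.1 ships with.
```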

barel166 · Jan 06 '21

@barel166 Yes. This code base has not been updated for the latest Spark and Scala versions in a while.

sourav-mazumder · Jan 06 '21

@sourav-mazumder What is the maximum Spark/Scala version that the code base is compatible with?

noodlemind · Feb 14 '21

With the following updates to pom.xml (in the Data-Science-Extensions folder) and a rebuild, the plugin works fine with Spark 3.0.1 (Scala 2.12.10), verified with the spark-shell (Scala) example provided by Sourav:

1. Bump the scalatest test dependency:

        <dependency>
          <groupId>org.scalatest</groupId>
          <artifactId>scalatest_${scala.binary.version}</artifactId>
          <version>3.0.1</version>
          <scope>test</scope>
        </dependency>

2. Bump the scalacheck test dependency:

        <dependency>
          <groupId>org.scalacheck</groupId>
          <artifactId>scalacheck_${scala.binary.version}</artifactId>
          <version>1.15.2</version> <!-- 1.13.0 appears incompatible with scalatest 2.2.6 -->
          <scope>test</scope>
        </dependency>

3. Update the general project dependency versions in `<properties>`:

        <java.version>1.7</java.version>
        <scala.version>2.12.10</scala.version>
        <scala.binary.version>2.12</scala.binary.version>
        <maven.version>3.3.9</maven.version>
        <slf4j.version>1.7.16</slf4j.version>
        <log4j.version>1.2.17</log4j.version>
        <spark.version>3.0.1</spark.version>

4. Update the scala-2.12 profile (activated when the scala-2.10 property is not set):

        <profile>
          <id>scala-2.12</id>
          <activation>
            <property>
              <name>!scala-2.10</name>
            </property>
          </activation>
          <properties>
            <scala.version>2.12.10</scala.version>
            <scala.binary.version>2.12</scala.binary.version>
          </properties>
        </profile>

Attached is the pom.xml (in the Data-Science-Extensions folder) with the full changes: pom.xml.txt
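Editor's note: the rebuild itself can be sketched as below. The module directory and the rebuilt artifact's name are assumptions inferred from the jar used earlier in this thread, not something the poster confirmed:

```shell
cd Data-Science-Extensions/spark-datasource-rest
# Rebuild against the Scala 2.12 settings patched into pom.xml
mvn clean package -DskipTests
# If the pom changes took effect, the rebuilt jar should carry a _2.12 suffix
spark-shell --jars target/spark-datasource-rest_2.12-2.1.0-SNAPSHOT.jar \
  --packages org.scalaj:scalaj-http_2.12:2.3.0
```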

guruRockz · Feb 16 '21

Hmm, the pom.xml file doesn't seem to work anymore.

jonahkaplan1 · Sep 15 '21