mleap
java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)
Using Scala version 2.12.10; started the shell as spark-shell --packages ml.combust.mleap:mleap-spark-extension_2.12:0.15.0.
Running the export demo from https://mleap-docs.combust.ml/demos/:
scala> val transform = pipeline.transform(dataframe)
transform: org.apache.spark.sql.DataFrame = [test_string: string, test_double: double ... 2 more fields]
scala> val sbc = SparkBundleContext().withDataset(transform)
java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at ml.combust.bundle.BundleRegistry.<init>(BundleRegistry.scala:73)
at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:73)
at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:44)
at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:28)
at org.apache.spark.ml.bundle.SparkBundleContext$.apply(SparkBundleContext.scala:21)
at org.apache.spark.ml.bundle.SparkBundleContext$.apply(SparkBundleContext.scala:17)
... 57 elided
scala> spark.sparkContext.listJars.foreach(println)
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark-base_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-base_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark-extension_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.thesamet.scalapb_lenses_2.12-0.7.0-test2.jar
spark://intul178723677:62567/jars/com.lihaoyi_fastparse_2.12-1.0.0.jar
spark://intul178723677:62567/jars/com.lihaoyi_fastparse-utils_2.12-1.0.0.jar
spark://intul178723677:62567/jars/com.typesafe_config-1.3.0.jar
spark://intul178723677:62567/jars/ml.combust.bundle_bundle-ml_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.google.protobuf_protobuf-java-3.5.1.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-core_2.12-0.15.0.jar
spark://intul178723677:62567/jars/io.spray_spray-json_2.12-1.3.2.jar
spark://intul178723677:62567/jars/com.thesamet.scalapb_scalapb-runtime_2.12-0.7.1.jar
spark://intul178723677:62567/jars/com.jsuereth_scala-arm_2.12-2.0.jar
spark://intul178723677:62567/jars/com.github.rwl_jtransforms-2.4.0.jar
spark://intul178723677:62567/jars/org.scala-lang_scala-reflect-2.12.8.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-runtime_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.bundle_bundle-hdfs_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.lihaoyi_sourcecode_2.12-0.1.4.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-tensor_2.12-0.15.0.jar
spark://intul178723677:62567/jars/commons-io_commons-io-2.5.jar
I'm troubled by this problem too. Have you solved it?
Does it work with Scala 2.11?
@felixgao @woneway just wanted to double check, which Spark version are you trying this with? Looking at https://spark.apache.org/releases/spark-release-2-4-0.html, it would need to be spark 2.4.0 or later.
If that's the case, can I please ask you to try 0.16.0-SNAPSHOT? We've recently updated from Scala 2.12.8 to 2.12.10 in that version. Thank you!
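As a quick way to verify which Scala runtime actually ends up on the driver classpath, a small probe can read the library.properties resource that scala-library ships (the class and method names below are my own, purely illustrative):

```java
import java.io.InputStream;
import java.util.Properties;

public class ScalaVersionCheck {
    // scala-library ships a library.properties resource carrying its exact
    // version; reading it reveals which Scala runtime is on the classpath.
    static String detectScalaVersion() throws Exception {
        InputStream in = ScalaVersionCheck.class.getResourceAsStream("/library.properties");
        if (in == null) return null; // no scala-library visible at all
        Properties props = new Properties();
        props.load(in);
        return props.getProperty("version.number");
    }

    public static void main(String[] args) throws Exception {
        String v = detectScalaVersion();
        System.out.println(v == null
            ? "no scala-library on the classpath"
            : "scala-library version: " + v);
    }
}
```

If the reported version's 2.x line doesn't match the _2.12 suffix of the mleap jars, that would explain the scala.Product.$init$ NoSuchMethodError.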
I am setting up a project using Spline with Spark, with the dependencies below:
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>
That is how my pom.xml looks.
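A setup like this typically breaks when the Scala binary-version suffixes of the artifacts on the classpath disagree (e.g. _2.11 Spark modules mixed with _2.12 builds of other libraries). As an illustrative sanity check, one can scan artifact names for their suffix (this helper is my own sketch, not part of Spline or Spark):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ScalaSuffixCheck {
    // Extract the _2.xx Scala binary-version suffix from artifact names and
    // collect the distinct values; more than one distinct suffix is a classic
    // cause of NoSuchMethodError / NoClassDefFoundError at runtime.
    static Set<String> scalaSuffixes(List<String> artifacts) {
        Pattern p = Pattern.compile("_(2\\.\\d+)");
        Set<String> suffixes = new TreeSet<>();
        for (String a : artifacts) {
            Matcher m = p.matcher(a);
            if (m.find()) suffixes.add(m.group(1));
        }
        return suffixes;
    }

    public static void main(String[] args) {
        // Hypothetical dependency list for illustration only.
        List<String> deps = Arrays.asList(
            "spark-sql_2.11", "spark-sql-kafka-0-10_2.11", "some-library_2.12");
        Set<String> suffixes = scalaSuffixes(deps);
        System.out.println(suffixes.size() == 1
            ? "consistent Scala suffix: " + suffixes
            : "mixed Scala suffixes: " + suffixes);
    }
}
```

In a real project, running mvn dependency:tree and checking the suffixes of every Scala artifact achieves the same thing.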
The code was changed to add the lines below:
import za.co.absa.spline.harvester.SparkLineageInitializer;
SparkLineageInitializer.SparkSessionWrapper sessionWrapper =
    new SparkLineageInitializer.SparkSessionWrapper(session);
sessionWrapper.enableLineageTracking(sessionWrapper.enableLineageTracking$default$1());
I am submitting the Spark job but getting this error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/io/JavaToolsPlatformArchive
at scala.tools.util.PathResolverBase.Calculated$lzycompute(PathResolver.scala:238)
at scala.tools.util.PathResolverBase.Calculated(PathResolver.scala:238)
at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
at scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
at scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
at scala.tools.nsc.Global.classPath(Global.scala:128)
at scala.tools.nsc.backend.jvm.BTypesFromSymbols.
Process finished with exit code 1
Is Spline not compatible with Spark 2.4.5? If so, could anyone suggest the correct dependencies to add?
Thanks in advance!
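For what it's worth, scala.reflect.io.JavaToolsPlatformArchive appears (to my knowledge) only in the Scala 2.11 line and is gone in 2.12, so a NoClassDefFoundError for it usually means Scala 2.11 and 2.12 artifacts are mixed on the classpath rather than a Spark 2.4.5 incompatibility per se. A minimal probe (class name is my own, not part of Spline) can check which variant the JVM actually sees:

```java
public class ScalaReflectProbe {
    // Probe whether the 2.11-era class named in the stack trace is visible;
    // on a Scala 2.12 (or Scala-free) classpath it is absent.
    static boolean hasJavaToolsPlatformArchive() {
        try {
            Class.forName("scala.reflect.io.JavaToolsPlatformArchive");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasJavaToolsPlatformArchive()
            ? "2.11-era scala.reflect.io.JavaToolsPlatformArchive is visible"
            : "class not found (scala-reflect missing or 2.12+)");
    }
}
```

If the probe reports the class visible while your Spark build is 2.12 (or vice versa), aligning every _2.xx suffix and the Spark installation's Scala version should clear the error.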