java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)

felixgao opened this issue 5 years ago · 4 comments

Using Scala version 2.12.10; I started the shell as spark-shell --packages ml.combust.mleap:mleap-spark-extension_2.12:0.15.0

I am running the export demo from https://mleap-docs.combust.ml/demos/:

scala> val transform = pipeline.transform(dataframe)
transform: org.apache.spark.sql.DataFrame = [test_string: string, test_double: double ... 2 more fields]

scala> val sbc = SparkBundleContext().withDataset(transform)
java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
  at ml.combust.bundle.BundleRegistry.<init>(BundleRegistry.scala:73)
  at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:73)
  at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:44)
  at ml.combust.bundle.BundleRegistry$.apply(BundleRegistry.scala:28)
  at org.apache.spark.ml.bundle.SparkBundleContext$.apply(SparkBundleContext.scala:21)
  at org.apache.spark.ml.bundle.SparkBundleContext$.apply(SparkBundleContext.scala:17)
  ... 57 elided
scala> spark.sparkContext.listJars.foreach(println)
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark-base_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-base_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark-extension_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.thesamet.scalapb_lenses_2.12-0.7.0-test2.jar
spark://intul178723677:62567/jars/com.lihaoyi_fastparse_2.12-1.0.0.jar
spark://intul178723677:62567/jars/com.lihaoyi_fastparse-utils_2.12-1.0.0.jar
spark://intul178723677:62567/jars/com.typesafe_config-1.3.0.jar
spark://intul178723677:62567/jars/ml.combust.bundle_bundle-ml_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.google.protobuf_protobuf-java-3.5.1.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-spark_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-core_2.12-0.15.0.jar
spark://intul178723677:62567/jars/io.spray_spray-json_2.12-1.3.2.jar
spark://intul178723677:62567/jars/com.thesamet.scalapb_scalapb-runtime_2.12-0.7.1.jar
spark://intul178723677:62567/jars/com.jsuereth_scala-arm_2.12-2.0.jar
spark://intul178723677:62567/jars/com.github.rwl_jtransforms-2.4.0.jar
spark://intul178723677:62567/jars/org.scala-lang_scala-reflect-2.12.8.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-runtime_2.12-0.15.0.jar
spark://intul178723677:62567/jars/ml.combust.bundle_bundle-hdfs_2.12-0.15.0.jar
spark://intul178723677:62567/jars/com.lihaoyi_sourcecode_2.12-0.1.4.jar
spark://intul178723677:62567/jars/ml.combust.mleap_mleap-tensor_2.12-0.15.0.jar
spark://intul178723677:62567/jars/commons-io_commons-io-2.5.jar
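This error is the classic symptom of loading `_2.12` artifacts into a runtime that is actually on a different Scala binary version (e.g. a Spark build compiled against 2.11). A minimal sketch for checking which Scala version the driver is really running (the object name is just for illustration; in spark-shell the `println` line alone suffices):

```scala
// Sketch: print the Scala version the current runtime actually uses.
// If this reports 2.11.x while the classpath holds _2.12 jars (as in
// the listing above), scala.Product.$init$ does not exist at runtime,
// which produces exactly this NoSuchMethodError.
object ScalaVersionCheck extends App {
  val runtimeVersion = scala.util.Properties.versionNumberString // e.g. "2.12.10"
  val binaryVersion  = runtimeVersion.split('.').take(2).mkString(".")
  println(s"Running Scala $runtimeVersion (binary version $binaryVersion)")
}
```

The binary version printed must match the `_2.12` suffix of every Scala artifact on the classpath; patch versions (2.12.8 vs 2.12.10) are binary compatible, but 2.11 vs 2.12 is not.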

felixgao avatar Feb 20 '20 01:02 felixgao

I'm troubled by this problem too. Have you solved it?

woneway avatar Mar 04 '20 07:03 woneway

Does it work with Scala 2.11?

lucagiovagnoli avatar May 07 '20 16:05 lucagiovagnoli

@felixgao @woneway just wanted to double-check: which Spark version are you trying this with? Looking at https://spark.apache.org/releases/spark-release-2-4-0.html, it would need to be Spark 2.4.0 or later (the first release with Scala 2.12 support).

If that's the case, could you please try 0.16.0-SNAPSHOT? We've recently updated from Scala 2.12.8 to 2.12.10 in that version. Thank you!

ancasarb avatar May 18 '20 21:05 ancasarb

I am setting up a project with the dependencies below (Spline with Spark):

    <dependency>
        <groupId>za.co.absa.spline.agent.spark</groupId>
        <artifactId>agent-core_2.11</artifactId>
        <version>0.5.1</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.5</version>

    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>

That is what my pom.xml looks like.

The code is changed to add the lines below:

import za.co.absa.spline.harvester.SparkLineageInitializer;

SparkLineageInitializer.SparkSessionWrapper sessionWrapper =
    new SparkLineageInitializer.SparkSessionWrapper(session);
sessionWrapper.enableLineageTracking(sessionWrapper.enableLineageTracking$default$1());

I am submitting the Spark job, but I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/io/JavaToolsPlatformArchive
  at scala.tools.util.PathResolverBase.Calculated$lzycompute(PathResolver.scala:238)
  at scala.tools.util.PathResolverBase.Calculated(PathResolver.scala:238)
  at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
  at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
  at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
  at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
  at scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
  at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
  at scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
  at scala.tools.nsc.Global.classPath(Global.scala:128)
  at scala.tools.nsc.backend.jvm.BTypesFromSymbols.<init>(BTypesFromSymbols.scala:39)
  at scala.tools.nsc.backend.jvm.BCodeIdiomatic.<init>(BCodeIdiomatic.scala:24)
  at scala.tools.nsc.backend.jvm.BCodeHelpers.<init>(BCodeHelpers.scala:23)
  at scala.tools.nsc.backend.jvm.BCodeSkelBuilder.<init>(BCodeSkelBuilder.scala:25)
  at scala.tools.nsc.backend.jvm.BCodeBodyBuilder.<init>(BCodeBodyBuilder.scala:25)
  at scala.tools.nsc.backend.jvm.BCodeSyncAndTry.<init>(BCodeSyncAndTry.scala:21)
  at scala.tools.nsc.backend.jvm.GenBCode.<init>(GenBCode.scala:47)
  at scala.tools.nsc.Global$genBCode$.<init>(Global.scala:675)
  at scala.tools.nsc.Global.genBCode$lzycompute(Global.scala:671)
  at scala.tools.nsc.Global.genBCode(Global.scala:671)
  at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.serialVUID(GenASM.scala:1240)
  at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1329)
  at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.emitFor$1(GenASM.scala:198)
  at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:204)
  at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1528)
  at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1513)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.wrapInPackageAndCompile(ToolBoxFactory.scala:197)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal.compile(ToolBoxFactory.scala:252)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$$anonfun$compile$2.apply(ToolBoxFactory.scala:429)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$$anonfun$compile$2.apply(ToolBoxFactory.scala:422)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$.liftedTree2$1(ToolBoxFactory.scala:355)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$withCompilerApi$.apply(ToolBoxFactory.scala:355)
  at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.compile(ToolBoxFactory.scala:422)
  at za.co.absa.commons.reflect.ReflectionUtils$.compile(ReflectionUtils.scala:41)
  at za.co.absa.spline.harvester.json.HarvesterJsonSerDe$.<init>(HarvesterJsonSerDe.scala:25)
  at za.co.absa.spline.harvester.json.HarvesterJsonSerDe$.<clinit>(HarvesterJsonSerDe.scala)
  at za.co.absa.spline.harvester.dispatcher.HttpLineageDispatcher.send(HttpLineageDispatcher.scala:161)
  at za.co.absa.spline.harvester.QueryExecutionEventHandler$$anonfun$onSuccess$1.apply(QueryExecutionEventHandler.scala:43)
  at za.co.absa.spline.harvester.QueryExecutionEventHandler$$anonfun$onSuccess$1.apply(QueryExecutionEventHandler.scala:41)
  at scala.Option.foreach(Option.scala:257)
  at za.co.absa.spline.harvester.QueryExecutionEventHandler.onSuccess(QueryExecutionEventHandler.scala:41)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$$anonfun$onSuccess$1.apply(SplineQueryExecutionListener.scala:37)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$$anonfun$onSuccess$1.apply(SplineQueryExecutionListener.scala:37)
  at scala.Option.foreach(Option.scala:257)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener.onSuccess(SplineQueryExecutionListener.scala:37)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(QueryExecutionListener.scala:127)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(QueryExecutionListener.scala:126)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling$1.apply(QueryExecutionListener.scala:148)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling$1.apply(QueryExecutionListener.scala:146)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
  at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
  at org.apache.spark.sql.util.ExecutionListenerManager.org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling(QueryExecutionListener.scala:146)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply$mcV$sp(QueryExecutionListener.scala:126)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply(QueryExecutionListener.scala:126)
  at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply(QueryExecutionListener.scala:126)
  at org.apache.spark.sql.util.ExecutionListenerManager.readLock(QueryExecutionListener.scala:159)
  at org.apache.spark.sql.util.ExecutionListenerManager.onSuccess(QueryExecutionListener.scala:125)
  at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:678)
  at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
  at HousePriceSolution.main(HousePriceSolution.java:43)
Caused by: java.lang.ClassNotFoundException: scala.reflect.io.JavaToolsPlatformArchive
  at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
  ... 63 more

Process finished with exit code 1
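A `NoClassDefFoundError` for an internal class like `scala.reflect.io.JavaToolsPlatformArchive` is usually a symptom of mixed Scala patch versions on the classpath rather than a Spark/Spline incompatibility as such; note that Spark 2.4.5 is built against Scala 2.11.12, while the pom above pins scala-library to 2.11.8. A small diagnostic sketch (the object name is just for illustration) that prints which scala-library is actually loaded:

```scala
// Sketch: surface a Scala patch-version mismatch at runtime.
// Spark 2.4.5 ships Scala 2.11.12; pinning scala-library to 2.11.8 can
// leave mismatched scala-library/scala-reflect/scala-compiler jars on
// the classpath, which shows up as NoClassDefFoundError for internal
// scala.reflect classes.
object ScalaClasspathCheck extends App {
  // Version reported by the scala-library jar that was actually loaded:
  println("scala-library version: " + scala.util.Properties.versionNumberString)

  // Jar the core Scala classes were loaded from (None => bootstrap classpath):
  val source = Option(classOf[scala.Option[_]].getProtectionDomain.getCodeSource)
  println("loaded from: " + source.map(_.getLocation.toString).getOrElse("(bootstrap)"))
}
```

If the version printed here differs from the one Spark expects, aligning the pom (or simply removing the explicit scala-library pin and letting spark-core bring in its own) is the usual remedy.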

Is Spline not compatible with Spark 2.4.5? If that's the case, could anyone suggest the correct dependencies to use?

Thanks in advance!

nehapriyathakur avatar Nov 15 '20 04:11 nehapriyathakur