
spark-solr 3.6.0 with Spark 2.4.0, when added as a package to the apache-toree Scala kernel, gives an error

Open · preetid1712 opened this issue 6 years ago · 1 comment

org.restlet.jee:org.restlet:2.3.0 "not found" error when the packages are added in kernel.json of the apache-toree Scala kernel:

-- artifact org.restlet.jee#org.restlet.ext.servlet;2.3.0!org.restlet.ext.servlet.jar:
      https://repo1.maven.org/maven2/org/restlet/jee/org.restlet.ext.servlet/2.3.0/org.restlet.ext.servlet-2.3.0.jar
    ==== spark-packages: tried
      http://dl.bintray.com/spark-packages/maven/org/restlet/jee/org.restlet.ext.servlet/2.3.0/org.restlet.ext.servlet-2.3.0.pom
      -- artifact org.restlet.jee#org.restlet.ext.servlet;2.3.0!org.restlet.ext.servlet.jar:
      http://dl.bintray.com/spark-packages/maven/org/restlet/jee/org.restlet.ext.servlet/2.3.0/org.restlet.ext.servlet-2.3.0.jar
        ::::::::::::::::::::::::::::::::::::::::::::::
        ::          UNRESOLVED DEPENDENCIES         ::
        ::::::::::::::::::::::::::::::::::::::::::::::
        :: org.restlet.jee#org.restlet;2.3.0: not found
        :: org.restlet.jee#org.restlet.ext.servlet;2.3.0: not found
        ::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.restlet.jee#org.restlet;2.3.0: not found, unresolved dependency: org.restlet.jee#org.restlet.ext.servlet;2.3.0: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1306)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:315)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
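For context, an Apache Toree kernel typically receives Spark packages through the __TOREE_SPARK_OPTS__ environment variable in its kernel.json. A minimal sketch along those lines (the install path and package coordinate are assumptions, not taken from the issue):

```json
{
  "display_name": "Apache Toree - Scala",
  "language": "scala",
  "argv": [
    "/usr/local/share/jupyter/kernels/apache_toree_scala/bin/run.sh",
    "--profile",
    "{connection_file}"
  ],
  "env": {
    "__TOREE_SPARK_OPTS__": "--packages com.lucidworks.spark:spark-solr:3.6.0"
  }
}
```

With only --packages set, Ivy resolves spark-solr's transitive restlet dependencies against Maven Central and the spark-packages repository, which produces the "not found" failure above.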

And when the spark-solr 3.6.0 shaded jar is added instead, it gives the following error:

./bin/spark-shell --jars /usr/local/spark/jars/spark-solr-3.6.0-shaded.jar
SLF4J: Class path contains multiple SLF4J providers.
SLF4J: Found provider [org.slf4j.log4j12.Log4j12ServiceProvider@505fc5a4]
SLF4J: Found provider [org.apache.logging.slf4j.SLF4JServiceProvider@5fbdfdcf]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual provider is of type [org.slf4j.log4j12.Log4j12ServiceProvider@505fc5a4]
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/usr/local/spark/jars/hadoop-auth-2.7.3.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
19/01/25 18:14:57 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/01/25 18:14:58 INFO SecurityManager: Changing view acls to: root
19/01/25 18:14:58 INFO SecurityManager: Changing modify acls to: root
19/01/25 18:14:58 INFO SecurityManager: Changing view acls groups to: 
19/01/25 18:14:58 INFO SecurityManager: Changing modify acls groups to: 
19/01/25 18:14:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
19/01/25 18:14:58 INFO SignalUtils: Registered signal handler for INT
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.internal.settings.MutableSettings$SettingValue.valueSetByUser()Lscala/Option;
	at scala.tools.nsc.Global.<init>(Global.scala:361)
	at scala.tools.nsc.interpreter.IMain$$anon$1.<init>(IMain.scala:254)
	at scala.tools.nsc.interpreter.IMain.newCompiler(IMain.scala:254)
	at scala.tools.nsc.interpreter.IMain.<init>(IMain.scala:94)
	at scala.tools.nsc.interpreter.IMain.<init>(IMain.scala:120)
	at org.apache.spark.repl.SparkILoopInterpreter.<init>(SparkILoopInterpreter.scala:24)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:488)
	at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:61)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1$$anonfun$apply$1.apply$mcV$sp(SparkILoop.scala:255)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1$$anonfun$apply$1.apply(SparkILoop.scala:255)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1$$anonfun$apply$1.apply(SparkILoop.scala:255)

preetid1712 · Jan 31 '19 16:01

You need additional repositories to get the restlet jars: https://github.com/lucidworks/spark-solr/blob/master/pom.xml#L67. The shaded jar includes scala.reflect, which it should not.
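As a sketch of that suggestion, the extra repository can be passed with --repositories so Ivy can resolve the org.restlet.jee artifacts. The repository URL below is an assumption; use the one actually declared in spark-solr's pom.xml:

```shell
# Sketch: resolve spark-solr via --packages, adding the restlet repository
# alongside the defaults so the org.restlet.jee transitive deps are found.
# The URL is an assumption -- check spark-solr's pom.xml for the real one.
./bin/spark-shell \
  --packages com.lucidworks.spark:spark-solr:3.6.0 \
  --repositories https://maven.restlet.talend.com
```

The same --repositories flag can be appended to __TOREE_SPARK_OPTS__ in the Toree kernel.json, since Toree forwards those options to spark-submit.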

kiranchitturi · Mar 15 '20 05:03