
Spark 2.1.0 with spark-solr-3.6.0.jar

[Open] ramjia2z opened this issue • 1 comment

I followed the instructions at https://github.com/lucidworks/spark-solr#via-dataframe

```
spark-shell --jars /$Path_to_Solr_Jar/spark-solr-3.6.0.jar
```

When I try `val df = spark.read.format("solr").options(options).load`,

I get the exception below. Please advise.

```
java.lang.ClassNotFoundException: Failed to find data source: solr. Please find packages at http://spark.apache.org/third-party-projects.html
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:569)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
  ... 48 elided
Caused by: java.lang.ClassNotFoundException: solr.DefaultSource
  at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:554)
  at scala.util.Try$.apply(Try.scala:192)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:554)
  at scala.util.Try.orElse(Try.scala:84)
  at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:554)
  ... 53 more
```

ramjia2z avatar Feb 22 '19 09:02 ramjia2z

Make sure your jar is not corrupted and includes all the dependencies. Alternatively, add the jar via `--packages`: https://github.com/lucidworks/spark-solr#maven-central
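As a sketch of the `--packages` approach: letting Spark resolve the artifact from Maven Central pulls in the transitive dependencies that a bare `--jars` with a non-shaded jar would miss (the `com.lucidworks.spark:spark-solr:3.6.0` coordinates below are the ones linked from the README; adjust the version to match your setup).

```shell
# Resolve spark-solr and its dependencies from Maven Central
# instead of pointing --jars at a single (possibly non-shaded) jar.
spark-shell --packages com.lucidworks.spark:spark-solr:3.6.0
```

If the `solr` data source still cannot be found after this, it usually means the jar never made it onto the driver/executor classpath at all, which is consistent with the `ClassNotFoundException: solr.DefaultSource` in the trace above.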

kiranchitturi avatar Mar 15 '20 05:03 kiranchitturi