spark-excel
Error while reading mounted xlsx: Could not initialize class shadeio.poi.xssf.model.SharedStringsTable
I am using Azure Databricks and I am trying to read an Excel file (xlsx) from a storage account (ADLS Gen2). Because I get an 'Anonymous access' error when I connect to the file using the wasbs path, I mounted the container and tried to read the Excel file from the mount point instead. This is my code:
```python
df = spark.read \
    .format("csv") \
    .option("header", "true") \
    .option("delimiter", ";") \
    .load("/mnt/mountPoint/Budget.csv")

df = spark.read \
    .format("com.crealytics.spark.excel") \
    .option("header", "true") \
    .option("sheetName", "Sheet1") \
    .load("/mnt/mountPoint/Budget.xls")

df = spark.read \
    .format("com.crealytics.spark.excel") \
    .option("header", "true") \
    .option("sheetName", "Sheet1") \
    .load("/mnt/mountPoint/Budget.xlsx")
```
The first command succeeds and I get the headers from the file. A df.show() will show me the content. The second command (using the xls) succeeds as well and I get the schema and content. The third command fails with this error: java.lang.NoClassDefFoundError: Could not initialize class shadeio.poi.xssf.model.SharedStringsTable
I am using Databricks runtime 8.3 with Apache Spark 3.1.1 and Scala 2.12. What I have tried so far (all with the same error):
- Different versions of the crealytics library. I tried 14.0, 13.7 and 13.6, all of them for Scala 2.12.
- The above code is in Python; I also tried it in Scala.
- I copied the content of the file (just the cells with data) to a new file and saved it as both xlsx and xls.
- Used different sheet names. The file has just one sheet, named 'Sheet1'.
This is the full stack trace. Any help is very much appreciated!
```
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
     14     .format("com.crealytics.spark.excel")
     15     .option("header", "true") \

/databricks/spark/python/pyspark/sql/readwriter.py in load(self, path, format, schema, **options)
    202             self.options(**options)
    203         if isinstance(path, str):
--> 204             return self._df(self._jreader.load(path))
    205         elif path is not None:
    206             if type(path) != list:

/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1302
   1303         answer = self.gateway_client.send_command(command)
-> 1304         return_value = get_return_value(
   1305             answer, self.gateway_client, self.target_id, self.name)
   1306

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
    115     def deco(*a, **kw):
    116         try:
--> 117             return f(*a, **kw)
    118         except py4j.protocol.Py4JJavaError as e:
    119             converted = convert_exception(e.java_exception)

/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    324             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325         if answer[1] == REFERENCE_TYPE:
--> 326             raise Py4JJavaError(
    327                 "An error occurred while calling {0}{1}{2}.\n".
    328                 format(target_id, ".", name), value)

Py4JJavaError: An error occurred while calling o714.load.
: java.lang.NoClassDefFoundError: Could not initialize class shadeio.poi.xssf.model.SharedStringsTable
	at shadeio.poi.ooxml.POIXMLFactory.createDocumentPart(POIXMLFactory.java:61)
	at shadeio.poi.ooxml.POIXMLDocumentPart.read(POIXMLDocumentPart.java:684)
	at shadeio.poi.ooxml.POIXMLDocument.load(POIXMLDocument.java:180)
	at shadeio.poi.xssf.usermodel.XSSFWorkbook.
```
Hi guys, any update on this error? I have the same issue
Hi @thijsnijhuis and @udossa
- Could you please try again with the format changed from "com.crealytics.spark.excel" to "excel"? (See the sketch after this list.)
  .format("excel")
- And please take a look at the list of dependencies needed for spark-excel to work. This wiki might have some useful ideas.

Credit to #133 (Apache commons dependency issue) by @jakeatmsft, and to @fwani's solution.
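Putting the suggestion into the original read, it would look roughly like this (a sketch; the options and path are taken from the question, and whether the short name "excel" resolves depends on the spark-excel version attached to the cluster):

```python
# Same read as in the question, only the format name changed to the short alias "excel".
df = (
    spark.read
    .format("excel")
    .option("header", "true")
    .option("sheetName", "Sheet1")
    .load("/mnt/mountPoint/Budget.xlsx")
)
```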
@quanghgx, thanks for your reply. I have changed it, but now I simply get this error: java.lang.ClassNotFoundException: Failed to find data source: excel. Please find packages at http://spark.apache.org/third-party-projects.html
I will need to take a look at the wiki link later on. Thanks!
@thijsnijhuis
I think you should first add a dependency for excel, that is, com.crealytics:spark-excel_2.12 with a specific version (because the error is java.lang.ClassNotFoundException: Failed to find data source: excel). A minimal sketch follows the link below.
https://github.com/crealytics/spark-excel#linking
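For a plain Spark session the coordinate can be pulled in roughly like this; the version string 0.14.0 is only an assumption and has to match your Spark/Scala combination (on Databricks, attach the same Maven coordinate to the cluster via Libraries > Maven instead):

```python
from pyspark.sql import SparkSession

# Put the spark-excel connector on the classpath. The version (0.14.0 here) is an
# assumption; pick a release built for Scala 2.12 and your Spark version.
spark = (
    SparkSession.builder
    .appName("excel-read")
    .config("spark.jars.packages", "com.crealytics:spark-excel_2.12:0.14.0")
    .getOrCreate()
)

# With the package resolved, the "excel" data source should be found.
df = (
    spark.read
    .format("excel")
    .option("header", "true")
    .load("/mnt/mountPoint/Budget.xlsx")
)
```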
Please try changing the library installation to Maven; that resolved my issue.