Class conflict in ORC benchmark
I ran the ORC data benchmark (on master with Spark 3.2) and got this error:

```
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (localhost executor driver): java.lang.NoSuchMethodError: 'org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch org.apache.orc.TypeDescription.createRowBatch(int)'
```
This looks like a class conflict, because iceberg-orc uses orc-core-nohive while Spark uses orc-core.
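For anyone debugging this: a first step is usually to confirm with `mvn dependency:tree` (or the Gradle equivalent) that both ORC variants are on the classpath. As a rough sketch only (coordinates and the choice of which variant to exclude are assumptions, not a verified fix), a Maven exclusion can force a single variant onto the classpath:

```xml
<!-- Sketch, not a verified fix: versions/coordinates are assumptions.
     iceberg-orc pulls in the nohive variant of orc-core, while Spark ships
     the plain orc-core; excluding one keeps a single copy on the classpath.
     Note this may itself break code compiled against the excluded variant. -->
<dependency>
  <groupId>org.apache.iceberg</groupId>
  <artifactId>iceberg-orc</artifactId>
  <version>${iceberg.version}</version>
  <exclusions>
    <exclusion>
      <!-- Maven exclusions match groupId/artifactId, so this drops the
           orc-core dependency regardless of its classifier -->
      <groupId>org.apache.orc</groupId>
      <artifactId>orc-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Whether excluding the nohive variant (versus the plain orc-core) is the right choice depends on which API your code path was compiled against.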
@zhongyujiang Did you ever solve this problem? I'm hitting the same one.
@V-yg No, I haven't.
I have the same issue trying to write Iceberg ORC files using the Java API with the Hive metastore as a catalog. Somehow it loads the class org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch (from hive-metastore-api) instead of the nohive-shaded class org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch. Any suggestion on how to solve it would be very appreciated :)
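In case it helps narrow this down: a quick way to check which VectorizedRowBatch variant is actually resolvable, and which jar it was loaded from, is a small diagnostic like the sketch below (the class names are the ones from the error above; the helper class name is made up):

```java
// Diagnostic sketch: report where a class is loaded from, to see whether the
// hive (org.apache.hadoop.hive.*) or nohive-shaded (org.apache.orc.storage.*)
// VectorizedRowBatch wins on the classpath.
public class ClassOrigin {

    /** Returns the jar/location a class was loaded from, or a short marker. */
    public static String originOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK/bootstrap classes report no code source
            return src != null ? String.valueOf(src.getLocation()) : "bootstrap/JDK";
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // The nohive-shaded class that iceberg-orc is compiled against:
        System.out.println("org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch -> "
                + originOf("org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch"));
        // The unshaded class that Spark's orc-core returns:
        System.out.println("org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch -> "
                + originOf("org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch"));
    }
}
```

Running this inside the same JVM (and classloader) as the failing job shows which jar each variant comes from; if the unshaded hive class resolves but the org.apache.orc.storage one does not, that matches the NoSuchMethodError above.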