Shixiong Zhu
I see. This looks like a bug. Which Spark version and Delta version are you using to generate the table?
Yep, we rely on Hive to read Parquet files in the Hive connector, so we will need to wait for Hive 4 in order to solve this problem for Hive.
@spmp we are not using branches to support multiple Hive versions. Instead, we use a single jar to support both Hive 2 and Hive 3 right now. I guess this...
> I am happy to help bodging and documenting. I am pretty sure that simply deleting `HiveInputFormat.scala` as the upstream API has changed so much that the 'option' that is...
Cool. Have you tried that?
How did you upgrade Hive? Maybe there is an issue in the dependency list.
We are currently using https://repo1.maven.org/maven2/org/apache/hive/hive-exec/3.1.2/hive-exec-3.1.2-core.jar, but Hive 4 removed this jar. Not sure why sbt doesn't fail when `"org.apache.hive" % "hive-exec" % hiveVersion % "provided" classifier "core"` doesn't exist... You...
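For context, the dependency in question would be declared in the sbt build roughly like this (a hypothetical sketch of the build file, not the project's actual `build.sbt`; the `hiveVersion` value shown is an assumption based on the jar URL above):

```scala
// Hypothetical build.sbt excerpt illustrating the dependency quoted above.
// The "core" classifier asks for an extra artifact (hive-exec-<v>-core.jar)
// on the same module; the comment above notes that sbt did not report an
// error when that classified artifact was absent in Hive 4.
val hiveVersion = "3.1.2" // assumed; Hive 4 no longer publishes the "core" classifier

libraryDependencies += "org.apache.hive" % "hive-exec" % hiveVersion % "provided" classifier "core"
```

Because the classifier names a secondary artifact rather than the module itself, resolution of the base module can succeed even when the classified jar is missing, which may explain the silent behavior described here.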
@MironAtHome , this is a Hive connector issue. It's not related to Delta Standalone. I looked at Hive code again and found we may be able to support this by...
As per our discussion in #323, we will continue to investigate the Hive SerDe solution.
I think a better solution is to simply not require users to provide the schema (#285).